• Competitive
  • Raleigh, NC, USA
  • Permanent, Full time
  • Credit Suisse
  • 17 Jun 19

Big Data Hadoop Developer #131881

We Offer
The Treasury Global MI team works on the critical Liquidity Risk and regulatory reporting aspects of the LRP program. The team consists of developers, IT BAs and testers based in NY, Raleigh, Wroclaw and Pune. We use Agile methodologies, with Scrum as the framework for project deliveries.
  • You will take on a Big Data Hadoop Developer role with hands-on Oracle PL/SQL work, joining a dynamic and international team of Agile developers building key regulatory and risk applications used for Liquidity Management and Reporting by Credit Suisse Treasury
  • You will work as a Big Data Developer who designs and develops application code, implements technical solutions, and configures applications in various environments in response to business problems
  • You will closely liaise with Architects, Business Analysts and Change Partners to understand the functional requirements and develop technical solutions
  • You will analyze, design, develop, support and maintain the Big Data Hadoop environment and code base, and support the traditional Oracle data warehouse and its data-loading processes built with Oracle PL/SQL
  • You will assist and participate in the team's effort to ensure that the implemented results meet the partners' requirements, and perform the necessary IT testing and validation activities

Credit Suisse maintains a Working Flexibility Policy, subject to the terms as set forth in the Credit Suisse United States Employment Handbook.

You Offer
  • You have 2+ years of overall experience as a Big Data Developer, with deep knowledge of Cloudera Hadoop components such as HDFS, Sentry, HBase, Impala, Hue, Spark, Hive, Kafka, YARN, ZooKeeper
  • You are familiar with the Cloudera and/or MapR distributions
  • You have deep knowledge of Big Data querying tools, such as Pig, Hive, and Impala
  • You are proficient with integration of data from multiple data sources
  • You are experienced with NoSQL databases, such as HBase, Cassandra, MongoDB
  • You have deep knowledge of various ETL techniques and frameworks, such as Flume
  • You have worked with various messaging systems, such as Kafka or RabbitMQ
  • You have a proven background with Big Data ML toolkits, such as Mahout, SparkML, or H2O
  • You have 2+ years of experience as an ETL or PL/SQL developer on applications involving, but not limited to, Oracle Database, PL/SQL programming, Informatica, Unix, shell scripting, SVN, the Control-M framework, and performance tuning
  • You have a deep understanding of Agile software development methodologies; Scrum experience is especially desirable
  • You hold a degree from an accredited university, preferably in Computer Science or a related discipline, or have comparable industry experience