Hadoop Administrator - Big Data Admin
Our Corporate Technology team relies on smart, driven people like you to develop applications and provide tech support for all our corporate functions across our network. Your efforts will touch lives all over the financial spectrum and across all our divisions: Global Finance, Corporate Treasury, Risk Management, Human Resources, Compliance, Legal, and the Corporate Administrative Office. You'll be part of a team specifically built to meet and exceed our evolving technology needs, as well as our technology controls agenda.
Corporate Technology is looking for a Big Data Database Administrator (DBA) with a background in supporting large clustered Hadoop environments. Skill sets include (but are not limited to) database tuning, managing and supporting database release cycles, and ensuring the availability, stability, and performance of all database environments, including Development, QA, UAT, SIT, production, and disaster recovery databases.
This position is anticipated to require the use of one or more High Security Access (HSA) systems. Users of these systems are subject to enhanced screening, which includes criminal and credit background checks and/or other checks, both at the time of accepting the position and annually thereafter. The enhanced screening must be successfully completed before employment or assignment begins.
Key responsibilities in this role:
- Troubleshoot and develop on Hadoop technologies such as HDFS, Hive, Pig, Flume, MongoDB, Accumulo, Sqoop, ZooKeeper, Spark, MapReduce2, YARN, HBase, Tez, Kafka, and Storm.
- Be part of a POC team, help build new Hadoop clusters, and perform cluster coordination services via ZooKeeper.
- Assist with MapReduce programs running on the Hadoop cluster.
- Perform data pre-processing using Hive and Pig.
- Develop highly scalable, high-performance web services for data tracking.
- Manage and deploy HBase.
- Translate, load, and present disparate data sets from a variety of formats and sources, such as JSON, text files, Kafka queues, and log data.
- Fine-tune applications and systems for high performance and higher-volume throughput.
- Apply general operational expertise: strong troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking.
- Bring excellent knowledge of Linux, as Hadoop runs on Linux.
- Implement and support the enterprise Hadoop environment, including design, capacity planning, cluster setup, performance tuning, monitoring, structure planning, scaling, and administration.
- Knowledge of troubleshooting core Java applications is a plus.
- Carry out DBA responsibilities such as data modeling, design and implementation, software installation and configuration, database backup and recovery, and database connectivity and security.
When you work at JPMorgan Chase & Co., you're not just working at a global financial institution. You're an integral part of one of the world's biggest tech companies. In 14 technology hubs worldwide, our team of 40,000 technologists design, build and deploy everything from enterprise technology initiatives to big data and mobile solutions, as well as innovations in electronic payments, cybersecurity, machine learning, and cloud development. Our $9.5B annual investment in technology enables us to hire people to create innovative solutions that will not only transform the financial services industry, but also change the world.
At JPMorgan Chase & Co. we value the unique skills of every employee, and we're building a technology organization that thrives on diversity. We encourage professional growth and career development, and offer competitive benefits and compensation. If you're looking to build your career as part of a global technology team tackling big challenges that impact the lives of people and companies all around the world, we want to meet you.