Manager - Projects: Senior Kafka Developer
What makes Cognizant a unique place to work? The combination of rapid growth and an international and innovative environment! This is creating many opportunities for people like YOU - people with an entrepreneurial spirit who want to make a difference in this world.
At Cognizant, together with your colleagues from all around the world, you will collaborate on creating solutions for the world's leading companies, helping them become more flexible, more innovative and more successful. Moreover, this is your chance to be part of the success story: we are looking for a Senior Kafka Developer to join our communication and technology team.

About Cognizant Digital Business
About AIA
Cognizant's AIA Practice was formed to support our clients with database and analytics requirements. As we strengthen our Netherlands and European operations, the AIA Practice aims to leverage our thought leadership to help clients accelerate their journey towards analytics applications. Cognizant has established strategic global relationships to support these aspirations going forward.
More information? Please visit https://be.cognizant.com/sites/cognizant-digital-business/SitePage/285308/cognizant-digital-business-onecdb

About the role
As a Senior Kafka Developer, you will work on the delivery of the Data Hub (event stream processing) project. You will come to understand the client's business requirements, contribute to the design, concentrate on the technical solution, learn new technologies, and work in an agile environment.
You will become an active member of a DevOps team that works closely with other disciplines and roles. You are a T-shaped professional, responsible for the whole life cycle (i.e. development and operations) of in-house or packaged applications, in line with development and test standards and corporate architecture policies, including security and data privacy guidelines.

Our ideal candidate (Key Skills & Experience)
- 10+ years of IT experience with end-to-end lifecycle projects.
- Minimum 3-5 years of relevant experience with Kafka, HDFS, Hive and MapReduce.
- Worked on and developed at least one data ingestion pipeline.
- Good exposure to Core Java or other programming languages such as Scala and Python.
- Excellent understanding of Object-Oriented Design & Patterns
- Experience working independently and as part of a team to debug application issues using configuration files, databases and application log files.
- Good knowledge of optimization and performance tuning.
- Experience working with a shared code repository (VSS, SVN or Git).
- Experience with basic SQL and shell scripting; awareness of Hadoop.
- Able to work with and enhance predefined frameworks.
- Able to communicate effectively with customers.
- Become part of the 'flagship' success story.
- Mediator: able to guide the team in adopting the architecture vision and software design principles.
- Negotiation: able to negotiate trade-offs between non-functional requirements.
- Able to act as a mature sparring partner to the Solution Architect.
- Open, 'can do' team spirit
- An environment where you can turn your own ideas into reality and drive your own career.