Data Pipeline Developer

  • Competitive Salary
  • London, England, United Kingdom
  • Permanent, Full time
  • BGC Partners
  • 25 Apr 19

Main purpose of the role:
This is a senior development role working in the team responsible for the firm's Big Data analytics platform, with a specialism in Java or Scala required. The successful candidate will work in the data team to build out the existing platform to support additional technology and business use cases involving both on-premises and Cloud implementations. The platform delivers data solutions across BGC Group including use cases for: market data analytics, regulatory reporting, surveillance and revenue analysis.

Key responsibilities:

The objectives of the role:
  • To work in partnership with the business/business analysts to identify key requirements for implementation
  • To identify any technical requirements for new products
  • To work with all the business and technology departments to ensure all business and technical requirements are met
  • To identify and manage any integration work
  • To analyse, design and build any such projects
  • To provide input into the current and on-going system architecture
  • To liaise with other development teams as necessary to implement cross-team projects
  • Be alert to Conduct Risk issues, specifically the risk of harm to client interests, market integrity and/or competition in financial markets due to inappropriate practices or behaviours across the firm

To undertake and manage:
  • Systems analysis and design
  • Systems development
  • Systems documentation
  • Production support and out-of-hours system maintenance

Skills / experience required:

Demonstrable development experience working with Big Data platforms, covering the following technologies:

  • Java or Scala on Linux. Docker beneficial
  • Hortonworks, Hive
  • Kafka, Kafka Connect, Kafka Streams
  • Spark, Spark SQL, YARN, Ansible
  • Substantial database experience: Relational and NoSQL data modelling.
  • Experience of software development in a financial services environment advantageous
  • Willingness to keep up to date with the latest technology trends and proactively identify appropriate areas where they can be applied
  • Solid Computing Degree

Additional Beneficial Skills / Experience:

  • Cassandra / DataStax, Elasticsearch, Logstash, Kibana
  • Experience in addressing efficient data storage and querying against very large stores
  • Experience in accommodating key requirements of MiFID II in designs/implementations
  • AWS: Data Pipeline, S3, EMR, Lambda, DynamoDB, Redshift
  • Experience in the use of high-performance messaging middleware such as TIBCO Rendezvous (RV) or Solace

Personal attributes:
  • Team player with excellent inter-personal skills and confident communication skills.
  • Able to effectively disseminate knowledge and experience to less experienced team members.
  • Must be able to deal effectively with our customers, i.e. development teams working across the globe using Messaging and Data products, alongside the external vendors who use the products.
  • Must be able to deal with and adapt to change extremely effectively.
  • Must be proactive in generating ideas and effective at developing solutions that are balanced, proportionate and effective. It is critical that this is achieved in collaboration with the global team.
  • Ability to work under pressure, handle multiple tasks and adapt to shifting priorities
  • "Self-starter" always looking to improve quality of process and deliverables and keen to take a lead role in that process