Asset Management Recruitment Specialist
A forward-thinking asset management company is looking to hire an ambitious, intellectually curious and creative Data Engineer to join an innovative new team within the business. This small group acts as the data science engine for the firm, providing new insights that aid the investment, client and engagement teams. The team's purpose is to use data to build competitive advantage through cutting-edge insights, reports and tools that improve client outcomes and establish global market leadership.
This role will focus on the integration, management, and modelling of structured and unstructured data sources, enabling the development of a scalable data interface and facilitating the incorporation of new data sources.
The Data Engineer will be the bridge between the Quant and UI Developers in the team. This person will be responsible for ingesting new data sourced by the Quant Developer into a repository that will be used by other members of the team for generating tools and reports. Additionally, the Data Engineer will ensure the smooth running of the databases by monitoring exceptions, optimising queries and completing all the necessary testing procedures to ensure data integrity. The successful candidate must be comfortable with multi-tasking and working on multiple projects in parallel. Prior experience with managing a DevOps environment would be an advantage.
Role & Responsibilities
- Maintain data as it is ingested and integrate it with the firm’s data warehouse.
- Integrate new data sources in the team’s data warehouse.
- Review the daily production process: ensure all tasks run as expected, monitor exceptions, and escalate any issues to the team in a timely fashion.
- Produce all the required views, stored procedures, and table structures to facilitate data analytics.
- Work with the data science and visualization teams to allow easy access to the underlying data.
- Manage the DevOps environment:
  - building and setting up new development tools and infrastructure
  - working on ways to automate and improve development and release processes
  - designing and implementing highly performant data ingestion pipelines from multiple sources using Azure Databricks, Azure Data Factory, or Apache Spark
- Partner with IT on maintaining the environment, collaborating on upgrades, and setting the strategy for the team.
- Produce all the relevant documentation explaining the data flows and comply with the guidelines that the IT department has set.
- Create the testing procedures needed to ensure data consistency and the regular updating of the data sources.
Skills & Experience required
- Excellent knowledge of MSSQL, including optimizing queries, stored procedures, views, schema design and dimensional modelling.
- Technical proficiency in Python or R; experience with NoSQL databases is an advantage.
- Experience working with large datasets.
- Previous experience working with non-financial data and using alternative data, such as geospatial, text, etc., is an advantage.
- Familiar with the software development life cycle (SDLC).
This is a great opportunity for a creative and curious data professional to join a team at the forefront of innovation within the investment industry.
Please note that only successful applicants will be contacted.