Data Engineer

BNY Mellon Investment Management
in Pittsburgh, PA, United States
Permanent, Full time
Last application, 08 Jul 20
Competitive
Data Engineer


Mellon is a global multi-specialist manager dedicated to serving our clients with a full spectrum of single and multi-asset investment strategies and solutions. With roots dating back to 1933, Mellon has been innovating across asset classes for generations and has the combined scale and capabilities to offer clients a broad range of solutions. From asset class expertise to broad market exposures, clients drive what we do. We are holistic in approach, client driven and committed to investment excellence. We aim to be a key partner for our clients by delivering customized investment outcomes and best-in-class service.

Role Overview

The Data Engineer is responsible for building and supporting systems to transform, store, and improve processes around data for Mellon Research. This role will focus on the Mellon research data pipeline, warehouse, databases, and BI tooling. He/she will work with business analysts, data scientists, and other data engineers to facilitate ETL/ELT processes that move, clean, and store data. The engineer will also be tasked with creating data accessibility points and tooling to enable reporting insights, with ease of use and maintenance in mind. The data engineer is expected to provide input on end-state design and schema while enforcing best practices.
Responsibilities

Design, build, and maintain efficient and progressive data infrastructure for Mellon research across disparate research silos in San Francisco, Boston, and Pune, focusing on creating a transparent data environment.

  • Engage in a variety of tactical projects including but not limited to ETL, storage, visualization, reporting, web scraping, and dashboard development

  • Support, document, and evolve (re-architect as needed) existing core data stores

  • Utilize ETL tooling to build, template, and rapidly deploy new pipelines for gathering and cleaning data

  • Analyze existing data stores / data marts, clean, and migrate into a centralized data lake

  • Work with Technology and Research leads to implement central and/or virtualized warehousing solutions

  • Develop APIs for accessing data, for use by business users (i.e., researchers and portfolio managers)

  • Configure Tableau dashboards and reports while serving as SME for end consumers of data

  • Identify and deploy advanced BI tooling on top of datasets, including AI/ML/DL techniques and algorithms

  • Assist in the design and development of enterprise data standards and best practices

  • Use modern tooling to focus on progressive technology, expand business capabilities, and improve time to market

Work closely with business analysts, data scientists, and technologists through full project lifecycles, which will provide deep insight into research needs, business processes, and research practices.

  • Gather requirements and analyze solution options

  • Develop solutions and define and execute test plans

  • Define and implement operational procedures

  • Automate the research and review of data quality issues to ensure data accuracy and reliability

  • Resolve data integrity and data validation issues

  • Produce ad-hoc queries and reports for non-standard requests from data scientists and data consumers

  • Become an SME on the full suite of solutions delivered by the Research Data Engineering team, with an eye to identifying, analyzing, and interpreting trends or patterns in order to surface new solution options, define process improvement opportunities, and generate value opportunities for our business partners


Qualifications


  • Bachelor's degree or equivalent work experience required

  • 6+ years of experience as a data engineer, software engineer, or similar

  • Strong experience building ETL pipelines and knowledge of ETL best practices

  • Experience with overall data architecture and data routing design

  • Familiarity with data quality control tools and processes

  • Strong communication skills and a keen attention to detail

Technical Qualifications:

Candidate is not expected to have expertise in all technical areas listed but should be highly proficient in several of these, including:

  • SQL, R, Python, MATLAB, SSIS, Pentaho/Kettle, Excel, Tableau, MongoDB, Kafka, Hive/Spark, Parquet

  • Experience with CI/CD, containers, and related frameworks: GitLab, Selenium, Docker, Kubernetes

  • Disciplines: Microservice Architecture, Design Patterns

  • Environment Tooling: Agile, JIRA, Confluence

  • Familiarity with RDBMS and/or NoSQL databases and related best practices

Nice to Have Qualifications:

  • Experience working in investment research and/or quantitative finance

  • Advanced degree or CFA

  • Development experience with R or Python in a data-science or research setting

  • Knowledge/Experience with financial data provider APIs (Bloomberg/FactSet/Datastream/MSCI)

  • Experience in EAGLE PACE Access and Oracle

  • Knowledge/Experience with the following technologies:
    • Symphony (STC)
    • .Net Core
    • Snowflake
    • Dataiku

  • Cloud and distributed computing experience

  • Big Data experience


BNY Mellon is an Equal Employment Opportunity/Affirmative Action Employer.
Minorities/Females/Individuals With Disabilities/Protected Veterans.

Our ambition is to build the best global team - one that is representative and inclusive of the diverse talent, clients and communities we work with and serve - and to empower our team to do their best work. We support wellbeing and a balanced life, and offer a range of family-friendly, inclusive employment policies and employee forums.

Primary Location: United States-Pennsylvania-Pittsburgh
Internal Jobcode: 45144
Job: Asset Management
Organization: Mellon With TOH ADJ-HR13428
Requisition Number: 2006649