What is BlockFi?
BlockFi's mission is to provide liquidity, transparency and efficiency to digital financial markets by creating products that meet the needs of consumers and corporations across the globe. We build bridges between traditional finance and digital markets that enable growth for all participants.
We're a team of builders and strivers, proud to champion financial inclusivity and offer economic opportunities around the globe. We extend that same inclusivity to our team members: BlockFi is a place where diversity is celebrated, individuality is recognized, and every single team member is valued. We are changing the status quo to become the first financial company that people love, and we rely on our people to make it happen!
BlockFi is looking for a Senior Data Engineer to join our growing team!

About the Team
The Data Platforms organization is responsible for the end-to-end data needs of BlockFi products and services and is composed of the following teams: Data Engineering, Data Strategy, Master Data Management, and Machine Learning Engineering. The Senior Data Engineer will be part of the Enterprise Data Engineering organization, serving all of BlockFi's data needs. The Data Engineering Team designs, builds, and supports the data platforms, products, pipelines, and governance frameworks that power analytics, business insights, data science, and machine learning. We aim to make data a competitive advantage for BlockFi by empowering our business partners with industry-leading insights and tools so that they can make fast, bold decisions with trusted data to create unsurpassed client experiences and grow our market share. We enable automation at scale that helps reduce risk, improve speed, and eliminate manual processes.

Your Mission
As a Senior Data Engineer, you will build the big data platforms, data architectures, data products, and data pipelines needed to enable data analysts, data scientists, and other coworkers across BlockFi to make data-driven decisions. You will design and implement technical solutions and mentor junior engineers. We are looking for proactive, collaborative, and adaptive engineers with real-world experience building distributed systems at scale.

Architect/Design:
- Work with diverse stakeholders to ensure our data platforms are built for availability, reliability, resilience, scalability, performance, and security from the ground up.
Deliver and Own Solutions:
- Write design proposals and review proposals from other data engineers. Ensure tradeoffs are clearly and publicly documented, and that designs are aligned with business goals.
- Responsible for creating and executing on plans and designs end to end: estimating, prototyping, implementing, testing, maintaining, debugging, and supporting high-quality software in production.
- Work with stakeholders to understand and document both functional and quality attribute requirements.
- Adhere to quality standards through cross-team communication, mentoring, code review, and backlog grooming.
- Accountable for system availability and monitoring system health; ensure alerts, metrics, and runbooks are in place; and debug issues in production.
- Quickly learn new tools and technologies, develop an understanding of existing systems, and identify and tackle high impact work.
- Proactively seek to learn about the company, products, processes, and culture. Align technical decisions with business goals.
- Technical Breadth as well as Depth in Several Areas: 5+ years of experience as a data engineer with extensive experience in architecting data warehouses, data lakes, and data platforms for consumption by analytics and data science/ML. Extensive experience in building data pipelines (batch ETL, micro batches, and real-time streaming).
- Technical Ownership: Experience owning data platforms end-to-end (from data generation/ingestion to curated/aggregated layers): designing, estimating, implementing, testing, maintaining, debugging, and supporting high-quality software in production. Experience building foundational, curated, and aggregated data layers that enable self-service business intelligence (easily consumable by non-technical users).
- Communication: Excellent communication, presentation and interpersonal skills.
- Collaboration: Empathetic; does the legwork required to build consensus. Always seeks out feedback on technical designs, solutions, and code.
- Initiative and focus on outcomes: Works independently and takes initiative while maintaining transparency and collaboration. Can deliver high quality solutions without assistance. Proactively identifies problems and comes to conversations with possible solutions.
- Adaptive: Ability and motivation to quickly learn new languages, technologies and tools. Pragmatic bias toward outcomes, and technical decisions that solve real business problems.
Successful candidates will have:
- Strong knowledge of cloud data platforms (e.g., AWS, GCP, Azure, or Snowflake)
- Strong skills in Python, ETL transformations, data modeling, & feature engineering
- Experience with SQL adapters such as Ecto and managing SQL schema changes with code
- Experience with AWS cloud services: S3, EC2, RDS, Aurora, Redshift (or other cloud services)
- Experience with real-time stream-processing systems: Kinesis, EventBridge, Confluent, Kafka or similar.
- Advanced SQL knowledge, including query authoring and experience with relational databases, as well as working familiarity with a variety of databases.
- Experience building and optimizing data pipelines, architectures and data sets.
- Strong business acumen, critical thinking, technical ability, and problem-solving skills.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Experience building processes that support data transformation, data structures, metadata, dependency management, and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
We benefit from the great work our employees do each day. That is why we are committed to providing a variety of awesome benefits to help them live their best lives.
- Competitive Compensation because we value your experience and expertise
- Unlimited vacation / sick days because everyone deserves time for R&R
- Flexible work environment because we are a geographically dispersed team and we believe in balance
- A close-knit team of enthusiastic, collegial and driven people to work alongside in a highly meritocratic environment because teamwork makes the dream work
BlockFi has experienced incredible growth since our launch in August 2017. Our client base has grown to more than 225,000 (and counting), and the company now boasts more than $15 billion in assets on our platform. We recently completed a Series D funding round placing the company's valuation at $3 billion, and our team now has more than 500 people worldwide. We have established ourselves as a crypto market leader, and as we expand our product suite and geographic footprint, we expect our addressable market to grow exponentially.
BlockFi's leadership team has decades of experience in the traditional financial services and banking world, and we take a conservative approach to regulation that will position us well for sustainable long-term growth and expansion.
Our team is composed of highly motivated professionals from diverse backgrounds. We aim to become the leading lender in crypto and are poised to redefine the global financial ecosystem for the better. In addition:
- BlockFi is one of the first companies to ever offer crypto-backed loans and the only company whose founding team has an institutional understanding of the debt capital markets and regulatory landscape in the U.S.
- $100 MM of Series A, B, and C funding led by Valar Ventures with participation from Susquehanna, Winklevoss Capital, Fidelity, Galaxy Digital, Akuna Capital, and Morgan Creek
- $350 MM of Series D funding led by Bain Capital Ventures, partners of DST Global, Pomp Investments and Tiger Global
- We are moving quickly and have already deployed substantial capital into the space, proving our ability to execute and capture customer demand