Cadre AI
Head of Data Engineering

San Diego, CA  ·  In-Person  ·  Full-Time

About Cadre AI

Cadre AI is an AI strategy and integration firm that builds production AI systems for B2B companies in private equity, wholesale lending, real estate, and SaaS. We don’t build decks about what AI could do. We ship systems that move revenue, compress costs, and automate the work that used to take entire teams.

The Role

The Head of Data Engineering will build and lead Cadre AI's Data Engineering practice from the ground up. This is a senior, hands-on leadership role responsible for architecting and scaling multi-tenant Snowflake data warehouse solutions—both for our internal operations and as a core service offering for our clients.

You are the technical authority on all things data infrastructure. You work directly with clients to design cloud-native data platforms, build modern ELT/ETL pipelines, and establish data governance frameworks. As the practice grows, you will recruit and mentor a team of data engineers, evolving this function into a standalone revenue-generating practice within Cadre AI.

You can whiteboard a Snowflake architecture in the morning, pair with engineers on dbt models in the afternoon, and present a data strategy roadmap to a client's leadership team by end of day. That range is the job.

What You’ll Do

Architect and Build Data Infrastructure
  • Design, build, and optimize multi-tenant Snowflake data warehouse architectures for Cadre AI and its clients—ensuring scalability, security, and cost efficiency
  • Develop and maintain modern ELT/ETL pipelines using dbt, Airflow, Fivetran, and custom Python-based ingestion frameworks
  • Implement data modeling best practices—star schema, snowflake schema, Data Vault—tailored to each client’s analytical and operational needs
  • Establish data governance, quality, and lineage frameworks across multi-client environments
  • Drive cloud infrastructure decisions on AWS, Azure, or GCP with a focus on Snowflake-native capabilities including Snowpark, Cortex, Streamlit, and Snowpipe
  • Build repeatable reference architectures, accelerators, and templates that can be deployed across client engagements to improve delivery speed and consistency
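To make the multi-tenant isolation point above concrete, here is a minimal sketch of one common pattern: each client gets its own Snowflake database, role, and warehouse so that data access and compute costs stay cleanly separated per tenant. All names and conventions below are illustrative assumptions, not Cadre AI's actual standards.

```python
# Hypothetical sketch of per-tenant isolation DDL for a multi-tenant
# Snowflake deployment. Object naming is an illustrative assumption.

def tenant_ddl(tenant: str) -> list[str]:
    """Return DDL giving one client its own database, role, and
    warehouse, so data isolation and cost allocation fall out of the
    object model rather than ad-hoc grants."""
    t = tenant.upper()
    return [
        f"CREATE DATABASE IF NOT EXISTS {t}_DB;",
        f"CREATE ROLE IF NOT EXISTS {t}_ANALYST;",
        f"CREATE WAREHOUSE IF NOT EXISTS {t}_WH WITH WAREHOUSE_SIZE = 'XSMALL';",
        # Scoping grants to tenant-specific objects keeps one client's
        # role from ever seeing another client's data or compute.
        f"GRANT USAGE ON DATABASE {t}_DB TO ROLE {t}_ANALYST;",
        f"GRANT USAGE ON WAREHOUSE {t}_WH TO ROLE {t}_ANALYST;",
    ]

if __name__ == "__main__":
    for stmt in tenant_ddl("acme"):
        print(stmt)
```

Because each warehouse is tenant-named, per-client cost allocation becomes a straightforward query against Snowflake's warehouse metering history rather than a tagging exercise after the fact.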

Lead Client Delivery and Consulting
  • Serve as the senior technical advisor and trusted consultant to clients on data strategy, architecture, and implementation
  • Lead discovery sessions, technical assessments, and data maturity evaluations for prospective and current clients
  • Translate complex business requirements into scalable data solutions and present technical roadmaps to executive stakeholders
  • Provide executive oversight on multi-client data engineering programs—ensuring projects are delivered on time, within scope, and at high quality
  • Support pre-sales efforts including scoping, estimation, proposal development, and technical solution design

Build and Scale the Practice
  • Build the Data Engineering practice from the ground up—define the service offering, pricing model, delivery methodology, and team structure
  • Recruit, hire, and mentor data engineers as the practice grows, establishing a high-performance team culture grounded in technical excellence and client service
  • Develop and maintain internal knowledge bases, playbooks, and training materials for data engineering best practices
  • Collaborate with Cadre AI’s pod leads, AI engineers, and solutions architects to integrate data engineering into broader AI transformation engagements
  • Track practice KPIs including utilization, revenue, client satisfaction, and delivery quality

Drive Thought Leadership and Ecosystem Presence
  • Represent Cadre AI as a subject matter expert in the Snowflake and modern data stack ecosystem
  • Build and maintain relationships with Snowflake account executives, partner managers, and solution engineers
  • Contribute to Cadre AI’s brand through blog posts, conference talks, community engagement, and technical content
  • Stay current on emerging data technologies and evaluate their applicability for client solutions—Databricks, Microsoft Fabric, BigQuery, Iceberg/Delta Lake, and beyond

Who You Are

  • 8+ years of professional experience in data engineering, data architecture, or a closely related technical role
  • 3+ years of hands-on experience with Snowflake as a primary data platform, including advanced features: Snowpark, Snowpipe, Tasks, Streams, and Dynamic Tables
  • 3+ years in a client-facing consulting, professional services, or agency environment—you know how to earn trust and deliver under pressure
  • Deep expertise in SQL, Python, and modern data transformation tools (dbt strongly preferred)
  • Strong experience with cloud platforms—AWS preferred; Azure and GCP also valued—including infrastructure-as-code tools like Terraform
  • Proven experience designing multi-tenant data architectures with robust access control, data isolation, and cost allocation
  • Experience with data pipeline orchestration tools such as Airflow, Dagster, Prefect, or Databricks Workflows
  • Demonstrated ability to lead technical teams and grow a practice or function from early stage
  • Exceptional communication skills—you can present to a C-suite executive and pair with a junior engineer in the same day
  • Strong understanding of data governance, data quality frameworks, and regulatory compliance including SOC 2, GDPR, and CCPA

What Sets You Apart

  • Snowflake SnowPro Advanced certifications (Architect or Data Engineer)
  • Experience with Snowflake Cortex AI, Streamlit in Snowflake, and AI/ML data preparation workflows
  • Familiarity with complementary platforms: Databricks, Microsoft Fabric, Redshift, or BigQuery
  • Experience building data products or analytics-as-a-service offerings for external customers
  • Domain experience in mortgage/financial services, IoT, SaaS, or professional services—industries where the data complexity is real
  • Experience with real-time data pipelines using Kafka or Kinesis and streaming architectures
  • Track record of contributing to the data community through speaking, open-source contributions, or published content
  • Experience managing P&L responsibility or practice-level financial metrics in a consulting environment

Why Cadre AI

  • Ground-floor ownership. You build the Data Engineering practice from scratch—the service offering, the team, the methodology. This is yours to define.
  • Direct access to leadership. You work alongside the co-founders and have a real seat at the table in shaping company strategy.
  • High-impact client work. Diverse engagements across industries with real AI transformation problems. You won’t be maintaining someone else’s legacy stack.
  • AI-native culture. We don’t just build AI for clients. We use it to run our own operations. You’ll work with people who are as obsessed with the tools as you are.
  • Upside. Competitive compensation with performance-based upside as the practice scales. Early team members share in the success they help create.
  • No bureaucracy. Small pods. Clear accountability. The best idea wins, regardless of who says it.

Cadre AI is building the future of how companies adopt and operate AI. We believe the best data systems come from engineers who understand both the infrastructure and the business outcomes it enables. If that's how you think, we want to talk.

Compensation

The base pay range for this role is $140,000 – $160,000 per year.

Top Skills

Airflow
AWS
Azure
dbt
Fivetran
GCP
Python
Snowflake
SQL
Terraform
