Contract

Data Architecture Lead – Cloud, ELT & Analytics (Director level) 

Posted on 20 June 2025 by Gregory Morganoff

  • REMOTE

Job Description

Data Architecture Lead – Cloud, ELT & Analytics (Director level) 

Contract to Hire
Remote - with offices in NYC for those who prefer hybrid or onsite
EST hours


Position Summary

We’re seeking a highly technical, hands-on Data Architecture and Integration Lead to drive the design, development, and evolution of an enterprise data ecosystem. This role is ideal for a seasoned engineer and architect who enjoys writing code, building scalable data platforms, and shaping long-term data strategy.

This is a senior individual contributor role: you'll partner closely with executive leadership to align enterprise data architecture with strategic business goals while remaining deeply hands-on in the design and implementation of modern data solutions. You'll architect and implement modern data pipelines, optimize data flows, and lead the adoption of advanced capabilities such as data fabric, LLMs, and AI-enabled metadata - all while directly contributing to codebases and solution design.

This position requires deep technical fluency with Snowflake, AWS, Informatica, APIs, and enterprise data platforms. You’ll serve as a strategic engineer-leader who bridges business needs with technical execution - designing models, coding pipelines, mentoring developers, and evangelizing modern architecture principles across the organization.

Key Responsibilities

  • Engage directly with senior leadership and cross-functional executives to translate business objectives into scalable data architecture strategies and drive alignment across technology, analytics, and operational priorities.
  • Architect and implement modern, cloud-native data pipelines (batch and streaming) using Snowflake, AWS, Informatica, and Python.
  • Design enterprise-wide data models and domain architectures, integrating diverse systems like Salesforce, NetSuite, and SAP into a unified, governed data platform.
  • Develop and maintain API-driven data ingestion and integration solutions, including REST/SOAP-based services and event-driven architectures.
  • Lead the hands-on implementation of ELT/ETL workflows, data validation, testing, and performance optimization across mission-critical applications.
  • Champion and build out data fabric and active metadata capabilities, leveraging AI and LLMs for data discovery, lineage, and self-service.
  • Collaborate with data engineers, analysts, and product stakeholders to translate business requirements into scalable technical solutions.
  • Mentor and guide junior engineers and ETL developers, introducing engineering best practices, code reviews, and continuous integration/deployment pipelines.
  • Own and improve operational support processes for all data flows, ensuring reliability, traceability, and auditability.
  • Establish standards for data governance, lineage, versioning, and documentation across all layers of the stack.

Required Qualifications

  • 10+ years of hands-on experience in data architecture, data engineering, and enterprise-scale ETL/ELT development.
  • Proven expertise architecting and deploying solutions on Snowflake, AWS (Lambda, Glue, S3, Kinesis), and Informatica.
  • Deep knowledge of SQL, Python, and scripting for automation, validation, and transformation logic.
  • Strong background integrating platforms like Salesforce, SAP Commerce Cloud, NetSuite, and third-party APIs into centralized data warehouses.
  • Experience implementing data fabric, semantic modeling, and active metadata systems (e.g., Collibra, Alation).
  • Familiarity with AI/ML and LLM-based tooling for metadata discovery, query generation, and pipeline optimization.
  • Track record of owning architecture and engineering delivery end-to-end, from whiteboard to production.
  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or related discipline.
  • Snowflake SnowPro Core or Data Engineer certification is a strong plus.

Preferred Attributes

  • Passion for working with data at scale and enabling others to use it responsibly.
  • Ability to thrive in a hybrid strategic/technical role where coding, mentoring, and influencing coexist.
  • Experience building CI/CD workflows, unit testing, and version-controlled data transformation pipelines (e.g., with dbt, GitHub Actions).
  • Strong communication skills and comfort presenting architecture to technical and non-technical audiences.

Job Information

Rate / Salary

$ - $

Sector

Data Engineering

Category

Data Engineering for Insights & Analytics

Skills / Experience

Not Specified

Benefits

Not Specified

Our Reference

JOB-11972
