Contract

Senior Data Engineer

Posted on 12 March 26 by Ricardo Rivas

  • WEST DES MOINES, IA
  • $0.00 - $0.00

Job Description

STRATEGIC STAFFING SOLUTIONS HAS AN OPENING!

This is a Contract Opportunity with our company that MUST be worked on a W2 basis only. There is no C2C eligibility for this position. Visa sponsorship is available! The details are below.

“Beware of scams. S3 never asks for money during its onboarding process.”

Job Title: Senior Data Engineer
Contract Length: 18+ months
Location: WEST DES MOINES, IA 50266
Work Schedule: 3 days onsite / 2 days remote

Ref# 245525

The Data Engineer will design, develop, and maintain data pipelines and ETL/ELT workflows supporting batch and real-time data processing. This role focuses on building scalable data infrastructure, implementing modern data storage solutions, and ensuring reliable data delivery for reporting and downstream applications.

Key Responsibilities

  • Design and develop ETL/ELT workflows and data pipelines for batch and real-time processing.
  • Build and maintain data pipelines for reporting and downstream applications using open source frameworks and cloud technologies.
  • Implement operational and analytical data stores leveraging Delta Lake and modern database concepts.
  • Optimize data structures for performance and scalability across large datasets.
  • Collaborate with architects and engineering teams to ensure alignment with target state architecture.
  • Apply best practices for data governance, lineage tracking, and metadata management, including integration with Google Dataplex for centralized governance and data quality enforcement.
  • Develop, schedule, and orchestrate complex workflows using Apache Airflow, including designing and managing Airflow DAGs.
  • Troubleshoot and resolve issues in data pipelines to ensure high availability and reliability.
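
To illustrate the batch ETL/ELT work the responsibilities above describe, here is a minimal, framework-free sketch of the extract-transform-load pattern in plain Python. All names and the in-memory CSV source are illustrative stand-ins, not details from this role:

```python
# Minimal batch ETL sketch: extract raw rows, transform them, load into a target.
# The in-memory CSV stands in for a real source system; the dict stands in for a
# warehouse table. All names here are hypothetical.
import csv
import io

def extract(source: str) -> list[dict]:
    """Read raw CSV rows from an in-memory source."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalize types and drop incomplete records."""
    out = []
    for row in rows:
        if row.get("amount"):  # skip rows missing the required field
            out.append({"id": row["id"], "amount": float(row["amount"])})
    return out

def load(rows: list[dict]) -> dict[str, float]:
    """'Load' the cleaned rows into a target keyed by id."""
    return {r["id"]: r["amount"] for r in rows}

raw = "id,amount\n1,10.5\n2,\n3,7.25\n"
target = load(transform(extract(raw)))  # row 2 is dropped as incomplete
```

In a production pipeline each stage would be a separately scheduled, monitored task against real storage (e.g. object store in, warehouse out); the staged structure is what makes troubleshooting and reruns tractable.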

Required Technical Skills

  • Strong understanding of data structures, data modeling, and lifecycle management.
  • Hands-on experience designing and managing ETL/ELT data pipelines.
  • Advanced PySpark skills for distributed data processing and transformation.
  • Experience implementing open table formats using Apache Iceberg.
  • Knowledge of the Hadoop ecosystem, including HDFS and Hive.
  • Experience with cloud platforms, including GCP (BigQuery, Dataflow), Delta Lake, and Dataplex for governance and metadata management.
  • Programming and orchestration using Python, Spark, and SQL.
  • Strong experience with Apache Airflow, including authoring and maintaining DAGs for complex workflows.
  • Strong understanding of relational and distributed database systems and reporting concepts.
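
The Airflow DAG authoring called for above comes down to declaring tasks and their dependencies, then letting the scheduler run them in a valid order. As a toy, stdlib-only illustration of that concept (not actual Airflow API, and with a hypothetical pipeline shape), dependency resolution can be sketched with `graphlib`:

```python
# Toy illustration of DAG-style orchestration, the idea behind Airflow DAGs:
# declare task dependencies, then execute in a valid topological order.
# The pipeline shape below is hypothetical.
from graphlib import TopologicalSorter

# Each key depends on the tasks in its set: extract -> transform -> two loads.
deps = {
    "transform": {"extract"},
    "load_report": {"transform"},
    "load_api": {"transform"},
}

# static_order() yields one valid execution order respecting all dependencies.
order = list(TopologicalSorter(deps).static_order())
```

A real Airflow DAG expresses the same graph with operators and `>>` dependency arrows, and adds scheduling, retries, and backfills on top of this ordering.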


Job Information

Rate / Salary

$0.00 - $0.00

Sector

IT/Software/Technology

Category

Not Specified

Skills / Experience

Technology and Data

Benefits

Not Specified

Our Reference

JOB-245525
