Job Description
Capstone IT is helping our client find a Data Engineer to lead the modernization of their data pipeline and architecture. The role focuses on migrating existing pipelines from Informatica and Teradata to modern cloud-based technologies such as AWS, Snowflake, and MongoDB. The engineer will be hands-on with data modeling and ETL/ELT, and will play a key role in driving cloud data solutions using Python and document databases. This is a role for someone with strong technical acumen, an innovative mindset, and a passion for modern data technologies.
Top 4 Skills
- AWS & Snowflake Data Architecture
- ETL/ELT and Data Warehousing (Teradata, Informatica)
- Python Programming for Data Engineering
- MongoDB and Document Database Knowledge
Responsibilities
- Lead cloud data modernization from Teradata/Informatica to AWS and Snowflake
- Design and implement scalable data models and warehouses in Snowflake
- Develop clean, testable, and documented code following defined standards
- Collaborate in Agile ceremonies and proactively contribute to team goals
- Maintain data accuracy and availability, and meet non-functional requirements (NFRs)
- Participate in the on-call rotation after conversion to permanent hire
- Adhere to security and communication protocols
Qualifications
- Strong experience in AWS, Snowflake, and cloud data pipelines
- 8+ years of experience in Data Engineering
- Background in data warehousing and ETL tools (Teradata, Informatica)
- Proficiency in Python and scripting for data workflows
- Experience with MongoDB or other document-based databases preferred
- Familiarity with Agile development methodologies
- Strong communicator and collaborator