Data Engineer #142074

Posted on 23 April 25 by David Parker

  • Lansing, MI
  • $ - $


Job Description

Title: Data Engineer #142074
Type: Contract

Location: Lansing, MI (onsite 2 days a week, remote 3 days)

Client: State of Michigan

Interview Process: Candidates submitted must be willing to come onsite (Lansing, MI) for interviews. The manager may request virtual and/or onsite interviews.

CANDIDATES MUST BE LOCAL TO MID-MICHIGAN

 

As a technical lead, the resource participates in a variety of analytical assignments that provide for the enhancement, integration, maintenance, and implementation of projects. The resource will also provide technical oversight to other developers on the team who support other critical applications. Without this resource on staff, MDHHS would be unable to maintain, enhance, and support the modernized MDSS, which could lead to errors causing application outages and data integrity issues, and eventually to incorrect processing and reporting of patient information.

Position Summary

  • Lead the design and development of scalable, high-performance solutions using AWS services.
  • Experience with Databricks, Elasticsearch, Kibana, and S3.
  • Experience with Extract, Transform, and Load (ETL) processes and data pipelines.
  • Write clean, maintainable, and efficient code in Python/Scala.
  • Experience with AWS cloud-based application development.
  • Experience with Electronic Health Records (EHR) HL7 solutions.
  • Implement and manage the Elasticsearch engine for efficient data retrieval and analysis.
  • Experience with data warehousing, data visualization tools, and data integrity.
  • Execute the full software development life cycle (SDLC), including gathering requirements and writing functional/technical specifications for complex projects.
  • Excellent knowledge of designing both logical and physical database models.
  • Develop database objects, including stored procedures and functions.
  • Extensive knowledge of source control tools such as Git.
  • Develop software design documents and work with stakeholders for review and approval.
  • Exposure to flowcharts, screen layouts, and documentation to ensure logical flow of the system requirements.
  • Experience working on large agile projects.
  • Experience with or knowledge of creating CI/CD pipelines using Azure DevOps.

Skills Needed:

  • 12+ years developing complex database systems.
  • 8+ years with Databricks.
  • 8+ years using Elasticsearch and Kibana.
  • 8+ years using Python/Scala.
  • 8+ years with Oracle.
  • 5+ years of experience with Extract, Transform, and Load (ETL) processes and developing data pipelines.
  • 5+ years of experience with AWS.
  • 5+ years of experience with data warehousing, data visualization tools, and data integrity.
  • 5+ years using CMM/CMMI Level 3 methods and practices.
  • 5+ years implementing agile development processes, including test-driven development.
  • 3+ years of experience with or knowledge of creating CI/CD pipelines using Azure DevOps (nice to have).

Job Information

Rate / Salary

$ - $

Sector

Not Specified

Category

Not Specified

Skills / Experience

Not Specified

Benefits

Not Specified

Our Reference

JOB-8365
