Contract

Lead Cloud Engineer (GCP)

Posted on 25 March 26 by Suganya Prabhakar

  • Charlotte, NC

Job Description

Strategic Staffing Solutions is currently looking for a Lead Cloud Engineer for a W2 contract opportunity with one of our largest clients!

This is a W2 contract opportunity; candidates must be willing to work on our W2 only (no C2C).

Lead Cloud Engineer
Senior Engineer (S3): strong independent contributor with deep Spark/GCP experience

Location: Charlotte, NC
Type: W2 Contract, 12 months
Schedule: Hybrid, 3 days per week in office

Overview

This role supports the Model Risk Management platform used to run statistical risk models and large-scale data workloads. The engineer will help maintain and enhance a cloud-native platform used by statisticians and data scientists working with millions of customer accounts. The platform runs on Google Cloud Platform and supports distributed data processing using Apache Spark.

Experience with large-scale datasets, GCP services, Spark, and containerized microservices is essential. Experience in financial services is preferred but not required.

Key Responsibilities

Platform Engineering

  • Deploy, configure, and maintain OpenShift clusters or GCP projects for Spark workloads
  • Support platform capabilities for statistical model execution

Distributed Data Processing

  • Design and implement large-scale data processing workflows using Apache Spark
  • Tune Spark jobs using Kubernetes orchestration and auto-scaling

Application & Tooling

  • Build Python/Django services and microservices supporting platform users
  • Configure tooling and internal applications for data processing and model execution

Automation & CI/CD

  • Build and maintain CI/CD pipelines using GitHub Actions, Sonar, Harness, and Helm

Monitoring & Troubleshooting

  • Monitor Spark jobs and cluster health using Prometheus, Grafana, and GCP tools
  • Debug distributed systems and optimize resource utilization

Security & Compliance

  • Implement RBAC and encryption for data in transit and at rest

Collaboration

  • Work closely with cross-functional teams to define requirements and support deployments

Qualifications

  • 3+ years with Apache Spark
  • 2+ years Django development
  • 1+ years creating/maintaining conda environments
  • 2+ years with OpenShift/Kubernetes
  • Experience working with GCP or another major cloud provider

Technical Skills

  • Spark frameworks (PySpark, Scala, or Java)
  • OpenShift/Kubernetes administration
  • Docker container experience
  • Python, Scala, or Java programming
  • Experience with GitHub Actions, Helm, Harness
  • Knowledge of distributed systems and cloud storage (S3, GCS, HDFS)

Education

Bachelor’s degree in Computer Science, Engineering, or related field

“Beware of scams. S3 never asks for money during its onboarding process.”

Job Information

Rate / Salary

$ - $

Sector

Banking

Category

Not Specified

Skills / Experience

Not Specified

Benefits

Not Specified

Our Reference

JOB-245706

Job Location