Sr Snowflake Developer

Posted on 24 May 23 by Wendi Meiser

  • Owings Mills, MD

Job Description


Owings Mills, MD (2 days onsite, 3 days remote. Must be onsite day 1)


Compass Pointe has partnered with a global financial company in the Owings Mills, MD area that is looking for a Sr Snowflake Developer.

• Knowledge of the SQL language and cloud-based technologies (10 years of SQL experience).
• Data warehousing concepts, data modeling, metadata management.
• Data lakes, multi-dimensional models, data dictionaries.
• Performance tuning and setting up resource monitors.
• Snowflake modeling – roles, databases, schemas.
• SQL performance measuring, query tuning, and database tuning.
• ETL tools with cloud-driven skills.
• Ability to build analytical solutions and models.
• Root cause analysis of models with solutions.
• Hadoop, Spark, and other big data and warehousing tools.
• Managing sets of XML, JSON, and CSV from disparate sources.
• SQL-based databases such as Oracle, SQL Server, and Teradata.
• Snowflake warehousing, architecture, processing, administration.
• Data ingestion into Snowflake.
• Enterprise-level technical exposure to Snowflake applications.


  • 10+ years of experience.
  • 5+ years of experience with Snowflake.
  • Create, test, and implement enterprise-level applications with Snowflake.
  • Design and implement features for identity and access management.
  • Create authorization frameworks for finer-grained access control.
  • Implement query optimizations and core security capabilities such as encryption.
  • Resolve performance and scalability issues in the system.
  • Manage transactions with distributed data processing algorithms.
  • Take ownership of work from start to finish.
  • Build, monitor, and optimize ETL and ELT processes and data models.
  • Migrate solutions from on-premises setups to cloud-based platforms.
  • Understand and implement current delivery approaches based on the data architecture.
  • Document and track projects based on user requirements.
  • Perform data integration with third-party tools, including the architecture, design, coding, and testing phases.
  • Manage documentation of data models, architecture, and maintenance processes.
  • Continually review and audit data models for enhancement.
  • Maintain data pipelines built on ETL tools.
  • Coordinate with BI experts and analysts on customized data models and integrations.
  • Update code, develop new code, and reverse-engineer existing code.
  • Perform performance tuning, user acceptance testing, and application support.
  • Maintain confidentiality of data.
  • Prepare risk assessment, management, and mitigation plans.
  • Engage regularly with teams for status reporting and routine activities.
  • Perform migrations from one database to another or from on-premises to the cloud.
