Key Responsibilities:
- Design and implement scalable data models and transformation pipelines using dbt on Snowflake.
- Write efficient and maintainable PL/SQL code for complex data processing and transformation tasks.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions.
- Optimize Snowflake performance through query tuning, clustering, and resource management.
- Ensure data quality, integrity, and governance through testing, documentation, and monitoring.
- Participate in code reviews, architecture discussions, and continuous improvement initiatives.
- Maintain and enhance CI/CD pipelines for dbt projects.
Required Qualifications:
- 3+ years of experience in data engineering or a related field.
- Strong hands-on experience with dbt (modular SQL development, testing, documentation).
- Proficiency in Snowflake (data warehousing, performance tuning, security).
- Advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages.
- Solid understanding of data modeling concepts (star/snowflake schemas, normalization).
- Experience with version control systems (e.g., Git) and CI/CD practices.
- Familiarity with orchestration tools (e.g., Airflow, dbt Cloud, Prefect) is a plus.
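For candidates unfamiliar with dbt's modular style, a minimal sketch of the kind of staging model this role involves (model, source, and column names are hypothetical, not from an actual project):

```sql
-- models/staging/stg_orders.sql (hypothetical example)
-- {{ ref() }} lets dbt compile this into a fully qualified Snowflake
-- table name and build a dependency graph between models, so downstream
-- marts rebuild in the correct order.
select
    order_id,
    customer_id,
    order_date,
    amount
from {{ ref('raw_orders') }}
where order_id is not null
```

Column-level expectations (e.g. `unique` and `not_null` on `order_id`) would typically be declared in the model's accompanying YAML file and run via `dbt test`, which is what the "testing, documentation" qualification above refers to.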
Job Type: Contract
Contract length: 12 months
Pay: $5, $6,500.00 per month
Work Location: In person