The Senior Data Engineer is responsible for designing, building, and maintaining large-scale, secure, and high-performance data pipelines supporting critical Financial Services workloads.
The role focuses on data modernization, regulatory data aggregation, and AI/ML enablement across domains such as Core Banking, Payments, Risk, Treasury, and Regulatory Reporting.
Key Responsibilities
Design, implement, and optimize ETL/ELT data pipelines using Apache Spark, PySpark, Databricks, or Azure Synapse.
Maintain CI/CD pipelines for data infrastructure using Azure DevOps / Terraform / GitHub Actions.
Required Technical Skills
Languages: Python, PySpark, SQL, Scala
Data Platforms: Azure Data Lake, Synapse, Databricks, Snowflake
Orchestration: Apache Airflow, Azure Data Factory, dbt
Streaming: Kafka, Confluent, Event Hubs
Governance: Apache Atlas, Azure Purview, Collibra
Security: Encryption, RBAC, Tokenization, Audit Logging
CI/CD & IaC: Terraform, Azure DevOps, GitHub Actions
Experience and Qualifications
6 – 10 years of experience in data engineering, with at least 3 years in BFSI (banking, financial services, or insurance, including capital markets).
Certifications preferred: Microsoft Azure Data Engineer Associate, Databricks Data Engineer Professional, Snowflake SnowPro Core.
Key Attributes
Strong analytical and problem-solving mindset.
Job Type: Contract
Contract length: 12 months
Pay: $5,000.00 – $7,000.00 per month
Work Location: In person