Are you a passionate ETL Developer with expertise in cloud-based data engineering, automation, and large-scale data processing?
Join a dynamic team where you'll play a key role in designing, developing, and optimizing enterprise-level data solutions.
- Work on cutting-edge technologies in cloud, big data, and automation.
- Collaborate with a high-performing team in a fast-paced environment.
- Shape the future of data engineering with innovative solutions.
What You’ll Do
- Design, develop, test, and maintain ETL/ELT data pipelines.
- Develop high-quality, scalable code in Python, SQL, Unix Shell, and PySpark.
- Work with AWS and Azure, leveraging services and tooling such as S3, EC2, EMR, Lambda, Redshift, Databricks Delta Lake, and Terraform.
- Implement CI/CD pipelines using Jenkins, Docker, Kubernetes, Ansible, and GitHub.
- Optimize data models and database performance for data warehouses and data lakes.
- Collaborate with cross-functional teams and mentor junior developers.
- Troubleshoot, debug, and enhance existing systems for better efficiency.
What We’re Looking For
- MUST have 8+ years of technical experience in banking (financial services experience is a plus).
- Experience and skills in home lending.
- Strong background in ETL/ELT: data extraction, transformation, and integration.
- Hands-on expertise in cloud-based data engineering (AWS and Azure).
- Experience with big data technologies (Hadoop, Spark, Hive, Snowflake, Redshift).
- Solid knowledge of SQL, PL/SQL, Postgres, MySQL, Oracle, or DB2.
- Familiarity with data modelling (star schema, Data Vault 2.0).
- Data migration experience with SSIS and SSRS (desirable).
- Strong problem-solving, analytical, and communication skills.
- Experience working in Agile development environments.
- Experience with Power BI and other data visualization tools.
- Exposure to streaming data processing and serverless architectures.
- Hands-on programming experience writing Python, SQL, Unix Shell, and PySpark scripts in a complex enterprise environment.
- Experience with Terraform, Kubernetes, and Docker.
- Experience with source control tools such as GitHub or Bitbucket.