Requirements
- Bachelor’s degree in Computer Science, Computer Engineering, or a related field (specialization in Software Engineering is a plus).
- 2–3 years of experience in data engineering or software engineering, with expertise in data warehousing, big data platforms, cloud technologies, and automation tools.
- Strong data analysis, data verification, and problem-solving abilities.
- Analytical, meticulous, and a team player.
- Effective communication skills for collaboration across teams.
- Ability to manage multiple tasks in a dynamic environment.
- Self-motivated, with the initiative to learn new skills and technologies.

Technical Skills required:
- Proficiency in data warehouse design, including relational databases (MS SQL Server), NoSQL, and ETL pipelines using Python or ETL tools (e.g., Microsoft SSIS, Informatica IPC), along with data warehousing concepts, database optimization, and data governance.
- Familiarity with Python web application and API development tools (e.g., Flask, Requests) and web scraping tools (e.g., BeautifulSoup, Scrapy).
- Skilled in Power BI, including DAX and Power Query, for creating reports and dashboards.
- Experience architecting and implementing Microsoft Azure services, including Azure Data Factory, Data Lake Storage, App Service, and Azure SQL, as well as CI/CD pipelines using Azure DevOps.
- Knowledge of machine learning tools (e.g., AutoML platforms such as Azure AutoML or DataRobot) and ML libraries (e.g., Scikit-learn, TensorFlow, PyTorch, Keras).
- Familiarity with big data technologies (e.g., Hadoop, Hive, Spark) and the Databricks platform.