Job Category: Engineering
Job Type: Full Time
Job Location: Kirulapone
Responsibilities:
- Gather business requirements, assess data gaps, and design scalable data architectures.
- Develop and implement data pipelines for data lakes and warehouses to support analytics.
- Build and manage ETL/ELT workflows ensuring data accuracy and cost efficiency.
- Perform data profiling, cleansing, and transformation for reporting.
- Monitor and optimize data warehouse performance, resolving bottlenecks.
- Ensure data security, compliance, and governance.
- Collaborate with Data Analysts to provide clean, structured data for reporting.
- Maintain and troubleshoot data pipelines, proactively addressing issues.
- Stay up to date with industry trends and best practices in data engineering and analytics.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 4+ years of experience with ETL/ELT pipelines, streaming data, and orchestration frameworks.
- Expertise in big data platforms, hybrid architectures, and cloud-native tools.
- Proficiency in SQL, Python, and Java, with strong experience in Azure Databricks and SSIS automation. Familiarity with Azure Data Factory and Azure Synapse is a plus.
- Experience designing data models and working with large, complex datasets.
- Strong problem-solving, troubleshooting, and communication skills.
- Ability to work independently and manage projects effectively.
- Experience with Power BI or other visualization tools is a plus.
- Knowledge of auditing, logging, and monitoring of ETL operations.