Company Description
👋🏼 We’re Nagarro.
We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people are everywhere in the world (18,000+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!
Job Description
REQUIREMENTS:
- Total experience of 9+ years
- Strong experience in Azure Data Engineering / Data Integration
- Deep hands-on expertise in Azure Data Factory (Must Have)
- Strong understanding of ETL / ELT concepts (Must Have)
- Hands-on experience with ADF pipelines, triggers, datasets, and linked services
- Experience with Mapping Data Flows and Integration Runtimes (Self-hosted & Azure)
- Strong experience with Azure Data Lake Storage (ADLS Gen2)
- Experience integrating on-premises and cloud data sources
- Strong SQL skills (Azure SQL / Synapse)
- Experience with Azure Synapse Analytics
- Familiarity with CI/CD pipelines using Azure DevOps
- Understanding of data orchestration, scheduling, and dependency management
- Experience with monitoring, logging, and error handling in ADF
- Good understanding of performance tuning and cost optimization
- Strong problem-solving and communication skills
- Good to have: strong Java skills
RESPONSIBILITIES:
- Architect and build scalable data integration solutions using Azure Data Factory
- Design, develop, and maintain ADF pipelines, triggers, datasets, and linked services
- Implement ETL / ELT workflows for batch and near real-time data processing
- Develop and optimize ADF Mapping Data Flows
- Integrate ADF with Azure services like Azure SQL, Synapse, Data Lake, and Blob Storage
- Handle performance tuning, monitoring, and cost optimization of pipelines
- Implement and manage CI/CD pipelines for ADF using Azure DevOps
- Troubleshoot production issues and perform root cause analysis
- Define architecture standards, best practices, and reusable frameworks
- Mentor and guide data engineers through hands-on support and code reviews
- Ensure data reliability, scalability, and high availability
- Collaborate with stakeholders to understand data integration requirements
Qualifications
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field