Company Description
About Nagarro
In a changing and evolving world, challenges are ever more unique and complex. Nagarro helps to transform, adapt, and build new ways into the future through a forward-thinking, agile, and caring mindset. Today, we are 18,000+ experts across 37+ countries, forming a Nation of Nagarrians, ready to help our customers succeed.
The nature of IT & digital product engineering has reached an incredible state of velocity and transition. We must adapt and meet it with an agile mindset that isn’t afraid to iterate towards the perfect solution. If we only solve today’s problems, it’s not enough. We must do more. We must courageously embrace the future, with vision and clarity about where technology & business are heading. Thinking breakthroughs gets us there.
Nagarro - https://www.nagarro.com/en
Job Description
Job Purpose:
The purpose of the Data Engineer is to leverage their expertise in data and data-related technologies, in line with the Data Architecture Roadmap, to advance technical thought leadership for the Enterprise, deliver fit-for-purpose data products, and support data initiatives. In addition, Data Engineers enhance the bank's data infrastructure to enable advanced analytics, machine learning, and artificial intelligence by providing clean, usable data to stakeholders. They also build data pipelines (ingestion, provisioning, streaming, self-service, and APIs) and big data solutions that support the Bank's strategy to become a data-driven organization.
Job Responsibilities:
- Responsible for the day-to-day maintenance of data pipelines and for supporting the data squads, including performing data-related tasks in a Data Epic (such as data profiling, data cleaning, data configuration, data support, data validation, and data quality assurance), assisting with basic data pipelines and data ingestion, and supporting Data Engineers in their Epics.
- Data Infrastructure: Support and maintain the Data Infrastructure to ensure it is secure, available and reliable.
- Data Pipeline Build (Ingestion, Provisioning, Streaming, and API): Maintain data pipelines, starting with data virtualization, then progressing to data ingestion, and finally to data provisioning. Ensure that data pipelines are monitored and run successfully, and configure and support minor changes to them.
- Data Virtualization: Create virtual databases and assist in creating data extracts for the business in response to business needs.
- Documentation and Data Analysis: Collaborate with the Data Analyst to perform data profiling, data validation, and data documentation in support of Epics.
- Data Warehouse Monitoring and Support: Monitor data pipelines and infrastructure to provide first-line support, resolve issues, and ensure that the warehouse meets its SLA timelines for data availability and reliability.
- Cloud Monitoring and Support Services: Monitor and manage cloud processes (compute and storage) daily to ensure that cloud pipelines run successfully.
- Operations: Support and run daily operational reports to confirm that all jobs ran successfully and that the data warehouse is maintained to standard.
- Collaboration: Work collaboratively with business stakeholders to gain business knowledge, understand data extracts and improve and optimize business queries.