Job description
DataOps Team Lead
Who we are:
Adaptiq is a technology hub specializing in building, scaling, and supporting R&D teams for high-end, fast-growing product companies in a wide range of industries.
About the Product:
Bigabid is on a mission to fuel mobile app growth through AI-powered prediction and precision targeting. With over 50TB of data processed daily, 4M+ requests per second, and 1B+ unique users reached weekly, they’re scaling like the biggest names in tech - but with startup agility.
At Bigabid, they’re building intelligent systems that predict which mobile apps users will love - and connect them at the perfect moment. Their real-time user modeling engine continuously ingests and analyzes massive streams of behavioral data to power highly targeted ad campaigns with exceptional performance. As a result, they are seeing remarkable growth and close to zero customer churn. To support their hyper-growth and continue propelling the growth of some of the biggest names in the mobile industry, they offer a wide range of opportunities for different skill levels and experiences.
About the Role:
We’re now looking for a DataOps Team Lead to own the stability, observability, and quality of the data pipelines and processes. This is a 50/50 hands-on and leadership role, perfect for someone who thrives at the intersection of data engineering, operational excellence, and cross-functional collaboration.
You’ll be part of the mission-critical DataOps layer, ensuring data health and reliability across a product that directly influences business outcomes. You’ll support the engine that empowers some of the world’s top app developers to grow smarter and faster.
Key Responsibilities:
- Lead a DataOps team of 2 (and growing), with ownership over Bigabid’s core data quality and observability processes
- Build, maintain, and monitor robust data pipelines and workflows (Python + Airflow)
- Act as the go-to person for identifying and resolving data issues affecting production systems
- Coordinate with multiple teams: Data Engineering, Product, BI, Operations, Backend, and occasionally Data Science
- Own projects such as metadata store development, anomaly detection systems, and scalable data quality frameworks
- Balance strategic project leadership with hands-on scripting, debugging, and optimizations
- Promote a culture of quality, reliability, and clear communication in a fast-moving, high-volume environment
Required Competence and Skills:
- 3+ years of experience in data engineering/data operations, with at least 1 year of team or project leadership
- Proficient in Python for scripting and automation (clean, logical code – not full-stack development)
- Strong experience with Airflow (hands-on, not through abstraction layers)
- Solid understanding of SQL and NoSQL querying, schema design, and cost-efficient querying (e.g., Presto, document DBs)
- Exposure to tools like Spark, AWS, or similar is a big plus
- Comfortable managing incident escalation, prioritizing urgent fixes, and guiding teams toward solutions
- Analytical, communicative, and excited to work with smart, mission-driven people
Nice-to-Have Skills:
- Previous experience as a NOC or DevOps engineer
- Familiarity with PySpark
Why Us?
- We provide 20 days of vacation leave per calendar year (plus official national holidays of the country you are based in)
- We provide full accounting and legal support in all countries where we operate
- We utilize a fully remote work model, with a powerful workstation and a co-working space in case you need it
- We offer a highly competitive package with yearly performance and compensation reviews