Data Engineer / Apache Airflow Specialist

🇺🇦 Ukraine - Remote
📊 Data · 🔵 Mid-level

Job description

Intetics Inc. is a global technology company providing custom software application development, distributed professional teams, software product quality assessment, and "all-things-digital" solutions. Based on its proprietary Remote In-Sourcing® business model, advanced Technical Debt Reduction Platform (TETRA™), and measurable quality management platform (Predictive Software Engineering), Intetics enables clients to achieve measurable business results.

Position Details

Position: Data Engineer / Apache Airflow Specialist

Level: Senior

Technologies: Apache Airflow, Python, Flask, Elasticsearch, Unix/Linux, Oracle, PostgreSQL, GitLab

Workload: 1100 hrs/year

Location: Remote — work from anywhere

English level: Advanced

Education: Technical degree

Role Description

Intetics is looking for an experienced Data Engineer / Apache Airflow Specialist to join our distributed team for a data-driven project focused on large-scale ETL workflows, data indexing, and performance optimization.

The specialist will design, implement, and optimize data pipelines using Apache Airflow, manage database performance on Oracle and PostgreSQL, and support Elasticsearch integration for efficient data retrieval and search operations.

You will also be responsible for ensuring smooth deployment pipelines in GitLab and will collaborate with a cross-functional engineering team to enhance the quality and reliability of complex data processes.

Technical Responsibilities and Skills

What You’ll Do:

  • Develop, orchestrate, and maintain complex Apache Airflow DAGs for ETL and data-processing pipelines.
  • Build and optimize Python-based ETL scripts, integrating with Flask APIs when needed.
  • Design and manage Elasticsearch indexing and performance tuning workflows.
  • Handle Unix/Linux scripting and operations for automation and monitoring.
  • Work with Oracle and PostgreSQL databases for large-scale data processing.
  • Implement and maintain GitLab CI/CD pipelines for build, test, and deploy stages.
  • Collaborate with the project team to ensure scalability, reliability, and quality of data solutions.

What We’re Looking For:

  • ≥ 3 years of Apache Airflow DAG orchestration.
  • ≥ 5 years of Python (ETL focus), with Flask API experience as a plus.
  • ≥ 3 years of Elasticsearch (data indexing & optimization).
  • ≥ 3 years of Unix/Linux scripting & operations.
  • ≥ 3 years with Oracle or PostgreSQL (ideally both).
  • ≥ 3 years of GitLab pipelines (build/test/deploy).
  • Advanced English and a technical degree.

Nice-to-Have / Bonus:

  • Experience with Great Expectations or similar data-quality tools.
  • Experience running Airflow on Kubernetes.
  • Proven performance tuning experience.