Platform Data Engineer (DataOps / Orchestration)

πŸ‡¦πŸ‡· Argentina - Remote
πŸ“Š Data · πŸ”΅ Mid-level

Job description

Azumo is currently looking for a senior Platform Data Engineer to support and stabilize a modern cloud data platform during a period of growth and integration. This position is FULLY REMOTE, based in Latin America.

This role sits at the intersection of data engineering and data operations (DataOps). Your primary responsibility will be to ensure that the frameworks used to build, deploy, and operate data pipelines are scalable, reliable, and maintainable, while reducing operational burden on the internal data team. This is a platform-centric role focused on orchestration and reliability rather than a pure BI or SQL-only data engineering position.

At Azumo, we strive for excellence and strongly believe in professional and personal growth. We want each individual to be successful and pledge to help each person achieve their goals while at Azumo and beyond. Challenging ourselves and learning new technologies is at the core of what we do.

Based in San Francisco, California, Azumo is an innovative software development firm helping organizations make insightful decisions using the latest technologies in data, cloud and mobility. We combine expertise in strategy, data science, application development and design to drive digital transformation initiatives for companies of all sizes.

If you are qualified for the opportunity and looking for a challenge, please apply online at https://azumo.workable.com or connect with us at [email protected].

The Platform Data Engineer (DataOps) will be based remotely.

Basic Qualifications:

  • Senior/Staff-level experience as a Platform Data Engineer or DataOps Engineer.
  • Strong Python experience for orchestration and framework-level work.
  • Hands-on production experience with Prefect (strongly preferred) or Airflow.
  • Proven experience with dbt in a production data warehouse environment and Snowflake.
  • Experience with AWS (data workloads, scheduled or batch processing).
  • Proficiency with GitLab CI/CD and Terraform for infrastructure-as-code.
  • Experience operating and maintaining production data pipelines at scale.

Preferred Qualifications:

  • Experience with data observability tooling (e.g., Monte Carlo or similar).
  • Exposure to MLOps or ML pipeline orchestration.
  • Experience joining an existing data platform mid-stream to stabilize and harden it.
  • Prior experience in time-boxed platform enablement or advisory engagements.

Responsibilities:

  • Own and improve data orchestration patterns (scheduling, retries, backfills).

  • Strengthen deployment and operational patterns to reduce “firefighting” for internal teams.

  • Ensure data pipelines and models are production-grade and operationally sound.

  • Use Terraform to support infrastructure-as-code for data platform components.

  • Document patterns and decisions to ensure internal teams can operate independently.

Benefits:

  • Paid time off (PTO)

  • U.S. Holidays

  • Training

  • Udemy free Premium access

  • Mentored career development

  • Free English Courses

  • Profit Sharing

  • Remuneration in USD

Please let Azumo know you found this job on Remote First Jobs πŸ™
