Data Operations Engineer Intern

🇺🇸 United States - Remote
📊 Data Intern

Job description

About Us

Abacus Insights is changing the way healthcare works for you. We’re on a mission to unlock the power of data so health plans can enable the right care at the right time—making life better for millions of people. No more data silos, no more inefficiencies. Just smarter care, lower costs, and better experiences.

Backed by $100M from top VCs, we’re tackling big challenges in an industry that’s ready for change. And while GenAI is still new for many, we’ve already mastered turning complex healthcare data into clear, actionable insights. That’s our superpower—and it’s why we’re leading the way.

At Abacus, innovation starts with people. We’re bold, curious, and collaborative—because the best ideas come from working together. Ready to make an impact? Join us and let’s build the future, together.

About the Role

We are seeking a Data Operations Engineer Intern to join our TechOps organization within the Connector Factory team. This role provides hands-on experience supporting production-grade data pipelines responsible for ingesting, transforming, validating, and delivering healthcare data from numerous external sources. As an intern, you will work closely with senior data, platform, and operations engineers to monitor pipeline health, debug data and system issues, automate operational workflows, and improve data reliability. You will gain exposure to both batch and streaming architectures and learn how modern data platforms are deployed and operated using AWS services such as Lambda, EMR, and EKS, alongside Databricks.

Your day-to-day:

  • Monitor production data pipelines and systems, identifying failures, latency issues, schema changes, and data quality anomalies.
  • Debug pipeline failures by analyzing logs, metrics, SQL outputs, and upstream/downstream dependencies.
  • Assist in root cause analysis (RCA) for data incidents and contribute to implementing corrective and preventive solutions.
  • Support the maintenance and optimization of ETL/ELT workflows to improve reliability, scalability, and performance.
  • Automate recurring data operations tasks using Python, shell scripting, or similar tools to reduce manual intervention (a minimal illustrative sketch follows this list).
  • Assist with data mapping, transformation, and normalization efforts, including alignment with Master Data Management (MDM) systems.
  • Collaborate on the generation and validation of synthetic test datasets for pipeline testing and data quality validation.
  • Shadow senior engineers to deploy, monitor, and troubleshoot data workflows on AWS, Databricks, and Kubernetes-based environments.
  • Ensure data integrity and consistency across multiple environments (development, staging, production).
  • Clearly document bugs, data issues, and operational incidents in Jira and Confluence, including reproduction steps, impact analysis, and resolution details.
  • Communicate effectively with cross-functional, onsite, and offshore teams to escalate issues, provide status updates, and track resolutions.
  • Participate in Agile ceremonies and follow structured incident and change management processes.
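
To give a concrete flavor of the automation work described above, here is a minimal sketch, assuming a hypothetical S3 ingestion layout, of a freshness check that flags source prefixes with no recent files. The bucket name, prefixes, and threshold are illustrative only and are not Abacus systems.

```python
# Hypothetical sketch: flag S3 ingestion prefixes with no new files in the
# last N hours. Bucket, prefixes, and threshold are illustrative only.
from datetime import datetime, timedelta, timezone

import boto3

BUCKET = "example-ingest-bucket"          # hypothetical bucket
PREFIXES = ["claims/", "eligibility/"]    # hypothetical source folders
MAX_AGE = timedelta(hours=6)              # alert if nothing newer than this


def latest_object_age(s3, bucket: str, prefix: str) -> timedelta | None:
    """Return the age of the newest object under a prefix, or None if empty."""
    newest = None
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            if newest is None or obj["LastModified"] > newest:
                newest = obj["LastModified"]
    if newest is None:
        return None
    return datetime.now(timezone.utc) - newest


def main() -> None:
    s3 = boto3.client("s3")
    for prefix in PREFIXES:
        age = latest_object_age(s3, BUCKET, prefix)
        if age is None:
            print(f"ALERT: no objects found under {prefix}")
        elif age > MAX_AGE:
            print(f"ALERT: {prefix} is stale; newest object is {age} old")
        else:
            print(f"OK: {prefix} is fresh ({age} old)")


if __name__ == "__main__":
    main()
```

In practice, a check like this would run on a schedule and route alerts to the team’s monitoring channel rather than printing to stdout.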

What you bring to the team:

  • Strong interest in data engineering, data operations, and production data systems.
  • Currently pursuing or recently completed a Master’s degree in Computer Science, Data Science, Engineering, Statistics, or a related quantitative discipline.
  • Solid understanding of ETL/ELT architectures, including ingestion, transformation, validation, orchestration, and error handling.
  • Proficiency in SQL, including complex joins, aggregations, window functions, and debugging data discrepancies at scale.
  • Working knowledge of Python for data processing, automation, and operational tooling.
  • Familiarity with workflow orchestration tools such as Apache Airflow, including DAG design, scheduling, retries, and dependency management (see the sketch after this list).
  • Experience with, or exposure to, data integration platforms such as Airbyte, including connector-based ingestion, schema evolution, and sync monitoring.
  • Understanding of Master Data Management (MDM) concepts and tools, with exposure to platforms such as Rhapsody, Onyx, or other enterprise MDM solutions.
  • Knowledge of data pipeline observability, including log analysis, metrics, alerting, and debugging failed jobs.
  • Exposure to cloud platforms (preferably AWS), with familiarity in services such as S3, Lambda, EMR, EKS, or managed data processing services.
  • Ability to communicate technical issues clearly and concisely, including writing actionable bug reports and collaborating on incident resolution.
  • Strong documentation habits and attention to detail in operational workflows.
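
As a purely illustrative sketch of the Airflow concepts referenced above (DAG design, scheduling, retries, dependency management), the following Airflow 2.x-style example uses hypothetical DAG and task names and is not drawn from any Abacus pipeline.

```python
# Hypothetical sketch of a small Airflow DAG: ingest -> validate -> publish,
# with a daily schedule and automatic retries. Task logic is stubbed out.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

default_args = {
    "retries": 2,                          # rerun a failed task up to twice
    "retry_delay": timedelta(minutes=10),  # wait between retries
}

with DAG(
    dag_id="example_member_ingest",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                     # parameter name per Airflow 2.4+
    catchup=False,
    default_args=default_args,
) as dag:

    def ingest():
        print("pull files from the source system")

    def validate():
        print("run row counts and schema checks")

    def publish():
        print("load validated data to the warehouse")

    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    publish_task = PythonOperator(task_id="publish", python_callable=publish)

    # Dependency management: validate only after ingest, publish only after validate.
    ingest_task >> validate_task >> publish_task
```

The `>>` chain is the dependency-management piece: each task runs only after the previous one succeeds, while `default_args` applies the retry policy uniformly across tasks.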

What we would like to see (not required):

  • Experience with cloud data warehouses such as Snowflake or BigQuery.
  • Familiarity with Databricks, Apache Spark, or distributed data processing frameworks.
  • Hands-on experience building automation for data operations or reliability engineering.
  • Exposure to healthcare data standards, regulated data environments, or HIPAA-compliant systems.

Compensation for this role is based on experience, skills, and location, and includes base salary plus eligibility for performance bonuses and equity grants.

What you’ll get in return:

  • Unlimited paid time off – recharge when you need it
  • Work from anywhere – flexibility to fit your life
  • Comprehensive health coverage – multiple plan options to choose from
  • Equity for every employee – share in our success
  • Growth-focused environment – your development matters here
  • Home office setup allowance – one-time support to get you started
  • Monthly cell phone allowance – stay connected with ease

Our Commitment as an Equal Opportunity Employer

As a mission-led technology company helping to drive better healthcare outcomes, Abacus Insights believes that the best innovation and value we can bring to our customers comes from diverse ideas, thoughts, experiences, and perspectives. Therefore, we dedicate resources to building diverse teams and providing equal employment opportunities to all applicants. Abacus prohibits discrimination and harassment based on race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.

At the heart of who we are is a commitment to continuously and intentionally building an inclusive culture—one that empowers every team member across the globe to do their best work and bring their authentic selves. We carry that same commitment into our hiring process, aiming to create an interview experience where you feel comfortable and confident showcasing your strengths. If there’s anything we can do to support that—big or small—please let us know.
