
Senior Data Engineer - Airflow and PySpark

💰 $160k-$180k
🇺🇸 United States - Remote
📊 Data 🟣 Senior

Job Description

This role is based in Auburn Hills, Michigan, and follows a hybrid work model, requiring in-office presence three days per week.

Please note: Due to the in-office requirement, we will only be considering candidates who are local to the Metro Detroit area at this time.

Detroit Labs was founded in 2011 with a vision for building digital products, services, and the teams that power them. We create digital solutions that transform the way our clients do business. We build genuine relationships based on respect, trust, and results. We foster a diverse and inclusive culture that values people - providing them with the tools, resources, and support they need to thrive professionally, exceed client expectations, and be themselves at work. We work with a variety of client teams, ranging from startups to Fortune 500 companies, so there are always new and exciting projects going on.

Detroit Labs is looking for an experienced Data Engineer with expertise in Airflow and PySpark to join an exciting project with an industry-leading automotive client. As a Senior Data Engineer & Technical Lead, you will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.

Our Application

At Detroit Labs, a member of our team reads every application (including yours), reviewing your resume along with your responses to the application questions.

To help us get to know you better, we encourage you to answer these questions genuinely and honestly. We value each applicant and want to learn about the real you. Be yourself in your responses, and our team will look forward to understanding what you can bring to this role!

Requirements

  • 7+ years of Data Engineering experience building production-grade data pipelines using Python and PySpark

  • Experience designing, deploying, and managing Airflow DAGs in enterprise environments
  • Experience maintaining CI/CD pipelines for data engineering workflows, including automated testing and deployment
  • Experience with cloud workflows and containerization, using Docker and cloud platforms (GCP preferred) for data engineering workloads
  • Knowledge of, and ability to follow, twelve-factor design principles
  • Experience and ability to write object-oriented Python code, manage dependencies, and follow industry best practices
  • Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows)
  • Experience working with command lines in Unix/Linux-like environments
  • Solid understanding of SQL for data ingestion and analysis
  • Engineering mindset. Able to write code with an eye for maintainability and testability
  • Collaborative mindset. Comfortable with code reviews, paired programming, and using remote collaboration tools effectively
  • Detroit Labs is not currently able to hire candidates who will reside outside of the United States during their term of employment
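As a loose illustration of the twelve-factor and object-oriented Python expectations above, here is a minimal, stdlib-only sketch of environment-driven pipeline configuration. The class name, environment-variable names, and example value (`PipelineConfig`, `PIPELINE_SOURCE_URI`, `PIPELINE_BATCH_SIZE`, the `gs://` URI) are hypothetical and not part of any Detroit Labs or client codebase:

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class PipelineConfig:
    """Twelve-factor style: all deploy-specific settings come from the
    environment, so the same artifact runs unchanged in dev and prod."""

    source_uri: str
    batch_size: int

    @classmethod
    def from_env(cls) -> "PipelineConfig":
        # Fail fast (KeyError) if a required setting is missing; apply a
        # default only where one is genuinely safe.
        return cls(
            source_uri=os.environ["PIPELINE_SOURCE_URI"],
            batch_size=int(os.environ.get("PIPELINE_BATCH_SIZE", "500")),
        )


# Hypothetical values, as a deployment environment would supply them:
os.environ["PIPELINE_SOURCE_URI"] = "gs://example-bucket/raw/"
cfg = PipelineConfig.from_env()
print(cfg.source_uri, cfg.batch_size)
```

Keeping configuration out of the code this way is what lets one container image move between environments without rebuilds, which is the point of the Docker/CI-CD bullets above.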

Responsibilities

  • Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads

  • Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability

  • Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions (GCP preferred)

  • Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines

  • Implement secure coding best practices and design patterns throughout the development lifecycle

  • Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions

  • Create and maintain technical documentation, including process/data flow diagrams and system design artifacts

  • Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices

  • Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks

  • Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage

Compensation & Benefits

  • The salary range for this role is $160,000 - $180,000, commensurate with experience

  • Full medical, dental, vision benefits

  • 401K contribution options

  • Quarterly outings and events

  • Paid holidays and vacation time

  • Parental leave program

  • Monthly budgets for "team fun" bonding events

  • Free lunch for various company meetings and Lunch & Learns

  • Access to our mentorship program and employee resource groups (ERGs)

  • Volunteer opportunities

  • All-company remote-friendly activities

  • Plenty of Detroit Labs swag

Please let Detroit Labs know you found this job on Remote First Jobs 🙏
