Senior Data Engineer

Job description

Your New Team

The ApartmentIQ team is revolutionizing how property managers optimize their operations. We provide a powerful platform that centralizes critical property data, offering actionable insights and intuitive tools to enhance efficiency and decision-making. Our product empowers users to streamline everything from leasing and maintenance to resident communication and financial reporting.

Behind the scenes, our data platform runs on a modern AWS-based tech stack designed to support big data architectures and machine learning models at scale. We believe in fostering an inclusive environment built on mutual trust, continuous learning, and a commitment to simplicity, where every engineer can contribute and grow.

The Role

As a Senior Data Engineer, you’ll be the technical backbone of the data layer that powers Daylight — ApartmentIQ’s revenue-management product that delivers real-time rent recommendations to property managers. You’ll design, build, and own the ingestion framework that pulls operational data from a variety of property-management systems, transforms it into analytics-ready models, and serves it to the machine-learning workflows that forecast demand and optimize pricing.

Working hand-in-hand with data scientists, you’ll ensure every byte flowing through Daylight is trustworthy, traceable, and available at the cadence our algorithms require. You’ll architect cloud-native, Terraform-managed infrastructure; implement scalable batch and streaming ETL/ELT jobs in Python; and layer in observability, testing, and data-quality guards so teams can iterate on models with confidence. You’ll also build and own core MLOps components that power model training, inference, and deployment — ensuring our ML systems are reliable, repeatable, and production-ready.

Beyond coding, you’ll collaborate with product managers, backend engineers, and customer-facing teams to translate business requirements—like a new rent rule or occupancy forecast—into performant data solutions. If you thrive on end-to-end ownership, relish tough debugging sessions, and want to see your work directly influence rent recommendations across thousands of units, we’d love to meet you.

Responsibilities

  • Design, build, and maintain scalable MLOps infrastructure to support model training, deployment, monitoring, and continuous integration/continuous delivery (CI/CD) of ML models.
  • Develop and manage robust data pipelines to extract, transform, and load (ETL/ELT) data from a variety of structured and unstructured sources.
  • Collaborate with data scientists and ML engineers to understand model requirements and ensure production readiness of data and model workflows.
  • Debug complex data issues and ML pipeline failures in production, user-facing applications, collaborating closely with data scientists and ML engineers to diagnose root causes in data or algorithm behavior.
  • Design and optimize data storage solutions using modern data warehouses and relational database systems.
  • Codify and manage cloud infrastructure using Infrastructure as Code tools, primarily Terraform, to ensure reproducibility, scalability, and auditability across environments.
  • Implement observability, alerting, and data quality frameworks to ensure pipeline health and uphold data integrity.

Qualifications

  • 5+ years of software engineering experience, including 3+ years working directly with data-intensive systems, pipelines, and infrastructure.
  • A strong sense of ownership, with experience delivering end-to-end systems: from architecture and implementation through CI/CD, observability, and infrastructure management.
  • Runs toward problems: zero tolerance for lingering bugs, a willingness to lean into complex issues, and a habit of proactively investigating and resolving failures.
  • Strong debugging capabilities; seeks root causes, not band-aids — whether it’s a data anomaly, algorithmic quirk, or system failure.
  • Strong Python experience — can write clear, idiomatic code and understands best practices.
  • Comfortable writing SQL queries to analyze relational data.
  • Experience with Terraform or other Infrastructure-as-Code tools for provisioning cloud-based infrastructure (e.g., AWS, GCP).
  • Hands-on experience designing and implementing big data architectures, streaming or batch ETL pipelines, and understanding the trade-offs between complexity, performance, and cost.
  • Experience with data lakes, data warehouses, relational databases, and document stores, and when to use each.
  • A math or CS background and/or experience working with algorithms is preferred.
  • Uses LLMs and AI agents to enhance engineering productivity and explore solution spaces creatively.
  • Operates effectively in fast-paced, startup environments; adapts quickly and communicates clearly.
  • Strong collaborator and communicator who works closely with the team and proactively shares context and decisions.

Bonus Skills and Experience

  • Ruby and/or Ruby on Rails framework.
  • Writing performant code using techniques such as parallelism and concurrency.
  • AWS services: SageMaker, Lambda, Redshift, OpenSearch, Kinesis.
  • Experience with distributed systems.

Why Our Team

  • 100% remote across the U.S., with quarterly in-person gatherings for team offsites
  • Competitive Compensation
  • Flexible vacation and parental leave policies
  • Medical, Dental, and Vision Insurance
  • 100% paid Short-Term Disability, Long-Term Disability, and Life Insurance Program
  • 401k Program
  • A supportive, learning-first culture where you’ll help shape the next generation of AI-driven marketing tools for the apartment rental industry
Please let Rentable know you found this job on Remote First Jobs 🙏
