GCP Data Engineer

🇺🇸 United States - Remote
💻 Software Development 🔵 Mid-level

Job description

Emids is a leading provider of digital transformation solutions to the healthcare industry, serving payers, providers, life sciences, and technology firms. Headquartered in Nashville, Emids helps bridge critical gaps in accessible, affordable, and high-quality healthcare by delivering digital transformation services, custom application development, data engineering, business intelligence solutions, and specialized consulting to all parts of the healthcare ecosystem. With nearly 2,500 professionals globally, Emids leverages strong domain expertise in healthcare-specific platforms, regulations, and standards to provide tailored, cutting-edge solutions and services to its clients.

Job Title: GCP Data Engineer

Location: Dallas, TX (on-site; only local candidates preferred)

Overview:

We are seeking a GCP Data Engineer to design, build, and maintain large-scale data infrastructure and processing systems. The role involves creating scalable solutions to support data-driven applications, analytics, and business intelligence. A key responsibility is migrating historical data from Teradata to the new GCP system and developing data pipelines to streamline data loading into BigQuery.
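
For illustration, here is a minimal sketch of the kind of BigQuery load step such a pipeline might contain, using the google-cloud-bigquery Python client. The project, dataset, table, and bucket names are placeholders, not details from this posting.

```python
from google.cloud import bigquery

# Hypothetical identifiers -- replace with real project/dataset/table names.
TABLE_ID = "my-project.claims_dw.member_claims"
SOURCE_URI = "gs://my-bucket/teradata-extracts/member_claims_*.csv"

client = bigquery.Client()

# Truncate-and-reload a staging table from CSV extracts landed in Cloud Storage.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # each extract carries a header row
    autodetect=True,      # infer the schema; pin an explicit schema in production
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=job_config)
load_job.result()  # block until the load job finishes (raises on failure)

print(f"Loaded {client.get_table(TABLE_ID).num_rows} rows into {TABLE_ID}")
```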

Key Responsibilities:

  • Integrate data from diverse sources, including databases, APIs, and streaming platforms.
  • Optimize data processing and query performance by fine-tuning data pipelines, database configurations, and partitioning strategies (see the partitioning sketch after this list).
  • Implement and monitor data quality checks and validations to ensure reliable data for analytics and applications.
  • Apply security measures to safeguard sensitive data, coordinating with security teams to ensure encryption, access controls, and regulatory compliance.
  • Collaborate with cross-functional teams, including data scientists, analysts, software engineers, and business stakeholders.
  • Design and develop data infrastructure components such as data warehouses, data lakes, and data pipelines.
  • Establish and maintain auditing, monitoring, and alerting mechanisms to ensure data governance and system performance.
  • Explore and implement new frameworks, platforms, or cloud services to enhance data processing capabilities.
  • Utilize DevSecOps practices to incorporate security throughout the development lifecycle.
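
As a concrete illustration of the partitioning strategies mentioned above, here is a hedged sketch that creates a date-partitioned, clustered BigQuery table via standard SQL DDL. The table and column names are invented for the example; queries that filter on the partition and clustering columns scan far fewer bytes.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical table: partition by event date and cluster by a common filter
# column so that typical date- and member-scoped queries prune most of the data.
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.claims_dw.claim_events` (
  claim_id  STRING,
  member_id STRING,
  event_ts  TIMESTAMP,
  amount    NUMERIC
)
PARTITION BY DATE(event_ts)
CLUSTER BY member_id
OPTIONS (partition_expiration_days = 730)  -- optional retention policy
"""

client.query(ddl).result()  # run the DDL statement and wait for completion
```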

Position Summary:

  • Acquire a thorough understanding of enterprise data systems and relevant processes for project delivery.
  • Contribute to project estimation and provide insights to technical leads.
  • Participate in Agile scrum activities, project status meetings, and user story grooming/design discussions.
  • Analyze complex data structures from various sources and design large-scale data engineering pipelines.
  • Develop robust ETL pipelines, design database systems, and build data processing tools (a Cloud Composer sketch follows this list).
  • Perform data engineering tasks including ETL development, testing, and deployment.
  • Collaborate with developers on ETL job/pipeline development and integrate components for automation.
  • Document data engineering processes, workflows, and systems for reference and knowledge sharing.
  • Ensure data accuracy, completeness, and consistency through quality checks and validation processes.
  • Work effectively with team members to deliver business solutions.
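
By way of illustration, here is a minimal Cloud Composer (Airflow 2.4+) DAG sketch for the kind of GCS-to-BigQuery load described above, using the Google provider's GCSToBigQueryOperator. The bucket, dataset, and table names are placeholder assumptions, not details from this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

# Hypothetical daily pipeline: append the previous day's extracts to BigQuery.
with DAG(
    dag_id="teradata_extract_to_bigquery",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_extracts = GCSToBigQueryOperator(
        task_id="load_extracts",
        bucket="my-bucket",  # placeholder Cloud Storage bucket
        source_objects=["teradata-extracts/{{ ds }}/*.csv"],
        destination_project_dataset_table="my-project.claims_dw.member_claims",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_APPEND",
    )
```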

Preferred Qualifications:

  • Experience with GCP services such as BigQuery, Cloud SQL, Cloud Composer/Airflow, Cloud Storage, and Dataflow/Data Fusion, along with Python.
  • Practical experience with Teradata utilities (BTEQ, TPT, FastLoad) and SQL queries.
  • GCP Data Engineer certification is strongly preferred.
  • Proficiency with multiple tools and programming languages for data analysis and manipulation.
  • Strong problem-solving and critical thinking abilities.
  • Effective communication and collaboration skills within and across teams.
  • Knowledge of web technologies such as Flask, Django, JavaScript, HTML, and CSS.
  • Familiarity with BI tools such as MicroStrategy and Tableau.
  • Understanding of software development methodologies including waterfall and Agile.
  • Experience in the healthcare or PBM domain is preferred.

Required Qualifications:

  • 7+ years of experience in building and executing data engineering pipelines.
  • 6+ years of experience with Python.
  • 7+ years of experience with SQL.
  • 7+ years of hands-on experience with bash shell scripts, UNIX utilities, and UNIX commands.
  • 5+ years of experience with GCP, including BigQuery and Cloud SQL.
  • 5+ years of experience with various databases such as Teradata, DB2, Oracle, and SQL Server.
  • Experience in healthcare and PBM systems is preferred.

Here at Emids we’re not scared of differences. It’s how we break new ground. As we scale and elevate the experience of our clients in the healthcare and life sciences space, and ultimately have an impact on every patient from every walk of life, the team we build must be reflective of the diversity that we serve. Together, we’ve built, and will continue to grow, a diverse and inclusive culture where everyone has a seat at the table and the space to be their most authentic self. Emids is an Equal Opportunity Employer, and we support, celebrate, and cherish all the things that make our teammates who they are.
