Data Engineer

🇧🇷 Brazil - Remote
📊 Data · 🔵 Mid-level

Job description

This role focuses on building and maintaining robust, scalable, and automated data pipelines, and plays a key part in optimizing our data infrastructure and enabling efficient data delivery across the organization. As the organization enhances its cloud data platform (Snowflake or a similar platform), this role will be instrumental in implementing and managing CI/CD processes, infrastructure as code (Terraform), and data transformation workflows (dbt).

Job Responsibilities:

  • Design, build, and maintain scalable and resilient CI/CD pipelines for data applications and infrastructure, with a focus on Snowflake, dbt, and related data tools.

  • Implement and manage Snowflake dbt projects for data transformation, including developing dbt models, tests, and documentation, and integrating dbt into CI/CD workflows.

  • Develop and manage infrastructure as code (IaC) using Terraform to provision and configure cloud resources for data storage, processing, and analytics on GCP.

  • Automate the deployment, monitoring, and management of Snowflake data warehouse environments, ensuring optimal performance, security, and cost-effectiveness.

  • Collaborate with data engineers and data scientists to understand their requirements and provide robust, automated solutions for data ingestion, processing, and delivery.

  • Implement and manage monitoring, logging, and alerting systems for data pipelines and infrastructure to ensure high availability and proactive issue resolution.

  • Develop and maintain robust automation scripts and tools, primarily using Python, to streamline operational tasks, manage data pipelines, and improve efficiency; Bash scripting for system-level tasks is also required. (An illustrative Python sketch of this kind of automation follows this list.)

  • Ensure security best practices are implemented and maintained across the data infrastructure and pipelines.

  • Troubleshoot and resolve issues related to data infrastructure, pipelines, and deployments in a timely manner.

  • Participate in code reviews for infrastructure code, dbt models, and automation scripts.

  • Document system architectures, configurations, and operational procedures.

  • Stay current with emerging DevOps technologies, data engineering tools, and cloud best practices, particularly related to Snowflake, dbt, and Terraform.

  • Optimize data pipelines for performance, scalability, and cost.

  • Support and contribute to data governance and data quality initiatives from an operational perspective.

  • Help implement AI features.
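
For illustration only, here is a minimal sketch of the kind of Python automation described above: a script that flags Snowflake warehouses whose credit usage over the past week exceeds a budget. It assumes the snowflake-connector-python package, ACCOUNT_USAGE access, and credentials supplied via environment variables; the threshold name and value are hypothetical, not part of this role's actual tooling.

```python
"""Illustrative sketch only: flag Snowflake warehouses whose weekly credit
consumption exceeds a (hypothetical) threshold. Assumes the
snowflake-connector-python package and ACCOUNT_USAGE privileges."""
import os
import sys

import snowflake.connector  # pip install snowflake-connector-python

CREDIT_ALERT_THRESHOLD = 100.0  # hypothetical weekly credit budget per warehouse


def weekly_credit_usage():
    # Credentials come from the environment; in a real pipeline they would
    # typically be injected by the CI/CD system or a secrets manager.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role=os.environ.get("SNOWFLAKE_ROLE", "ACCOUNTADMIN"),
    )
    try:
        cur = conn.cursor()
        # Sum credits per warehouse over the last 7 days from ACCOUNT_USAGE.
        cur.execute(
            """
            SELECT warehouse_name, SUM(credits_used) AS credits
            FROM snowflake.account_usage.warehouse_metering_history
            WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
            GROUP BY warehouse_name
            ORDER BY credits DESC
            """
        )
        return cur.fetchall()
    finally:
        conn.close()


def main():
    over_budget = [
        (name, credits)
        for name, credits in weekly_credit_usage()
        if credits > CREDIT_ALERT_THRESHOLD
    ]
    for name, credits in over_budget:
        print(f"WARNING: {name} used {credits:.1f} credits in the last 7 days")
    # A non-zero exit code lets a scheduler or CI job surface the alert.
    sys.exit(1 if over_budget else 0)


if __name__ == "__main__":
    main()
```

In practice a script like this would run on a schedule (for example as a CI/CD or orchestrator job) and push alerts to a monitoring or chat system rather than printing to stdout.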

Job Requirements:

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.

  • 5+ years of hands-on experience in a DevOps, SRE, or infrastructure engineering role.

  • 3+ years of experience specifically focused on automating and managing data infrastructure and pipelines.

  • 1+ years of experience enabling AI features.

Others:

  • Strong, demonstrable experience with Infrastructure as Code tools, particularly Terraform.

  • Strong background in DevOps principles and practices, and hands-on experience in building business intelligence solutions.

  • Highly experienced in automation, with strong problem-solving skills and proficiency in cloud technologies.

  • Ability to collaborate effectively with data engineers, analysts, and other stakeholders to ensure the reliability and performance of our data ecosystem.

  • Proven experience with dbt for data transformation, including developing models, tests, and managing dbt projects in a production environment.

  • Hands-on experience managing and optimizing Snowflake data warehouse environments.

  • Demonstrable experience with data modeling techniques for ODS, dimensional modeling (Facts, Dimensions), and semantic models for analytics and BI.

  • Strong proficiency in Python for automation, scripting, and data-related tasks; experience with relevant Python libraries is a plus. Strong Bash scripting skills are also required.

  • Solid understanding of CI/CD principles and tools (e.g., Bitbucket Runners, Jenkins, GitLab CI, GitHub Actions, Azure DevOps). (A short Python sketch of a CI gating step for dbt appears after this list.)

  • Experience with cloud platforms (GCP preferred, AWS, or Azure) and their data services.

  • Experience with containerization technologies (e.g., Docker, Kubernetes) is a plus.

  • Knowledge of data integration tools and ETL/ELT concepts.

  • Familiarity with monitoring and logging tools.

  • Strong SQL skills.

  • Ability to work independently and as part of a collaborative team in an agile environment.

  • Strong communication skills, with the ability to explain complex technical concepts clearly.
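
As an illustrative sketch only (not this team's actual setup), the following Python wrapper shows one common way to gate a CI pipeline on dbt results: run `dbt build`, then parse dbt's standard target/run_results.json artifact and report any failed models or tests. It assumes dbt-core is installed on the runner and that a profiles.yml with warehouse credentials is available; the file paths shown are dbt defaults.

```python
"""Illustrative sketch only: run `dbt build` in CI and fail the pipeline
if any model or test errors. Assumes dbt-core and a valid profiles.yml."""
import json
import pathlib
import subprocess
import sys


def run_dbt_build(project_dir: str = ".") -> int:
    # `dbt build` runs models, tests, snapshots, and seeds in DAG order.
    return subprocess.call(["dbt", "build", "--project-dir", project_dir])


def failed_nodes(project_dir: str = ".") -> list[str]:
    # dbt writes execution results to target/run_results.json by default.
    results_path = pathlib.Path(project_dir) / "target" / "run_results.json"
    if not results_path.exists():
        return []
    results = json.loads(results_path.read_text())
    return [
        r["unique_id"]
        for r in results.get("results", [])
        if r.get("status") in {"error", "fail"}
    ]


if __name__ == "__main__":
    exit_code = run_dbt_build()
    for node in failed_nodes():
        print(f"FAILED: {node}")
    sys.exit(exit_code)
```

A CI system such as GitHub Actions or GitLab CI would invoke a script like this as a job step; the non-zero exit code from dbt fails the pipeline and blocks the deploy.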

Benefits:

  • Fully remote with office optional. You decide when you would like to work from home and when from the office.

  • Flexible timings. You decide your own work schedule.

  • Market-competitive compensation (in $$).

  • Insane learning and growth.
