Job description
SMASH: Who we are
We are agents for tech professionals in Costa Rica and Colombia, helping them build careers in the United States.
We believe in long-lasting relationships with our talent. We invest time getting to know them and understanding what they are looking for in their next professional step.
We aim to find the perfect match. As agents, we pair our talent with our US clients not only by technical skills but also by cultural fit. Our core competency is finding the right talent fast.
We purposefully move away from the “contractor” or “outsourcing” type of relationship. Our clients don’t want contractors or “just a service.” Neither does our talent.
Our Benefits
Wellness Coverage
Remote Work
Birthday day off
Recognition and rewards system
Referral program
Business skill coaching
English classes for Smashers and relatives
Learning opportunities
This is a remote position working with a US company; you must have Costa Rican citizenship or a work permit to apply for this role.
Role summary
You will design, build, and maintain modern data pipelines and analytics platforms using Snowflake and cloud-native tooling. This role focuses on scalable data transformations, reliable orchestration, and enabling analytics-ready datasets for US-based stakeholders.
Responsibilities
Design, build, and maintain scalable data pipelines and transformations.
Develop and manage analytics workflows using DBT.
Orchestrate data pipelines using Apache Airflow (see the sketch after this list).
Build and optimize data models within Snowflake for analytics and reporting.
Write clean, efficient Python code for data processing and automation.
Collaborate with analytics, product, and engineering teams to define data requirements.
Monitor pipeline reliability, performance, and data quality.
Implement best practices for version control, testing, and documentation.
Support continuous improvement of the data platform and tooling.
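Purely as an illustration of how these tools typically fit together in a stack like this (not part of the role requirements), below is a minimal sketch of an Airflow DAG that runs dbt transformations and tests against Snowflake. It assumes Airflow 2.4+ with the dbt CLI available on the worker; the DAG id, project path, and schedule are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily refresh: run dbt models in Snowflake, then dbt tests.
# The DAG id, schedule, and project path are illustrative only.
with DAG(
    dag_id="daily_analytics_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the analytics models against the Snowflake warehouse
    # configured in the dbt project's profiles.yml.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Run dbt tests to enforce data quality checks on the built models.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test
```

In practice, warehouse credentials and connection details would live in the dbt profile or Airflow connections rather than in the DAG itself.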
Requirements – Must-haves
Proven experience as a Data Engineer or similar role.
Hands-on experience with Snowflake as a cloud data warehouse.
Strong experience using DBT for data modeling and transformations.
Proficiency in Python for data engineering tasks.
Experience orchestrating pipelines with Apache Airflow.
Strong SQL skills for data transformation and optimization.
Understanding of data warehousing concepts and best practices.
Ability to work independently in a remote, fast-paced environment.
Nice-to-haves (optional)
Experience with cloud platforms (AWS, GCP, or Azure).
Familiarity with data quality or observability tools.
Experience supporting BI tools such as Power BI or Looker.
Exposure to streaming or real-time data pipelines.
Languages
English B2+, Spanish C1
