Job description
About the Role
We are seeking a true Snowflake data engineering expert who will hit the ground running. This role will be critical to the success of the Dynatron SaaS and DaaS product portfolio. Your focus will be ensuring our data pipelines, data warehouse, and overall data quality are rock-solid and support the company's strategic vision around data.
To be successful in this role, you will:
- Bring BOTH leadership and hands-on experience.
- Act as a thought leader in evaluating and recommending best practices, and be hands-on in implementing them.
- Provide technical leadership and mentorship to a team of data engineers, fostering a culture of collaboration, innovation, and continuous learning.
- Challenge the current architecture based on your prior experience and justify your recommendations; don't just accept the status quo.
- Become an expert on all aspects of Dynatron data.
- Bring extensive experience designing, implementing, and automating best-of-breed data pipelines using tools such as Snowflake, AWS, Python, Airbyte/Fivetran, Airflow, and stored procedures.
- Bring extensive experience designing, modeling, and implementing data warehouses on the Snowflake platform; expert-level Snowflake experience is a MUST.
- Bring extensive experience with approaches that ensure data quality throughout the data pipelines and the data warehouse, and with addressing data quality issues in the current architecture.
- Collaborate closely with cross-functional teams, including product owners, software engineers, data scientists, and business stakeholders, to understand data requirements and deliver solutions that meet evolving business needs.
- Lead by example, demonstrating a commitment to excellence, integrity, and professionalism in all aspects of your work.
Must Have (non-negotiable) Skills:
- 5+ years of expert-level Snowflake design and implementation experience.
- 5+ years of experience designing and implementing data warehouses on other technologies such as Databricks and Redshift.
- 5+ years of AWS experience, including S3 and managed database services (Aurora, DynamoDB, RDS).
- 5+ years of hands-on Python development experience.
- 10 years of experience working with MySQL or other SQL-based relational databases.
- 10 years of experience with CI/CD pipelines and version control systems (e.g., Bitbucket) for managing codebases and deployments.
In Return for Your Expertise, You Will Receive:
Excellent benefits, including health, dental, vision, and short-term disability insurance, stock options, work-from-home and flexible scheduling depending on job requirements, professional development opportunities, 11 paid holidays, and 15 days of PTO. Home office setup support for remote employees. The chance to work for an organization that puts people first and fosters a culture of teamwork by embracing our five core values:
- Success Driven – We strive for excellence with continuous improvement and grit.
- Delivering Results – We deliver high-quality work, and we don't confuse effort with results.
- Sense of Urgency – We know our priorities and take decisive action.
- Accountability – We take extreme ownership and deal with the consequences of our actions.
- Positive Attitude – We have a positive mindset, and we enjoy what we do.
Total Compensation: $170,000
Dynatron Software is an Equal Opportunity Employer and encourages all qualified individuals to apply.