Summary
Azumo is seeking a highly motivated Big Data Engineer to develop and enhance data infrastructure using Spark, Kafka, Snowflake, or similar technologies. The position offers remote work and competitive compensation based on experience and potential. Azumo values professional growth, community involvement, and open-source initiatives.
Requirements
- BS or Master’s degree in Computer Science or a related field, or equivalent experience
- 5+ years of experience with data-related and data management responsibilities
- Deep expertise in designing and building data warehouses and big data analytics systems
- Practical experience manipulating, analyzing and visualizing data
- Self-driven and motivated, with a strong work ethic and a passion for problem solving
Responsibilities
- Collaborate with a growing team of bright engineering minds in big data computing
- Design and develop scalable, high-performance big data infrastructure using Spark, Kafka, Snowflake, or similar technologies
Preferred Qualifications
- Experience with cloud-based managed services such as Airflow, Glue, the Elastic Stack, Amazon Redshift, Snowflake, BigQuery, Azure SQL Database, EMR, Azure, Databricks, Altiscale, or Qubole
- Prior experience with notebooks using Jupyter, Google Colab, or similar tools
Benefits
- Paid time off (PTO)
- U.S. holidays
- Training
- Free Udemy Premium access
- Mentored career development
- Free English courses
- Profit sharing
- Remuneration in U.S. dollars