Job description
Remote | LATAM
About Us
AccelOne provides custom software development and design services for various companies across the US and Latin America. We are built on core principles of transparency, communication, and accountability, and we aim to deliver exceptional solutions for our clients.
About Our Project
Our client is a U.S.-based industry leader in digital experience personalization. Their platform enables global retail and e-commerce brands to deliver real-time, data-driven experiences across digital channels. With a strong focus on optimization and customer engagement, they serve a wide range of enterprise clients and operate at scale across international markets. Their solutions leverage advanced analytics and decisioning engines to help companies drive performance and personalization.
Role Summary
As a Data Engineer, you will be responsible for designing, building, and maintaining real-time and batch data pipelines. You will work with cross-functional teams to develop data products and ensure reliable, scalable, and high-performance data infrastructure.
This is a great opportunity for someone who enjoys working in a fast-paced environment and is passionate about building efficient and maintainable data systems using modern cloud technologies.
Responsibilities
Design, build, and maintain scalable real-time and batch data pipelines using Kafka, AWS, and Python.
Develop event-driven data workflows leveraging Kafka topics, Lambda functions, and downstream sinks (e.g., Snowflake, S3).
Work with cross-functional teams to translate business requirements into reliable data products.
Build and maintain robust ETL processes across Snowflake and MySQL for reporting and analytics.
Ensure high-quality data delivery by writing modular, well-tested Python code.
Monitor data pipelines using tools such as CloudWatch, Grafana, or custom metrics.
Contribute to technical design, architectural reviews, and infrastructure improvements.
Maintain data integrity and scalability in a high-volume data environment.
Continuously explore and implement best practices in data engineering, streaming, and analytics infrastructure.
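To make the event-driven flow above concrete, here is a minimal sketch of a Lambda-style handler that consumes a batch of Kafka records (as delivered by an MSK event source mapping, where each record value is base64-encoded) and normalizes them into rows for a downstream sink such as an S3 stage feeding Snowflake. All field names, topic names, and the output shape are illustrative assumptions, not part of the client's actual schema.

```python
import base64
import json

def transform_record(raw: dict) -> dict:
    """Normalize one raw event into the row shape expected downstream.

    The input/output field names here are hypothetical examples.
    """
    return {
        "user_id": raw["userId"],
        "event_type": raw["type"],
        "ts": raw["timestamp"],
    }

def handler(event: dict, context=None) -> list:
    """Lambda-style entry point: MSK batch event in, sink-ready rows out.

    MSK events group records under event["records"] keyed by
    "topic-partition", with each record's value base64-encoded.
    """
    rows = []
    for records in event.get("records", {}).values():
        for rec in records:
            payload = json.loads(base64.b64decode(rec["value"]))
            rows.append(transform_record(payload))
    return rows
```

In a real pipeline the returned rows would be written to the sink (e.g., batched to S3 for Snowpipe ingestion) rather than returned, and the transform would be unit-tested in isolation, which is the kind of modular, well-tested Python code this role calls for.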
Qualifications
Basic Requirements:
4+ years of experience as a data engineer in a production environment.
Fluency in Python, with strong experience in a Linux environment.
Strong SQL skills, with hands-on experience in Snowflake and MySQL.
Solid understanding of data modeling, schema design, and performance optimization.
Proficiency with Python data libraries: NumPy, Pandas, SciPy.
Experience building and maintaining real-time data pipelines using Kafka (Apache Kafka or AWS MSK).
Familiarity with AWS services including Lambda, Kinesis, S3, and CloudWatch.
Experience working with event-driven architectures and streaming data patterns.
Knowledge of RESTful APIs, data integration, and version control workflows (e.g., Git).
Comfortable in Agile development environments with strong testing and code review practices.
What We Offer
Remote Work: Enjoy flexibility and a competitive compensation package.
Professional Growth: Access to career development opportunities, training, and certifications.
Inclusive Environment: We foster a people-first culture where everyone can thrive professionally and personally.
At AccelOne, we value our team and prioritize a supportive, balanced work environment. Join us in delivering top-tier solutions to our clients while advancing your career in a rewarding setting.