Senior Data Engineer

at Wavicle Data Solutions
  • Remote - United States


Job description

A BIT ABOUT WAVICLE

Wavicle Data Solutions is a founder-led, high-growth consulting firm helping organizations unlock the full potential of cloud, data, and AI. We’re known for delivering real business results through intelligent transformation—modernizing data platforms, enabling AI-driven decision-making, and accelerating time-to-value across industries.

At the heart of our approach is WIT—the Wavicle Intelligence Framework. WIT brings together our proprietary accelerators, delivery models, and partner expertise into one powerful engine for transformation. It’s how we help clients move faster, reduce costs, and create lasting impact—and it’s where your ideas, skills, and contributions can make a real difference.

Our work is deeply rooted in strong partnerships with AWS, Databricks, Google Cloud, and Azure, enabling us to deliver cutting-edge solutions built on the best technologies the industry has to offer.

With over 500 team members across 42 cities in the U.S., Canada, and India, Wavicle offers a flexible, digitally connected work environment built on collaboration and growth.

We invest in our people through:

  • Competitive compensation and bonuses
  • Unlimited paid time off
  • Health, retirement, and life insurance plans
  • Long-term incentive programs
  • Meaningful work that blends innovation and purpose

If you’re passionate about solving complex problems, exploring what’s next in AI, and being part of a team that values delivery excellence and career development—you’ll feel right at home here.

THE OPPORTUNITY

Wavicle Data Solutions, LLC seeks a Sr. Data Engineer in Oak Brook, IL.

WHAT YOU WILL GET TO DO

  • Create conceptual, logical, and physical data models.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of sources such as Hadoop, Spark, and AWS Lambda.
  • Lead and/or mentor a small team of data engineers.
  • Design, develop, test, deploy, maintain, and improve data integration pipelines.
  • Develop pipeline objects using Apache Spark, PySpark, Python, or Scala.
  • Design and develop data pipeline architectures using Hadoop, Spark, and related AWS services.
  • Load-test and performance-test data pipelines built with the technologies above.
  • Communicate effectively with client leadership and business stakeholders.
  • Participate in proposal and/or SOW development.
  • Utilize big data technologies to architect and build scalable data solutions.
  • Identify gaps and work with the client to resolve them in a timely manner.
  • Stay current with technology trends to make recommendations that match client requirements.
  • Design and execute abstractions and integration patterns (APIs) to solve complex distributed computing problems.
  • Participate in pre-sales activities, including proposal development, RFI/RFP responses, and shaping a solution from the client's business problem.
  • Act as a thought leader in the industry by creating written collateral (white papers or POVs), speaking at events, and creating or participating in internal and external events (lunch 'n learns, online speaking events, etc.).

WHAT YOU BRING TO THE TEAM

  • Master’s Degree in Computer Science, Computer Engineering, Computer Application, Information Systems, or a directly related field, or the equivalent and 2+ years of experience in related occupations.

  • 2+ years of experience in the following:

  • One or more of the following ETL tools to build data pipelines/data warehouses: Talend Big Data, Informatica, DataStage, or Ab Initio

  • Programming using Scala, Python, R, or Java

  • ETL pipeline implementation using one or more of the following AWS services: Glue, Lambda, EMR, Athena, S3, SNS, Kinesis, Data Pipeline, or PySpark

  • Using real-time streaming systems: Kafka/Kafka Connect, Spark, Flink, or AWS Kinesis

  • Implementing big data solutions in the Hadoop ecosystem: Apache Hadoop, MapReduce, Hive, Pig, Sqoop, NoSQL, or Databricks

  • One or more of the following databases: Snowflake, AWS Redshift, Oracle, SQL Server, Teradata, Netezza, Hadoop, MongoDB, or Cassandra

  • Building conceptual, logical, or physical database designs using the following tools: erwin, Visio, or Enterprise Architect

  • Designing and implementing data pipelines in on-prem and cloud environments

  • Telecommuting is permitted. This is a 40-hour/week position. Applicants must have authorization to work permanently in the U.S. Those interested in this position may apply at www.jobpostingtoday.com, Ref# 86603, for consideration.

WAVICLE BENEFITS

  • Health Care Plan (Medical, Dental & Vision)
  • Retirement Plan (401k, IRA)
  • Life Insurance (Basic, Voluntary & AD&D)
  • Unlimited Paid Time Off (Vacation, Sick & Public Holidays)
  • Short-Term & Long-Term Disability
  • Employee Assistance Program
  • Training & Development
  • Work From Home
  • Bonus Program

$100,000 - $150,000 a year

EQUAL OPPORTUNITY EMPLOYER

Wavicle is an Equal Opportunity Employer and committed to creating an inclusive environment for all employees. We welcome and encourage diversity in the workplace regardless of race, color, religion, national origin, gender, pregnancy, sexual orientation, gender identity, age, physical or mental disability, genetic information or veteran status.
