Company Description
BETSOL is a cloud-first digital transformation and data management company offering products and IT services to enterprises in over 40 countries. The BETSOL team holds several engineering patents, has been recognized with industry awards, and maintains a net promoter score that is twice the industry average.
BETSOL’s open source backup and recovery product line, Zmanda (Zmanda.com), delivers up to 50% savings in total cost of ownership (TCO) and best-in-class performance.
BETSOL Global IT Services (BETSOL.com) builds and supports end-to-end enterprise solutions, reducing time-to-market for its customers.
BETSOL offices are set against the vibrant backdrops of Broomfield, Colorado and Bangalore, India.
We take pride in being an employee-centric organization, offering comprehensive health insurance, competitive salaries, 401K, volunteer programs, and scholarship opportunities. Office amenities include a fitness center, cafe, and recreational facilities.
Job Description
Position Overview
We are looking for a highly skilled data warehouse engineer with exceptional ETL skills to play a crucial role in designing and implementing cutting-edge data pipeline infrastructures and models while providing essential support for our data warehouse. Success in this role will be fueled by expertise in data pipelines and data warehousing, along with excellent communication skills. This position works extensively with our health plan solution, focusing on bringing in external data sources, building data marts, and preparing for a migration to Azure Cloud and Fabric One.
Note: This is a senior position requiring candidates who can hit the ground running with extensive ETL and data warehousing experience.
Responsibilities:
- Develop and maintain data pipelines and API-based or file-based data flows between source systems and the data warehouse
- Use innovative tools and techniques to automate common data preparation and integration tasks with the goal of reducing defects and ensuring data quality
- Implement best practices to ensure the integrity of data with exception-handling routines
- Provide source-to-target mapping development, support, and maintenance
- Lead troubleshooting efforts and form interdisciplinary task forces to resolve ETL issues
- Design, develop, and deploy data structures and data transformations in the enterprise data warehouse using Python, SSIS, and ADF
- Maintain and extend the Epic Caboodle platform and develop custom Caboodle data modeling components
- Form relationships and coordinate with business stakeholders to identify data needs, clarify requirements, and implement solutions
- Contribute to the department’s short-term and long-term strategic plan
- Make appropriate recommendations on the management of data extraction and analysis
- Maintain knowledge of the current regulations and technologies related to data management
- Assist with data governance initiatives in the areas of data quality, data security, metadata, and master data management
- Actively contribute to all aspects of the data project lifecycle including request intake and acknowledgment, project estimation, time-tracking, and prioritization of tasks.
- Be an exemplary team player with excellent collaboration skills
- Exhibit outstanding customer service skills with stakeholders
- Perform other duties as required or assigned.
Qualifications
Must-have qualifications:
- 8-10 years of experience as a Data Engineer
- Strong ETL experience (SSIS, data warehousing concepts) - this is fundamental to success in this role
- Health plan experience strongly preferred
- Experience bringing external data sources into data marts
- Strong business knowledge of health plan operations and data
Experience Mix Options
- While Epic experience is highly valued, candidates with strong ETL experience and health plan knowledge but limited Epic experience will be considered
- Candidates from health plan organizations (like Blue Shield) with robust ETL skills may be a good fit even without extensive Epic background
Additional Qualifications
- In-depth knowledge of SQL, data warehouses, and data transformation techniques
- Proven experience with designing and building data pipelines
- Expert knowledge of metadata management and related tools
- Advanced knowledge of data ETL concepts, processes, and tools such as MS SSIS, ADF
- Advanced knowledge of Python
- Ability to read and understand various data structures
- Ability to work independently and as part of a team
- Strong analytical, technical, and troubleshooting skills
- Ability to assess requirements from multiple sources and their impact on potential solutions
- Ability to work in a complex environment
- Ability to be organized and proficient at tracking tasks, defining next steps, and following project plans
- Advanced knowledge of database and data warehousing concepts, including data lakes, relational and dimensional database design concepts, and data modeling practices
- Intermediate knowledge of Jupyter Notebooks
- Familiarity with Agile project management methods such as SCRUM, Lean, and/or Kanban
- Advanced knowledge of healthcare data structures, workflows, and concepts from Electronic Health Record systems such as Epic
- Knowledge of Azure cloud platform, Fabric data platform, ADF, and DevOps is highly preferred
Education
- Bachelor’s degree in a technical, scientific, and/or healthcare discipline; or equivalent work experience.
Additional Information
Licensure/Certifications
- Epic Cogito, Clarity, and Caboodle certifications are required within 120 days of hire
- All certifications must be maintained throughout employment
- Hours: Must be able to accommodate Pacific time zone hours
- Location: Remote
- Note: The position does not offer visa transfer or sponsorship options