Senior Data Engineer

🇵🇱 Poland - Remote
📊 Data 🟣 Senior

Job description

🌴 Form of work: long-term, full-time, 100% remote 👈

⏰ Start: ASAP 👈

Hi! 👋

We are looking for a Senior Data Engineer with strong experience in Databricks, Python/PySpark, and Snowflake for our client, a major U.S. legal-industry leader. The project focuses on modernizing and migrating systems and data from Snowflake to Databricks: from source extraction, through transformation and data-quality validation, to loading into new cloud-based environments.

We’re searching for someone with a consulting mindset (comfortable working directly with the client) who can operate independently within the project and navigate complex, changing requirements. This role includes full ownership of the Snowflake migration track (it is not a team-contributor role).

The ideal candidate has deep technical expertise in Apache Iceberg, Unity Catalog governance, and modern Databricks capabilities (Workflows, Serverless, Lakehouse Monitoring).

Due to the client’s U.S. location, some overlap with EST hours is required (availability until around 6 PM CET). During the first weeks of onboarding, additional afternoon availability may be needed.

Scope of responsibilities:

📍 Architecture & Design

- Snowflake data warehouse inventory and dependency mapping

- Migration architecture using Apache Iceberg for cross-platform interoperability

- Governance framework in Unity Catalog (role-based access, row-level security; see the sketch after this list)

- Full + incremental historical sync strategy
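For concreteness, here is a minimal sketch of the row-level security setup named above, using Unity Catalog row filters from PySpark. The catalog, schema, table, column, and group names (main.sales, orders, region, admins, emea_analysts) are hypothetical placeholders, not details from the posting:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Define a SQL UDF that decides row visibility: admins see all rows,
# EMEA analysts see only rows whose region column equals 'EMEA'.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.sales.region_filter(region STRING)
    RETURN IS_ACCOUNT_GROUP_MEMBER('admins')
        OR (IS_ACCOUNT_GROUP_MEMBER('emea_analysts') AND region = 'EMEA')
""")

# Attach the function as a row filter on the governed table.
spark.sql("""
    ALTER TABLE main.sales.orders
    SET ROW FILTER main.sales.region_filter ON (region)
""")
```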

📍 Migration execution

- Snowflake UNLOAD → Databricks ingestion (Auto Loader, Workflows; see the ingestion sketch after this list)

- Iceberg tables enabling read-write compatibility between Snowflake and Databricks

- Medallion architecture (Bronze/Silver/Gold) using Delta Lake / Iceberg

- Row-level security implementation in Unity Catalog

- CDC patterns for Snowflake → Databricks flows
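As referenced above, a minimal ingestion sketch, assuming Snowflake has unloaded Parquet files to an S3 location and Auto Loader lands them in a Bronze Delta table. Bucket, paths, and table names are hypothetical; a Workflows job would typically run this on a schedule:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical locations for the unloaded files and stream checkpoint.
raw_path = "s3://example-bucket/snowflake-unload/orders/"
checkpoint = "s3://example-bucket/_checkpoints/bronze_orders/"

(spark.readStream
    .format("cloudFiles")                            # Auto Loader source
    .option("cloudFiles.format", "parquet")          # Snowflake unload format
    .option("cloudFiles.schemaLocation", checkpoint)
    .load(raw_path)
    .writeStream
    .option("checkpointLocation", checkpoint)
    .trigger(availableNow=True)                      # process new files, then stop
    .toTable("bronze.orders"))                       # Bronze-layer Delta table
```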

📍 Optimization & handoff

- Cost optimization (Photon, Serverless SQL, workload tiering)

- Monitoring dashboards (data quality, pipeline health, cost)

- DataPact for automated quality testing (a generic reconciliation sketch follows this list)

- Production cutover support and documentation
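In the spirit of the quality-testing item above, a generic reconciliation check in PySpark (this is not DataPact, whose API is not reproduced here). It compares row counts and key coverage between a source extract and the migrated table; function, table, and column names are illustrative:

```python
from pyspark.sql import DataFrame

def reconcile(source: DataFrame, target: DataFrame, key: str) -> dict:
    """Compare a source extract against a migrated table:
    row counts, plus keys missing or unexpected on the target side."""
    report = {
        "source_rows": source.count(),
        "target_rows": target.count(),
        "missing_in_target": source.join(target, key, "left_anti").count(),
        "extra_in_target": target.join(source, key, "left_anti").count(),
    }
    report["match"] = (
        report["source_rows"] == report["target_rows"]
        and report["missing_in_target"] == 0
        and report["extra_in_target"] == 0
    )
    return report

# Example usage with hypothetical table names:
# report = reconcile(spark.table("snowflake_extract.orders"),
#                    spark.table("gold.orders"), key="order_id")
```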

Requirements:

⚡️ ~5 years of hands-on experience as a Databricks platform engineer

⚡️ Proven Snowflake → Databricks migration delivery

⚡️ Practical knowledge of Apache Iceberg & cross-platform data interoperability

⚡️ Expertise in Unity Catalog access & security (row-level security, access control)

⚡️ AWS cloud experience

⚡️ Strong Python / PySpark, DataFrames, Delta Tables skills

⚡️ Ability to lead independent delivery under tight timelines

⚡️ Strong client-facing and stakeholder management skills

⚡️ Fluent English for everyday communication with U.S. teams (B2/C1)

Nice to have:

⚡️ Databricks Professional certification

⚡️ FinOps knowledge (cost optimization, DBU monitoring) or financial-sector experience

⚡️ Infrastructure-as-Code (Terraform, Databricks Asset Bundles)

What we offer:

🎯 Transparent communication throughout recruitment and employment

🎯 Fast, candidate-friendly recruitment process

🎯 Remote-first culture – business travel kept to a minimum

🎯 Private healthcare (Medicover) & Multisport card for contractors
