Company Description
As Hungary’s most attractive employer in 2025 (according to Randstad’s representative survey), Deutsche Telekom IT Solutions is a subsidiary of the Deutsche Telekom Group. The company provides a wide portfolio of IT and telecommunications services with more than 5,300 employees, serving hundreds of large corporate customers in Germany and other European countries. DT-ITS received the Best in Educational Cooperation award from HIPA in 2019 and was recognized as the Most Ethical Multinational Company in the same year. The company continuously develops its four sites in Budapest, Debrecen, Pécs and Szeged and is looking for skilled IT professionals to join its team.
Job Description
Key Responsibilities
- Serve as technical lead and solution architect for a Snowflake Lakehouse platform across Non-Prod and Prod environments
- Design, implement, and govern multi-environment architecture, including DEV, UAT, DATA_DEV, PREPROD, PROD, and DATA_RESTRICTION environments
- Define and enforce best practices in zero-copy cloning, dynamic data masking, row-level security, and access control policy design
- Architect CI/CD workflows for infrastructure and data pipelines using Terraform, dbt, and Git (GitOps)
- Lead data ingestion, transformation, and provisioning strategies for Power BI via both Import and DirectQuery modes
- Direct integration with SAP (via Xtract Universal), Azure Blob Storage, and external APIs using Azure Data Factory
- Optimize query performance and storage cost efficiency, leveraging semi-structured data support (JSON, Parquet) and Snowflake’s virtual warehouse scalability
- Oversee data governance and regulatory compliance, advising on platform security, privacy-preserving analytics, and inter-organizational data sharing
- Collaborate with infrastructure and security teams on cross-domain identity, role design, and cloud provider neutrality
- Mentor Snowflake developers and analysts, build reusable assets, and support strategic platform decisions (e.g., onboarding new SAP sources, expanding PaaS integrations)
Qualifications
Required Qualifications
- 5–7 years of deep technical experience with Snowflake in enterprise-scale projects
- Proven expertise in multi-account/multi-environment Snowflake setup, cloning, role-based access control, and secure data sharing
- Strong background in data engineering and modeling, including support for complex, semi-structured data at scale
- Expert-level knowledge of dbt, Terraform, and CI/CD for data infrastructure and deployments
- Experience integrating Snowflake with enterprise BI tools (especially Power BI) and handling performance-critical datasets
- Familiarity with data governance frameworks, including data anonymization, access policies, and audit requirements
- Hands-on experience with Azure Data Factory, Azure Blob Storage, and SAP extraction tools (e.g., Xtract Universal)
- Fluency in SQL and experience with scripting (e.g., Python) for orchestration or transformation pipelines
- Excellent communication skills; able to lead architectural discussions with both technical and non-technical stakeholders
- Fluent English (German a plus)
Nice to Have
- Experience working in telecom, real estate, or procurement domains with a strong regulatory/data governance focus
- Understanding of Snowflake’s pricing model, usage monitoring, and cost optimization strategies
- Experience with metadata and lineage tools (e.g., Collibra, Alation, or similar)
- Familiarity with integration patterns using Tardis (Telekom Architecture for Decoupling and Integration of Services)
Additional Information
- Please note that remote working is only possible within Hungary due to European taxation regulations.