Data Engineer (m/f/d)

Sevlievo | April 20, 2026 | Full Time

Shape & Create - And make it happen!

The Villeroy & Boch Group with its Ideal Standard brand is one of the world's leading manufacturers in the ceramics and lifestyle sector. With our innovative and stylish products from the Dining & Lifestyle and Bath & Wellness segments, we have been creating moments and rooms to feel good in since 1748. Our success is based on the passion, design expertise and innovative strength of our more than 13,000 employees in 42 countries.

Want to become part of us? #shapeandcreate

Your responsibilities:

  • Pipeline Design & Operation: You develop and manage complex ETL/ELT pipelines using modern loading strategies (delta mechanisms, CDC, upserts).
  • Data Integration: You connect diverse data sources – from cloud applications like Salesforce to classic SQL servers and modern REST interfaces.
  • SAP Ecosystem: You utilize tools such as SAP SLT, SAP Data Intelligence, or Datasphere to efficiently extract and transform data into the cloud environment.
  • Data Modeling: You support the development and optimization of our Data Warehousing (primarily in BigQuery) to create a high-performance and scalable data foundation.
  • Quality Assurance: You ensure the integrity of our data streams through clean coding and regular monitoring. 

Your profile:

  • Education: Bachelor's or Master's degree in Business Informatics, Information Technology, Mathematics, or a comparable field.
  • Experience: At least 3 years of professional experience, preferably in industry or retail.
  • Language skills: Excellent written and spoken English.
  • Technical Skills:
    • Tool Stack: You bring sound experience in ETL development, ideally with a focus on CDAP / Google Cloud Data Fusion.
    • Cloud Expertise: You are proficient in using Google Cloud Tools, especially BigQuery and Cloud Storage.
    • SQL: You master SQL at a high level – whether in the cloud (BigQuery SQL) or on-premises (Oracle, MS SQL Server).
    • Programming Skills: You are proficient in Python for automating processes or implementing complex transformations.
    • Infrastructure Understanding: Ideally, you have basic knowledge of Kubernetes (GKE) to understand or support containerized data applications.
    • Methodology: Terms like Delta/Full-Load or Change Data Capture (CDC) and relational data modeling are part of your standard repertoire.

  • Soft Skills:
    • Analytical Thinking: You love to understand data structures and translate technical challenges into efficient solutions.
    • Communication: Excellent written and verbal skills for explaining technical topics clearly.
    • Attitude: A proactive mindset, meaning you like to take initiative, and a strong desire to learn and grow your career in data engineering.
    • Teamwork: Ability to work effectively both on your own and as part of a larger team.

Think outside the box with us! #shapeandcreate
