Data Engineer (f/m/x)

Graz, Vienna, Pörtschach, Steiermark, Austria February 25, 2026

Over the past 25 years, NETCONOMY has grown from a startup into a team of 500 people across 10 European locations. We believe in the power of agile and cross-functional collaboration, bringing together people from diverse backgrounds to build outstanding digital solutions.

YOUR JOB

As a Data Engineer, you’ll play a key role in building modern, scalable, and high-performance data solutions on Google Cloud Platform (GCP). You’ll be part of our growing Data & AI team, designing and implementing data architectures that help clients unlock the full potential of their data.

Your key responsibilities will include:

  • Building efficient and scalable ETL/ELT processes to ingest, transform, and load data from various structured and unstructured sources (databases, APIs, streaming platforms) into BigQuery and Cloud Storage

  • Implementing data ingestion and real-time processing using Dataflow (Apache Beam) and Pub/Sub for batch and streaming workflows

  • Developing SQL transformation workflows with Dataform, including version control, testing, and automated scheduling with built-in quality assertions

  • Creating efficient, cost-optimized BigQuery queries with proper partitioning, clustering, and denormalization strategies

  • Orchestrating complex workflows using Cloud Composer (Apache Airflow) and Cloud Functions for event-driven data processing

  • Implementing centralized data governance and metadata management using Dataplex with automated cataloging and lineage tracking

  • Monitoring and optimizing data pipelines for performance, scalability, and cost using Cloud Monitoring and Cloud Logging

  • Collaborating with data scientists and analysts to understand data requirements and deliver actionable insights

  • Staying up to date with GCP advancements in data services, BigQuery features, and data engineering best practices
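To give a flavor of the ETL/ELT and data-quality work described above, here is a minimal toy sketch in pure Python (no GCP dependencies; all names, fields, and rules are illustrative, not part of NETCONOMY's actual stack): it extracts raw rows, transforms them with a derived date partition key in the spirit of BigQuery date partitioning, applies a quality assertion in the spirit of Dataform assertions, and "loads" by grouping rows per partition.

```python
from datetime import date

# Toy "extract": rows as dicts, standing in for an API or database source.
def extract():
    return [
        {"order_id": 1, "amount": "19.99", "ts": "2026-02-25"},
        {"order_id": 2, "amount": "5.00", "ts": "2026-02-25"},
    ]

# Toy "transform": cast types and derive a partition key,
# mirroring date-partitioned warehouse tables.
def transform(rows):
    return [
        {
            "order_id": row["order_id"],
            "amount": float(row["amount"]),
            "partition_date": date.fromisoformat(row["ts"]),
        }
        for row in rows
    ]

# Toy quality assertion: fail the load if any row violates a rule.
def assert_quality(rows):
    bad = [r for r in rows if r["amount"] <= 0]
    if bad:
        raise ValueError(f"{len(bad)} rows failed quality checks")
    return rows

# Toy "load": group rows by partition key, as a partitioned table would.
def load(rows):
    partitions = {}
    for row in rows:
        partitions.setdefault(row["partition_date"], []).append(row)
    return partitions

if __name__ == "__main__":
    partitions = load(assert_quality(transform(extract())))
    print({str(k): len(v) for k, v in partitions.items()})
```

In a real pipeline the extract step would read from a database, API, or Pub/Sub subscription, the transform and assertions would live in Dataflow or Dataform, and the load target would be BigQuery; the shape of the stages, however, is the same.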

