Data Engineer (Azure / Databricks)

20103
Stockholm (Onsite)

Data Engineer (Azure / Databricks) – Stockholm

We are currently looking for a Data Engineer for an assignment within a large, data-driven organization undergoing continuous development of its data platforms and analytics capabilities.

The role is part of a cross-functional team working with modern data engineering, analytics, and platform development, with a strong focus on scalable data processing and high-quality data delivery.


Key Responsibilities

  • Design and build data processing solutions using Ab Initio (GDE, PSETS, XFR, DML)
  • Schedule and monitor jobs in Ab Initio Control Center to ensure stable data pipelines
  • Integrate with Ab Initio Metadata Hub and related toolkits
  • Build and maintain data structures across platforms such as Hive, HBase, Azure SQL, Azure Storage, Teradata, and Oracle
  • Collaborate with data engineers, data scientists, and business stakeholders
  • Proactively onboard new data sources and improve existing solutions
  • Work closely with cross-functional teams to understand and meet data needs
  • Mentor team members and contribute to high engineering standards


Required Skills & Experience

The consultant should have:

  • A bachelor’s or master’s degree in a relevant field (or equivalent experience)
  • Hands-on experience with ETL tools such as Ab Initio or Informatica
  • Strong expertise in Databricks, SQL, Unix, shell scripting, and relational databases (e.g. Oracle, Teradata)
  • Experience working with cloud platforms, preferably Microsoft Azure
  • Experience with data platforms such as data lakes and data warehouses
  • Strong analytical skills and ability to troubleshoot data quality and production issues
  • Experience in mentoring and providing technical guidance
  • Experience working in agile environments (e.g. SAFe)
  • Strong communication skills and fluency in English


Meritorious Experience

It is considered a strong advantage if the consultant has experience with:

  • Snowflake
  • Neo4j (Graph Database)
  • Python
  • Streaming technologies such as Kafka
  • CI/CD, release processes, and tools such as Git, Jira


Assignment Details

  • Location: Stockholm
  • Workload: 100%
  • Start: 2026-04-07
  • End: 2026-07-15
  • Scope: ~560 hours


Application

Please submit:

  • CV in Word format, written in English
  • A completed requirement matrix, including:
      • Yes/No per requirement
      • A short motivation
      • A clear reference to where in the CV the requirement is fulfilled