Data Engineer

14690
1 Jun, 2025 to 30 Nov, 2025
Stockholm (30% remote)

Role:
Design, build, and maintain data pipelines on Google Cloud Platform (GCP) using services such as BigQuery, Dataplex, Dataflow/Dataproc, Pub/Sub, Cloud Functions, Cloud Run, Cloud Scheduler and Cloud Workflows.

Develop and implement data models using dbt (data build tool) to support analytical reporting and data science needs.

Collaborate with data analysts and business stakeholders to understand data requirements and translate them into data models.

Implement data quality checks and monitoring to ensure high data quality in the data warehouse.

Optimize data pipelines and data models for performance and cost efficiency.

Document and communicate data models to the wider team and train colleagues on how to use them effectively.

Manage infrastructure as code using Terraform.

Build and maintain CI/CD pipelines using GitHub Actions.

Essential requirements:

Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent experience.

Proven experience as a Data Engineer or in a similar role.

Strong experience with Google Cloud Platform (GCP) and its data services such as BigQuery, Cloud Run, and Pub/Sub.

Experience in data modelling and transformation using dbt.

Knowledge of SQL and good experience with Python.

Strong problem-solving skills and attention to detail.

Excellent communication skills and the ability to explain complex topics in simple terms.

Cloud certification: Google Cloud Platform (GCP)



Please explain how you meet all the requirements when applying.


Utilization: 100%

Location: Stockholm

Period: 01-06-2025 to 30-11-2025

Last day to apply: 27-03-2025

We present candidates regularly. This means that we sometimes remove assignments from our website before the final application deadline. If you are interested in an assignment, we recommend that you submit your application as soon as possible.