
Data Engineer – Enterprise Data Platform

Domain: Digital, Data
Seniority: Medior (3+ years)
Location: Brussels, Brussels-Capital
Work type: Consultancy
Work model: Full-time, Hybrid

Responsibilities

Join an innovative team dedicated to shaping the future of the energy sector! You will be part of a large-scale, enterprise data organization operating in a mission-critical environment where data is treated as a strategic asset. The role sits within a mature yet continuously evolving data platform that supports analytics, reporting, data governance, and data exchange across the organization.

You will work as part of an agile product team, contributing directly to data products used by business stakeholders. This is not an ivory-tower engineering role: the focus is on building usable, reusable, and business-relevant data capabilities, combining solid engineering with a strong product mindset.

Your responsibilities include:

  • Designing and building data pipelines and data transformations for analytical and operational use cases.
  • Creating efficient storage structures and materialized views across analytical technologies.
  • Developing data exchange endpoints consumed by business users and applications.
  • Implementing data pipelines using Azure-based services, including Databricks and related components.
  • Writing Python-based data processing logic in a Databricks environment.
  • Collaborating with business analysts and stakeholders to refine requirements and propose coherent data solutions.
  • Contributing to testing, data quality checks, and DevOps practices within the team.
  • Supporting data governance activities, including data modeling, lineage, glossary, and data quality.
  • Writing and maintaining clear technical documentation.

Requirements

You have hands-on experience (typically 3–5+ years) in at least two of the following areas:

  • Working with SQL Server for analytical or data integration use cases.
  • Building data pipelines using Azure services, such as Azure Databricks, Azure Data Factory, Azure Functions, Azure Stream Analytics, and Azure DevOps.
  • Implementing data pipelines or enrichments using Python in a Databricks environment.

In addition, you:

  • Are fluent in English (mandatory).
  • Are comfortable working closely with business analysts and stakeholders.
  • Contribute actively to testing, documentation, and DevOps practices.
  • Take a transversal view on data, avoiding silos and promoting reuse.
  • Demonstrate a product-oriented mindset rather than a purely technical focus.
  • Apply analytical thinking and continuous improvement in your work.
  • Communicate clearly and collaborate effectively in cross-functional teams.

Nice to have

  • Experience with Power BI.
  • Experience with SAP data integration.
  • Exposure to additional technologies such as Redis, RabbitMQ, Neo4j, or Apache Arrow.
  • Fluency in German, French, or Dutch (in addition to English).

Offer

Location: Brussels

Work mode: Hybrid

Onsite presence: 2 days per week

Work regime: Full-time

Contract: Freelance or Permanent

Job reference: #92763
Candidates must be legally authorised to work in the EU and possess the required language skills for the job location.