Context of the mission / Objective(s) of the job
Our client is actively searching for a data-related profile to increase the capacity of its Data Management team:
A senior Data Engineer in an Azure Data environment: analysis, implementation and maintenance of ETLs using Data Factory / Mapping Data Flows and PySpark in Databricks
- Prepare and execute testing activities
- Design performant solutions
- Contribute to the standards applicable to those ETLs
- Review ETLs implemented by junior data engineers and coach them
The candidate will join the Data Management team, which, among other responsibilities, manages the company's DWH and Data Lake and is responsible for data governance and tooling.
Architecture, design & implementation are managed within the team.
Mission period
Expected start date: ASAP
Duration: 12 months (extension possible)
Language requirement
| Language | Speaking | Reading | Writing | Mandatory | Preferable |
|----------|----------|---------|---------|-----------|------------|
| English  | 3        | 3       | 3       | X         |            |
| Dutch    | 3/1      | 3/1     | 3/1     | X         |            |
| French   | 3/1      | 3/1     | 3/1     | X         |            |

1 = Basic – 2 = Good – 3 = Fluent
*Dutch and French: one fluent, the other at least basic.
Localisation
Brussels centre.
Level of education required
Master's degree in Informatics, or equivalent through experience.
Required Knowledge and Experience
Personal Skills (soft skills – intangible qualities or traits that enhance our interactions)
Mandatory
- Works well both independently and in a collaborative team environment
- Solution-driven / pragmatic
- Strong problem-solving skills
- Proactive and flexible
- Strong verbal communication skills
Preferable
- Client-friendly
Business experience required (work experience)
Mandatory (Years)
- Experience with analytics use cases – Mid-level (2 y – 5 y)
Preferable
- Experience in the insurance or financial domain
- Experience on projects involving different technology stacks and integration aspects
Technical experience required (hard skills related to physical or digital tools)
Mandatory (Years)
- Cloud Data platform in Azure – Mid-level (2 y – 5 y)
- Devops (Infrastructure as Code, Continuous Integration / Delivery) – Mid-level (2 y – 5 y)
- Azure Data Skills (Data Lake Gen2, Synapse) – Mid-level (2 y – 5 y)
- Experience in Databricks – Mid-level (2 y – 5 y)
- Experience in Data Factory – Mid-level (2 y – 5 y)
- Experience in Python – Mid-level (2 y – 5 y)
- Experience in Data Flow – Mid-level (2 y – 5 y)
- Knowledge about Data Management: metadata management, lineage, data quality principles… – Mid-level (2 y – 5 y)
- Experience in Power BI and semantic data modelling – Mid-level (2 y – 5 y)
Functional experience required (job experience in particular industry and in particular function)
Mandatory (Years)
- Database modelling via modelling tools such as Erwin or Visio – Mid-level (2 y – 5 y)
- Ability to interact with Business teams and gather Business Requirements – Mid-level (2 y – 5 y)
- Ability to translate Business Requirements into functional & technical Requirements – Mid-level (2 y – 5 y)
Objective of the job
Three main axes:
- Implementation of BI Business projects
- Quality assurance on ETLs
- Coaching of junior data engineers