Data Engineer (Amsterdam or Antwerp)
Our client is one of the largest media companies in the Benelux. In addition to newspapers and magazines, the organization runs several online platforms, reaching about 90% of the Dutch and 80% of the Flemish population each month. The organization has a strong focus on these online services, which generate a wide variety of data, and the possibilities to explore and analyze that data and to develop data-driven solutions are endless.
People within the organization have a strong work ethic: colleagues are ambitious and set high goals. At the same time, having fun together is considered vital. The culture is very open, and everyone has the freedom and space to be themselves.
What will you do?
As a Data Engineer you will work on one of the largest data lakes in the Netherlands, which provides real-time insights and personalized applications to millions of people daily. Working closely with several business units and data teams, you will build data-driven applications using the newest big data technologies, which can efficiently handle the continuous data streams of online media.
You will be working on:
- Unlocking and enriching structured and unstructured data from clickstreams, content (text, audio and video) and operational systems;
- Deploying production models to various personalization platforms;
- The further development of the Group Data Platform;
- Advising data analysts and scientists, as well as marketers and digital sales managers, on applications and the efficient use of data and tooling.
You will be working alongside 10 data engineers and data warehouse specialists. In terms of technology, they use SQL, Spark, Airflow, Docker, Kubernetes, Scala, Python, Redshift and AWS infrastructure.
What will you get?
- A good salary;
- A 36-hour workweek;
- A personal budget of 12% (of which 8% is holiday allowance);
- Participation in the profit-sharing scheme;
- Development opportunities through education, traineeships and certificates;
- Working with the newest stack;
- Knowledge sharing with tech colleagues.
Who are you?
You have a minimum of 3 years of experience in a scale-up environment. You work autonomously and you are proactive in taking on new challenges. In addition, you are:
- Capable of extracting insights from large datasets using SQL and Spark;
- Experienced in deploying and scheduling applications in the cloud with tools such as Airflow, Docker, Kubernetes and Lambda functions, or prepared to learn this;
- Familiar with programming languages such as Scala and/or Python, and you embrace best practices in software development (git, coding standards, CI/CD pipelines);
- Experienced with AWS databases and infrastructure (a plus).