SELECT 'Hello world,' FROM db.greetings;
We are calling all Data Engineers to apply to Aftonbladet, Sweden's biggest news source, reaching more than 3.6 million people every day. We are renowned for our independent journalism and our credo: to make Sweden a little bit better, the world a little more understandable and life a little easier.
During 2019 Aftonbladet developed a new Data Strategy focused on transforming Sweden's biggest newspaper into a truly data-driven company. To enable this we are building a Data & Insights team tasked with collecting, structuring, enriching and analysing all available data. And there is a lot of data to dive into: we generate 400 million new events (over 20 billion data points) each and every day. We use a modern cloud-based Data Warehouse (Snowflake/AWS), Airflow, Docker and several front-end presentation tools, e.g. Tableau & Amplitude, for display and research.
Engineering is not only about creating pipelines, but also about connecting people to data. In our team, we want to do exactly that. Our mission is to "Enable all the people in Aftonbladet to do their own analysis and bring insights to actions" and we will achieve that by educating, evangelising and always supporting the business end-to-end.
Your contribution matters, and you will have a great impact on how we do business and on our fight for independent, strong, fact-based journalism.
You will be part of a great team of Data Analysts, Scientists and Engineers, and you will report to the Head of Data & Insights.

What you do: extract, load and transform data & monitor scheduled data flows
- Build large-scale ELT pipelines leveraging AWS, Snowflake, Airflow and many other state-of-the-art data and analytics technologies.
- Document, monitor and consistently improve our data flows by working together with the central data platform teams.
- Proactively survey the technical architecture, tooling and solution landscape, seeking opportunities to optimise time to data, data quality and model agility.
- Collaborate and coordinate with data engineering peers to design and ensure data flows across Schibsted Media.
- Ensure data privacy regulations are solidly and efficiently adhered to in all our data engineering work.
- Formulate, prioritise and refine user stories until they can be well understood by everyone else.
- Cooperate and communicate - we are a team and we help each other as much as we can.

Who you are: a structured, analytical & motivated team player
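Since this ad opens with a SQL one-liner, here is an equally lightweight sketch of the extract/load/transform loop the role revolves around. It is a standard-library Python toy, not our actual stack: in practice this runs on Snowflake, AWS and Airflow, and the event fields and table name below are invented for illustration.

```python
import json
import sqlite3

# Extract: raw page-view events, as they might arrive from a tracker.
raw_events = [
    '{"user_id": 1, "article_id": "a17", "ms_on_page": 45000}',
    '{"user_id": 2, "article_id": "a17", "ms_on_page": 12000}',
    '{"user_id": 1, "article_id": "a42", "ms_on_page": 90000}',
]

# Load: land the parsed events in a warehouse table (here: in-memory SQLite).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INT, article_id TEXT, ms_on_page INT)")
conn.executemany(
    "INSERT INTO events VALUES (:user_id, :article_id, :ms_on_page)",
    [json.loads(e) for e in raw_events],
)

# Transform: model the raw events into an analyst-friendly summary.
rows = conn.execute(
    "SELECT article_id, COUNT(*) AS views, AVG(ms_on_page) / 1000.0 AS avg_seconds "
    "FROM events GROUP BY article_id ORDER BY article_id"
).fetchall()
for article_id, views, avg_seconds in rows:
    print(article_id, views, avg_seconds)  # e.g. a17 2 28.5
```

The real pipelines differ mainly in scale and orchestration: the same extract-load-transform shape, but scheduled and monitored in Airflow over hundreds of millions of events per day.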
- You possess at least a bachelor's degree in a quantitative discipline (e.g. Statistics, Computer Science, Mathematics, Engineering or Informatics).
- You have 2-4 years of experience in big data or data engineering and are experienced in building and transforming data pipelines.
- You have a good business understanding and can switch seamlessly between diving into technical details and sketching the simplified big picture.
- You are structured and analytical - you know that every problem also has a solution.
- You are a great team player and a true believer in the principle: "If you want to go fast, go alone; if you want to go far, go together."
- You are fluent in English, SQL and at least one programming language like Python, Java or Scala.
- A basic understanding of Swedish would be beneficial - in the end, all our articles are written in Swedish, and so is most of the metadata.
- You are very curious, eager to learn new things, able to admit your mistakes and willing to learn from them.
- You don't think 'scrum' is the user interface of an old point-and-click adventure, and you are able to work in an agile way in a medium-sized team.