Scania as a company is shifting from being a supplier of trucks, buses and engines to a supplier of complete, data-driven and sustainable transport solutions.
Scania Financial Services (SFS) is at the beginning of its own digital transformation to support this and is now forming a team to pick up speed in the development parts of its program. SFS is a 1,000-person unit serving Scania customers, dealers and distributors in 60 markets through 16 business units (BUs) by offering competitive and innovative financing and insurance solutions.
This position belongs to the Data Platform group at Scania IT but will be dedicated to the SFS Digital Transformation initiative.
You will provide the data needed for different analytical purposes, covering both traditional analytics (DW/BI) and advanced analytics (real-time solutions, ML).
You will develop automated data pipelines covering data ingestion, data integration and security, but also handle manual data wrangling to support specific needs. As a Data Engineer / Data Pipeline Developer, you will work in teams together with other developers and business representatives.
Our Data Platform is under continuous development and combines traditional RDBMS technology with the Hadoop framework, implemented on premises but on its way to a hybrid/inter-cloud implementation.
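To give a flavour of the pipeline work described above, here is a minimal sketch of a batch pipeline with an ingestion, integration and load step. It is purely illustrative: the record layout, the reference table and all function names are invented for this example, not taken from the actual SFS platform.

```python
# Hypothetical batch pipeline sketch: ingest raw rows, integrate them
# with reference data, and load an aggregated, curated result.
# All names and the data format are invented for illustration.

def ingest(raw_rows):
    """Parse raw CSV-like rows into dicts, dropping malformed ones."""
    parsed = []
    for row in raw_rows:
        parts = row.split(",")
        if len(parts) != 3:
            continue  # simple data-quality gate: skip malformed rows
        contract_id, market, amount = parts
        parsed.append({"contract_id": contract_id,
                       "market": market,
                       "amount": float(amount)})
    return parsed

def integrate(rows, market_names):
    """Enrich each row with a market name from a reference table."""
    return [{**r, "market_name": market_names.get(r["market"], "unknown")}
            for r in rows]

def load_curated(rows):
    """Aggregate amounts per market, standing in for a DW load step."""
    totals = {}
    for r in rows:
        totals[r["market_name"]] = totals.get(r["market_name"], 0.0) + r["amount"]
    return totals

raw = ["C1,SE,100.0", "C2,SE,50.0", "C3,DE,75.0", "bad row"]
markets = {"SE": "Sweden", "DE": "Germany"}
curated = load_curated(integrate(ingest(raw), markets))
print(curated)  # {'Sweden': 150.0, 'Germany': 75.0}
```

In a production setting the same three stages would typically run on the platform's distributed tooling (e.g. Spark over HDFS, with Kafka feeding the ingestion step) rather than plain Python, but the shape of the pipeline is the same.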
We are looking for an experienced data pipeline builder and data wrangler who enjoys optimizing data flows. You will need to be comfortable navigating a rapidly changing environment. You are curious, want to learn and understand the business challenges, and find new ways to solve them.
You enjoy working in teams and sharing knowledge with your teammates.
Wanted Skills and Experience:
A technical person with at least 3 years of experience in one or more languages and frameworks such as Python, R, SQL, Spark or Scala. You have experience building, optimizing, maintaining and deploying 'big data' pipelines and data sets, as well as a successful history of manipulating, processing and extracting value from large datasets. You must have experience with the Hadoop ecosystem: Hive, HBase, HDFS, Kafka, etc. Advanced working SQL knowledge and experience with relational databases are also requirements.
BSc or MSc in Computer Science or related field (or equivalent experience).
Meritorious Skills and Experience:
Experience working with cloud and hybrid solutions
Good understanding of architectural design patterns, multidimensional models and data warehousing theory
Practical experience with agile working methods, CI/CD and DevOps
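As an illustration of the "advanced working SQL knowledge" asked for above, the sketch below runs a window function over a small relational table. It uses `sqlite3` from the Python standard library purely as a stand-in for a production RDBMS; the table and columns are invented for the example.

```python
# Hypothetical example of analytical SQL: each contract's share of its
# market's total, computed with a window function. sqlite3 is only a
# stand-in here; the schema is invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contracts (market TEXT, amount REAL)")
conn.executemany("INSERT INTO contracts VALUES (?, ?)",
                 [("SE", 100.0), ("SE", 50.0), ("DE", 75.0)])

rows = conn.execute("""
    SELECT market,
           amount,
           amount / SUM(amount) OVER (PARTITION BY market) AS share
    FROM contracts
    ORDER BY market, amount
""").fetchall()

for market, amount, share in rows:
    print(market, amount, round(share, 2))
```

The same pattern (partitioned aggregates alongside row-level detail) applies directly in the Hive and data-warehouse SQL dialects mentioned in the requirements.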
We offer exciting career opportunities where you develop and maintain the skills needed to meet future challenges. We offer our employees Scania's own training centre and medical centre, apartment rental options, the possibility to rent vacation houses through Scania, team activities, and the chance to work with the premium brand in heavy trucks in a truly global environment. Read more about Scania, our core values, benefits and much more on our website.
If you have any questions about the position, contact Rolf Nordin (Manager), +46 (0)8553 51 067, or Johanna Ingels (Recruitment Specialist), +46 (0)70 081 29 90.
Please apply before the 16th of February. Selections and interviews will be ongoing throughout the process.