
Data Engineer - Portfolio Brands

Publication date: 19 June 2024

This ad was retrieved from Platsbanken / arbetsformedlingen.se

Company Description

Do you want to play a key role in forming our data foundation for the Portfolio Brands at H&M? Would you like to be the driving force behind the data engineering agenda for the Portfolio Brands?

To support our Portfolio Brands in establishing the best possible customer growth, we are now looking for a Data Engineer who will help build and enhance our data landscape for the brands.

The Portfolio Brands Group incorporates COS, Monki, Weekday, & Other Stories, ARKET, Afound, Singular Society and All in Equestrian.


Job Description

Are you ready for an exciting new challenge? As a Data Engineer at Portfolio Brands, you’ll be responsible for designing, constructing, and maintaining the architecture of our integrated data products. Your key tasks will include building pipelines to extract, transform, and load data from various sources into data warehouses on platforms such as Google Cloud Platform (GCP).

In this role, you’ll solve complex problems by turning vast amounts of data into actionable business insights through advanced analytics, modeling, and machine learning. Additionally, you’ll act as a partner and spokesperson for relevant stakeholders across brands and AIAD.

We value diverse technical backgrounds and believe that your passion for data will thrive in our data-driven organization. Join us and contribute to implementing data-intensive solutions that shape the future of Portfolio Brands!

  • Take end-to-end responsibility for building, optimizing and supporting existing and new data products, working towards the defined target vision
  • Design and build explorative, predictive or prescriptive models, utilizing optimization, simulation and machine learning techniques
  • Be a champion of the DevOps mindset and principles, managing CI/CD pipelines, Terraform and cloud infrastructure, which in our context is GCP (Google Cloud Platform)
  • Ensure that our data products work as independent units of deployment and that their non-functional aspects follow the defined standards for security, scalability, observability, and performance
  • Evaluate and drive continuous improvement and reduce technical debt at the Portfolio Brands
  • Maintain expertise in the latest data/analytics and cloud technologies
  • Design, build and maintain real-time data pipelines from a variety of sources (streaming data, APIs, data warehouses, data mesh, messages, etc.)
  • Coordinate with other teams to design optimal patterns for data ingestion and egress, and lead data quality initiatives and troubleshooting
  • Work closely with stakeholders on the vision for existing data products and identify new data products to support our customer needs
  • Work with analytics teams within Portfolio Brands and across the H&M Group on topics related to our data modernization initiative
  • Leverage your understanding of software architecture, the software development lifecycle and software design patterns to write scalable, maintainable, well-designed and future-proof software
  • Ensure that assurance, audit, compliance and testing related to cyber security issues are carried out
  • Handle security governance and management, as well as threat assessment and information risk management


Qualifications

  • At least 4 years of hands-on experience, either as a Data Engineer on modern cloud data platforms or advanced analytics environments, or as a Software Engineer working with cloud technologies and infrastructure
  • Experience in data query languages (SQL, BigQuery or similar)
  • Experience in data-centric programming using one or more of Python, Java and/or Scala
  • Good understanding of different data modelling techniques and their trade-offs
  • Knowledge of SAS and Azure databases
  • Experience in working with Terraform and dbt
  • Experience in working with data visualization tools
  • Experience in GCP tools such as Cloud Functions, Cloud Run, Vertex AI, Dataflow, Dataproc and BigQuery
  • Experience in data processing frameworks such as Beam, Spark, Hive, Flink and Iceberg
  • Have a collaborative and co-creative mindset with excellent communication skills
  • Motivated by an environment that allows you to work and make decisions independently


Additional Information

This is a full-time position based in Stockholm. Apply by sending in your CV in English as soon as possible, but no later than 21 July 2024. We will review applications and interview candidates on an ongoing basis. Due to data policies, we only accept applications through the career page.
