DevOps Engineer, Data & Analytics Global IT

DSV – Global Transport and Logistics

DSV is one of the best-performing companies in the transport and logistics industry. 75,000 employees in more than 90 countries work passionately to deliver great customer experiences and high-quality services – as part of the operation or in a variety of supporting roles. If you have drive and talent and enjoy responsibility, we’ll give you the support you need to explore your potential and further your career.


DSV is seeking a DevOps Engineer, Data & Analytics Global IT. The position is located in Vantaa, Finland.


You like to:

  • Learn new things from knowledgeable colleagues
  • Deliver working software that is thoroughly tested – covering both the “happy path” and the edge cases
  • Find smart solutions to complex problems
  • Break solutions down into iterations that deliver value quickly as MVPs before being enriched with nice-to-have functionality later
  • Automate tasks and deliver nice developer experiences for your teammates
  • Take responsibility in the team and the projects you are working on
  • Reach out to others for help or clarification whenever you need it, and to ensure alignment
  • Create realistic mock data so you can test things swiftly on synthetic data before you get access to production data
  • Weigh the need for reusability against speed and simplicity in the current implementation to find the right balance

The Role

You will fill the role of a DevOps engineer with a focus on automation, security and improving the developer experience for your teammates. Depending on your aspirations and experience, you might also contribute with backend application development. It is expected that you have experience with many of the following technologies:

  • Automation / Scripting: Bash / Python / PowerShell Core / Groovy
  • Relational database: MySQL
  • NoSQL database: MongoDB
  • Authentication: OpenID Connect (we use Red Hat Keycloak as identity broker)
  • Version control: Git (we use Atlassian Bitbucket as a GUI on top of Git)
  • Webservices: Java REST backend services
  • Containerization: Docker
  • Container orchestration: Kubernetes
  • CI/CD Pipelines: Jenkins (our templates are written in Groovy)
  • Load balancing: NGINX
  • OS: Linux
  • Installation scripts: Ansible

There are also adjacent competencies that you will be expected to learn about and engage with, such as:

  • Event streaming: Confluent Kafka, Kafka Streams
  • ML model serving: TensorFlow Serving, TorchServe
  • Requirements: Jira
  • Documentation: Confluence
  • Frontend technologies: React, Material UI, JavaScript/TypeScript, Redux
  • Test framework: Jest

Our Team

We are an ambitious team with a flat hierarchy and a mix of young and very experienced people, who work according to the following principles:

  • We celebrate victories together
  • We take responsibility for mistakes and learn from them
  • We design for scale but build only for the near future
  • We value working software and informal alignment over tedious documentation
  • We make decisions based on knowledge and insight rather than hierarchical structures
  • Decisions are the product of conversations between people with different competencies (not one person)
  • Everyone can express their honest opinion

We have all the competencies needed to build awesome products inside the team:

  • Product owner
  • Business analysts
  • Application developers (frontend + backend)
  • Data engineers
  • Data scientists
  • ML engineers
  • DevOps engineers

The use cases

The focus of our team is to build advanced end-to-end solutions that create direct business value for DSV’s divisions, for example:

  • Customs declaration automation
  • Vendor invoice automation
  • Address validation
  • ETA prediction
  • And many more to come…

The word “advanced” underlines that the use cases we solve tend to have a high degree of complexity, requiring non-deterministic problem solving (i.e., the use of ML/AI), near real-time data processing, high availability, vertical and horizontal scalability, and an exceedingly high volume of transactions. However, fancy technologies and accurate ML models alone do not solve the issues at hand; we strive to combine our competencies to build holistic solutions where the underlying complexity is hidden from the user, creating simple and value-adding experiences.

For further details on this permanent position, please contact Product Owner, Data & Analytics Kamil Gubriel via e-mail, or Lead IT Developer, Data & Analytics Lasse Meincke.

The recruitment activities have already started, so send your application with your salary request as soon as possible, and no later than 1 February 2022. Applications in English are appreciated.