Senior IT Developer, Data & Analytics
Job Req Number: 56537
Time Type: Full Time
Do you want to build scalable applications with modern cloud technologies? Do you believe value is best created in cross-functional teams with direct user access?
Innovating and designing new products as well as tweaking and maturing existing ones, you will cooperate with data scientists, DevOps engineers, ML engineers, frontend developers, data engineers, business experts and other specialists to create the digital products of tomorrow.
Join a team focused on our most valuable digital products
You will join the Data & Analytics team. The purpose of our team is to build advanced end-to-end products that create direct business value for DSV’s divisions, including:
- Customs declaration automation
- Vendor invoices automation
- Booking transparency
- Address validation
- ETA prediction
The use cases we solve tend to have a high degree of complexity, requiring non-deterministic problem solving (i.e., the use of ML/AI), near real-time data processing, high availability, vertical and horizontal scalability, and an extremely high volume of transactions.
However, fancy technologies and accurate ML models do not solve the issues at hand alone. We strive to combine our competencies to build holistic solutions where the underlying complexity is hidden from the user, creating simple and value-adding experiences. We do this through close, ongoing dialogue with our end-users – that is one of the benefits of having our users inside the organization!
Your new unit is characterized by a startup mindset, and it is divided into cross-functional product teams with a mix of young and highly experienced colleagues. We strive to base our work on knowledge and insight rather than hierarchical structures, and we make sure that our decisions grow out of conversations between people with different competencies rather than being made by one individual.
Build high-quality software with fault tolerance and scalability
As our next Senior IT Developer, you will use the right tools in the toolbox to solve the issues at hand. You will engage in solution architecture discussions as well as building, testing, and deploying the software using our standard CI/CD pipelines.
Additionally, you will be:
- Building microservices for processing data, reading from and writing to the database, exposing data to other applications, and more.
- Using the architectural patterns that are relevant for a specific context such as event-based data streaming, request-response web services, file transport jobs, and more.
- Making sure that relevant logs are created, are understandable, are shared with our central logging platform, and that the necessary alerts are set up.
- Testing your software from a functional, quality, fault-tolerance, performance, and scalability perspective.
- Building security features into the solution such as federated authentication and authorization, role-based access control, and similar.
- Engaging in a “guild” for backend developers to share knowledge, technical patterns and ways of working across product teams.
- And much more…
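As a flavour of one of the patterns named above, a request-response web service can be sketched in a few lines with the JDK's built-in HttpServer. This is purely illustrative: the `/health` endpoint, port, and JSON body are assumptions for the sketch, not part of DSV's actual stack.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class HealthService {
    // Starts a minimal request-response service exposing GET /health.
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/health", exchange -> {
            byte[] body = "{\"status\":\"UP\"}".getBytes();
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

In practice such a service would also carry the logging, authentication, and fault-tolerance concerns listed above; the sketch only shows the request-response shape itself.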
Using the right technology for the use case
You thrive in an environment where you can use modern cloud-based technologies that are fit for purpose, and you enjoy staying up to date with the latest technologies. You have broad experience with many of the technologies in the list below, but we do not expect you to have experience with all of them:
- Backend applications: Mostly coded in Java, Scala (and some C#)
- Event streaming: Confluent Kafka (KStreams etc.)
- Database technologies: Mostly MongoDB Atlas (and some PostgreSQL, MySQL)
- Version control: Git
- CI/CD pipelines: Jenkins, Argo, Azure DevOps
- Requirements: Jira
- Documentation: Confluence
- Authentication: OIDC, OAuth2, SAML
- Containerization: Docker
- Container orchestration: Kubernetes
- Logging, monitoring & alerting: ELK stack
Working with us, you will get to know these systems as well:
- ML Frameworks: TensorFlow / PyTorch
- ML model serving: TensorFlow Serving, TorchServe
- ML model development: Mostly coded in Python
- MLOps platform: Kubeflow, MLflow, Seldon Core, Knative, KServe
- Load balancing: NGINX, Cloud LBs
- Installation scripts: Ansible, Terraform
- Test frameworks: Jest, JUnit, JMeter
- Identity & access management broker: Red Hat SSO
Want to know more and apply?
We will be happy to answer any questions you may have regarding the position and your options at DSV. You are welcome to contact Staffan W. Nielsen at +45 25 41 78 37.
We look forward to receiving your application via the link below as soon as possible. We will process the applications as we receive them.