Big Data Ecosystem Expert Developer
At Nordea we are expanding in the area of Big Data applications and data-driven products. We are currently looking for an experienced expert big data developer who possesses excellent technical and communication skills.
You will be part of an ambitious, high-performing unit that enjoys delivering results.
You will work in an agile team setup in a highly international environment with ambitious people who are determined to reach our common goal: meeting the unique needs of our customers.
The position we offer
We are building a new competence area within Classification Engines in Denmark, with responsibility for customer-enriching engines, including price/earnings, segmentation and compliance engines.
As an expert Big Data Developer, you will be responsible for successful design and implementation. We are at the stage where we are going live with a newly developed compliance application. One of your primary responsibilities will be the continuous development of this application, as well as helping establish DevOps practices for Big Data technologies. You will solve complex problems on a daily basis.
You will also get the chance to work with some of the newest technologies and methods on the market, including Machine Learning, Deep Learning and the proven Hadoop ecosystem platform from Cloudera.
You will work in an international environment and you will be expected to cooperate very closely with your Nordic colleagues.
You will be part of a team of developers, IT analysts and testers located in Scandinavia, Poland and India. You will be situated in Høje Taastrup, Denmark.
We also offer:
- The opportunity to participate in ambitious international IT projects and influence our big data solutions
- Comprehensive training program and education to prepare for this position
- Work based on international standards using state-of-the-art tools
- Possibility to participate in training programs enhancing your qualifications
- An incentive package that includes a medical services package for you and your family, a sports card, subsidised catering and cinema tickets, and the possibility of joining a corporate life insurance scheme
- Access to Employee Pension Scheme
The qualifications you need
To succeed in the role we expect you to have most of these qualifications:
- Deep understanding of the principles of distributed systems
- Knowledge and experience with: Apache Spark, Apache Hadoop, Sqoop, Oozie, Pig, Hive, Impala, Kafka
- Experience with NoSQL databases such as MongoDB, Cassandra or HBase
- Experience with building ETL pipelines
- Production experience with performance tuning of Hadoop/Spark solutions
- Experience with machine learning APIs such as Spark MLlib or Google TensorFlow
- Familiarity with OOD and design patterns
- Experience with Scala or Python is a plus
- Full-stack Java development, including frontend, server, database and integration
- Experience with large-scale system integration projects
- Experience working with RDBMS and knowledge of SQL
- Integration with RESTful and SOAP/OSB services
- Knowledge and experience with web servers such as Tomcat, Jetty or Weblogic
- Knowledge and experience with tools like Maven, Git, and the Atlassian suite (Stash/Bitbucket, Jira, Confluence)
- Hands-on experience with Red Hat Linux and shell scripting
- Experience working with agile methodology will be highly valued
You will be part of a new unit, and we therefore expect you to positively adapt and respond to change.
We are looking for a candidate with strong drive and a willingness to learn new technologies; you should be a good team player who is good at sharing knowledge with teammates.
More information and how to apply
For more information, please contact Carsten Mohr on +45 5547 9719.
Application deadline is 5 July 2017.