Amadeus Group

Software Development Engineer - Big Data Processing

  • Permanent contract (CDI)
  • Nice (Alpes-Maritimes)
  • Software development

Job description

Join us and shape the future of travel

Shaping the future of travel has always been important to us at Amadeus. Today, with technology getting smarter by the minute, that future is more exciting than ever.

We work at the heart of the global travel industry. Amadeus offers you the opportunity to learn and grow and drive your own progression in an exciting and multicultural environment.

Our people are driven by a passion for 'Where next?' If you want to shape your career and the future of travel, Amadeus is the place for you.

Business environment

There is currently massive demand within the company for a standardized big data platform. Within the scope of the Travel 360 program, more than 23 different applications are planned to be hosted on our λbox platform in 2017.

The BIP (Business Intelligence Platform) department plays a key role in TPE by providing adapted, reliable, secure and efficient data storage and data processing solutions for Amadeus analytics applications. The group is responsible for:

· Big Data processing and analytics platform (a.k.a. λbox)
· Functional Monitoring
· Secured Logging and Billing Frameworks

The BIP department is structured into three main areas: one dedicated to DevOps activities (BOX), one dedicated to Development eXperience activities (BDX) and one dedicated to Logging and Monitoring (LMA).

Within this department, the R&D-AQG-DUI-BIP-BDX (λbox Development eXperience) team is in charge of defining the λbox platform, with a focus on facilitating the development of analytics applications.

Purpose

This position belongs to the Development eXperience activities (BDX). We are accountable for:

· Providing guidelines and support to implement efficient Spark jobs: Delivering expertise to applicative developers on writing Spark jobs properly is key to the success of the different projects. It also contributes to managing the platform's resources efficiently in order to minimize hardware cost.
· Providing MDW helpers for usual/common operations: Most analytics applications require common patterns such as reading/writing from/to a Kafka topic, reading/writing from/to HDFS, filtering data, etc. Our team provides middleware helpers in Scala that implement these patterns through simple configuration and come with a full set of monitoring features.
· Creating a local development environment enabling efficient BI development: Jointly with our Hadoop vendor, we create and maintain Docker containers that allow early integration of business code with the platform's libraries.
· Providing continuous integration of analytics applications through Software Workbench: Jointly with the team in charge of Software Workbench 2, our intention is to build a fully automated non-regression platform for analytics developments.
· Building expertise and anticipating evolutions of Hadoop technologies: Driven mainly by open source initiatives, the Hadoop ecosystem is evolving quickly. We are in charge of assessing the impact of these evolutions on the λbox platform.

Key accountabilities

· Contribute with applicative developers to define requirements for the development of new analytics solutions or the improvement of existing ones.
· Design technical solutions and perform feasibility studies associated with future standards (e.g. SQL-on-Hadoop technologies).
· Propose viable technical solutions to applicative developers for validation (e.g. non-regression test framework).
· Define Amadeus standards for analytics software development, extending the current Amadeus standards.
· Conduct unit, integration and performance tests of the software and ensure a level of quality in line with the Amadeus guidelines.
· Participate in the validation / acceptance phase of the product cycle ensuring the fine-tuning necessary to finalize the product.
· Produce software documentation / architecture presentations.
· Support the end user in the Production phase by debugging existing software solutions in response to Problem Tracking Records (PTR) and Change Requests (CR) issued from Product Management or Product Definition.

Education

· Post-secondary degree in Computer Science or related technical field or equivalent experience
· Fluent English.

Specific competencies

Technical Skills

· Knowledge of continuous integration/release/deployment
· Object oriented programming (Java).
· Functional programming (Scala).
· Experience with Hadoop technologies, including Spark, Hive and Impala, is a plus
· ACS and/or cloud computing knowledge is a plus

Any duplication or display of partial or full content of our job advertisement on any medium, such as brochures, websites, mail or emails (this list is not exhaustive), is strictly forbidden without prior formal authorisation from Amadeus.

Recruitment agencies: Amadeus does not accept agency resumes. Please do not forward resumes to our jobs alias, Amadeus employees or any other company location. Amadeus is not responsible for any fees related to unsolicited resumes.
