Atos SE (Societas Europaea) is a leader in digital services with pro forma annual revenue of circa € 12 billion and circa 100,000 employees in 72 countries. Serving a global client base, the Group provides Consulting & Systems Integration services, Managed Services & BPO, Cloud operations, Big Data & Cyber-security solutions, as well as transactional services through Worldline, the European leader in the payments and transactional services industry. With its deep technology expertise and industry knowledge, the Group works with clients across different business sectors: Defense, Financial Services, Health, Manufacturing, Media, Utilities, Public sector, Retail, Telecommunications, and Transportation.
Atos is focused on business technology that powers progress and helps organizations to create their firm of the future. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and is listed on the Euronext Paris market. Atos operates under the brands Atos, Atos Consulting, Atos Worldgrid, Bull, Canopy, Unify and Worldline.
Big Data Developers (GCM3)
4-6 years of total experience, with 2-4 years in Big Data
· Strong development experience in Java (Core Java, Spring Framework), Scala, or both, with very good experience developing REST web services. Ability to write JUnit test cases, and experience with Maven, Jenkins, JIRA and GitLab. Experience working in an Agile methodology and using continuous-integration tools is very important.
· Strong development experience with Hadoop ecosystem components (Hortonworks HDP or Cloudera CDH distribution), including utilities and services such as YARN, Cloudera Manager (or Ambari), Sqoop, Flume, Pig and Hive.
· Experience developing distributed applications that solve large-scale data-processing problems using Spark (with Scala or Java). Knowledge of and hands-on experience with both Spark Core and Spark Streaming are mandatory.
· Experience with Kafka as a message bus, including integrating Spark Streaming with Kafka, is important.
· Knowledge of and experience with at least one NoSQL database (HBase, DataStax Enterprise Cassandra, MongoDB, etc.) is required, HBase being the most important. Proof-of-concept-level exposure alone is not sufficient; we need project-implementation experience.
· Basic familiarity with and experience of a cloud environment such as AWS or Azure will be considered a plus.
· Prepare documentation, and support change-control and QA processes consistent with enterprise requirements.
· The developer must be able to own a task and independently deliver good-quality, well-tested code. The developer must have a keen understanding of performance issues and how to resolve them, and must be able to take a requirement through design and coding without close supervision.
· Good analytical and critical-reasoning ability and strong written and verbal communication skills are absolute requirements.
· Define low-level technical designs (LLD) based on the architecture and high-level design (HLD)
· Understand the architecture and translate it into technical design
· Coding, unit testing and debugging are the primary responsibilities
· Prepare and execute unit test cases
· Performance tuning and troubleshooting
· Communicate with customers & other project stakeholders professionally
Java (Core Java, Spring) / Scala
Excellent knowledge of and experience in Java, Scala, or both is mandatory.
Big Data Distribution (Hortonworks HDP 2.5+ or Cloudera CDH 5.11+)
3-4 years of development experience with Hortonworks, Cloudera, or both is mandatory, including experience with common utilities such as Hive, Sqoop, Flume, Pig and YARN.
Kafka
Development experience with Kafka is a very important skill.
Spark Core & Spark Streaming
Very good experience with both Spark Core and Spark Streaming is mandatory, primarily using Scala (Java is acceptable).
NoSQL: HBase, DataStax Enterprise (Cassandra), MongoDB
Detailed knowledge and experience of HBase is mandatory. Knowledge and experience of DSE (Cassandra) or MongoDB will be considered a plus.
CI and Build Tools (Maven, Jenkins, GitLab, JUnit, JIRA)
Experience with Maven, Jenkins, GitLab and JIRA will be considered a very important skill.
Communication and Customer Engagement
The candidate will interact directly with the client. Communication skills, accuracy, autonomy, and the ability to record, report and document are essential. The candidate should be a self-starter, able to work independently.
Critical Reasoning and Estimation
The candidate should have good analytical and critical-reasoning ability, as well as experience producing estimates based on requirements.
Compensation and Benefits
A great incentive to join the Atos team is the market-competitive range of benefits that the Company provides. These include a competitive salary and a number of core benefits, such as: 25 days of annual leave plus bank holidays; private medical insurance, into which all new starters are automatically opted; an attractive stakeholder pension scheme, with employer contributions of up to 10% of basic salary; Life Assurance; Income Protection; Personal Accident Insurance; and a Season Ticket Loan. In addition, Atos operates a flexible benefits scheme that allows you to purchase discounted products and services. Comprehensive training and development is also delivered in a variety of ways, leading to accreditation where required.
If you wish to apply for this position, please click below to complete our online application form and attach your CV in Word, RTF or plain-text format.
Atos does not discriminate on the basis of race, religion, colour, sex, age, disability or sexual orientation. All recruitment decisions are based solely on qualifications, skills, knowledge and experience and relevant business requirements.
We are committed to making reasonable adjustments to the applications process for people with disabilities.