HSBC

Big Data/Consultant Specialist/Payments and PCM

  • Pune
  • Bachelor's Degree
  • IT development

Job description

Big Data - Senior Data Engineer

Job Purpose:
A data lake is a central data repository that can store multi-structured (i.e., structured, semi-structured and unstructured) data in its native format. A variety of processing tools are then used to discover and govern the data, improving its overall quality and making it consumable; finally, the lake provides tools and exposes APIs through which consumers can explore the data and extract business value across several types of workloads.

The HSBC Payments Data Lake (PDL) aims to be the single, trusted, on-demand source for the Group's messaging, screening, investigations and processing data. This data is further enriched by linking it with customer identifiers, lines of business and reference datasets. The data is presented in a structured manner, leveraging the Group's investment in big data technology (the Hadoop platform).
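As a purely illustrative sketch of such a raw-to-curated flow (all paths, table names and columns below are hypothetical placeholders, not HSBC's actual schema), a PySpark job with Hive support might land semi-structured messages, enrich them against a reference dataset, and publish a structured table for consumers:

```python
# Illustrative raw-to-curated flow on a Hadoop-based data lake.
# All paths, tables and columns are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("pdl-curation-sketch")
         .enableHiveSupport()   # lets the job read/write Hive tables on the cluster
         .getOrCreate())

# Land multi-structured payment messages in native format (semi-structured JSON here).
raw = spark.read.json("hdfs:///datalake/raw/payments/2024-01-01/")

# Enrich with a reference dataset (e.g. customer identifiers), then publish
# a structured, governed Hive table for consumer workloads to query.
ref = spark.table("reference.customer_identifiers")   # hypothetical reference table
curated = (raw
           .withColumn("ingest_date", F.current_date())
           .join(ref, on="customer_id", how="left"))

curated.write.mode("overwrite").saveAsTable("pdl.payments_curated")
```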

Key characteristics of this role will be the ability to work closely with a demanding set of stakeholders, and to leverage the component parts of the broader CDO team. Requirements and demands need to be 'gated' and prioritized, and expectations set and managed, so that we are able to execute confidently and securely rather than being constantly 'blown off course'. For this reason, the successful incumbent will be highly skilled at facing off to multiple business stakeholders, be able to 'win them over' despite not always giving them what they want as soon as they'd want it, and exude confidence such that there is limited confusion or misunderstanding about progress.

· The Payment Data Programme is seeking a hands-on senior developer to support delivery of the CDO book of work and internal data projects at Global and Regional levels.
· Have a systems engineering background and the ability to compare and contrast different solutions to meet a business requirement.
· Provide technical thought leadership in the evaluation of new technologies to meet business requirements, and influence key stakeholders towards adoption.

Key Responsibilities:
· At least 5 years of industry experience in banking or financial services, with a focus on Big Data and advanced analytics
· As a Big Data technology developer, the successful candidate will perform hands-on development of the solution using the Hadoop stack (Hive, NoSQL stores, Spark, HBase, Pig, etc.) in an Agile/Scrum delivery environment
· Work collaboratively with Solution Architects and deliver solutions in alignment with the strategic vision
· Be responsible for analysing requirements and taking them all the way from the conceptual phase to implementation
· Experience implementing at least two production Hadoop clusters using the Hortonworks or Cloudera distribution, preferably in the cloud (AWS or Azure)
· Experience designing and building data ingestion pipelines using frameworks such as Kafka, Flume and NiFi (a streaming ingestion sketch follows this list)
· Experience integrating data from multiple sources, including NoSQL databases such as HBase
· Ability to develop using the Spark framework in Java or Python
· Hands-on experience designing and developing solutions in the cloud (AWS)
· Thorough understanding of Hadoop security best practices, with hands-on experience implementing Kerberos authentication, RBAC, TLS and data-encryption controls
· Undertake code reviews and release management reviews; assist juniors and peers with any technological or functional roadblocks and formally escalate them to the Team Lead/Project Manager
· Solid background in Big Data technologies, distributed file systems and advanced analytics
· At least 2 years of experience in Big Data application development; hands-on experience with the Hadoop and Spark stack (Core Java, MapReduce, HDFS, Hive, Pig, Sqoop and HBase)
· Extensive hands-on experience in shell scripting or Python programming
· Good to have: Datameer, Google Cloud Platform and/or AWS development experience
· Strong ability in performance tuning and resolving data-quality issues
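For illustration only, the kind of ingestion pipeline described above could be sketched with Spark Structured Streaming reading from Kafka (broker addresses, topic name and HDFS paths below are hypothetical; the job would need the spark-sql-kafka connector on its classpath at submit time):

```python
# Minimal Kafka-to-HDFS ingestion sketch using Spark Structured Streaming.
# Broker addresses, topic name and HDFS paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("payments-ingest-sketch").getOrCreate()

stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
          .option("subscribe", "payments.messages")
          .option("startingOffsets", "latest")
          .load())

# Kafka delivers key/value as binary; cast the payload to text for downstream parsing.
messages = stream.select(
    F.col("key").cast("string").alias("message_key"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_time"),
)

# Land raw messages on HDFS as Parquet; the checkpoint makes the job restartable.
query = (messages.writeStream
         .format("parquet")
         .option("path", "hdfs:///datalake/raw/payments/")
         .option("checkpointLocation", "hdfs:///datalake/checkpoints/payments/")
         .start())

query.awaitTermination()
```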

Person Specification


Knowledge/Experience:
· At least 2–3 years of Hadoop/Big Data development experience
· Unix shell scripting: intermediate to expert
· Control-M, Confluence, Jira, Excel, Notepad++
Skills Required
· Experience with Cloudera or Hortonworks distributions.
· Ability to work effectively within the organization's iterative, Agile methodology
· Self-starter who can work with minimal guidance, yet collaborate effectively within a team environment
· Strong communication skills (both verbal and written)
· Develop and review code and design documentation for all components of the Hadoop stack
· Must be familiar with DevOps
· Work with test/QA team to build automated test scripts
· Shares own expertise with others; may coordinate the activities of others/the team
· Promotes teamwork and works with other cross-functional teams
· Demonstrates creativity and the ability to develop and present new ideas and conceptualize new approaches and solutions
· Experience in big data, data warehousing, data analytics and MI
· Experience working in relevant market/context, i.e. Banking, Finance
· Experience working in Payments domain desirable but not essential
· Experience of using relevant software packages, e.g. Hadoop, Teradata, Tableau, QlikView, Ab Initio, Datameer
· Strong knowledge of data structures, algorithms, enterprise systems, and asynchronous architectures.
· Experience in Hadoop using Java, C++, HBase, Spark, Kafka, Hive, Pig and Splunk technologies.
· Experience with large distributed services and with building/operating highly available systems is a plus.
· Agile & waterfall development experience.
· Experience using version control and bug tracking tools.
· Ability to generate creative and innovative solutions for QA challenges and constraints.
· Ability to work well in a team environment and be able to effectively drive cross-team solutions that have complex dependencies and requirements.
· Strong technical vision, presentation and technology leadership skills.
· Expert-level SQL skills for data manipulation (DML) and validation (SQL Server, DB2, Oracle); a small validation sketch follows this list.
· Ability to handle multiple competing priorities in a fast-paced environment
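As a hedged illustration of the kind of SQL-based validation this calls for (the table, column and rule are hypothetical, reusing the curated table from the earlier sketch), a simple data-quality check can be run through Spark SQL:

```python
# Hypothetical data-quality validation via Spark SQL.
# Table name, column and rule are illustrative, not an actual HSBC schema.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("pdl-validation-sketch")
         .enableHiveSupport()
         .getOrCreate())

# Validation rule: every curated payment row must carry a customer identifier.
missing = spark.sql("""
    SELECT COUNT(*) AS missing_ids
    FROM pdl.payments_curated
    WHERE customer_id IS NULL
""").first()["missing_ids"]

if missing > 0:
    raise ValueError(f"{missing} curated rows are missing customer_id")
print("validation passed: no missing customer identifiers")
```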

Desired profile

Qualifications:
·  Bachelor's or Master's degree (in science, computing, information technology or engineering)
·  Certification in a big data technology – desirable but not essential
·  Hadoop certification – desirable but not essential
