Aviva

Data Engineer/ETL Developer

  • Internship
  • Markham, Canada
  • IT development

Job description

We are looking for a Data Engineer/ETL Developer to join our growing team of Data and BI experts. Aviva has embarked on a journey to build a next-generation data platform to support the growing need for data from business intelligence and analytics. The candidate will be responsible for designing, developing, and productionizing ETL jobs to ingest data into the Data Lake, load data into data marts, and extract data to integrate with various business applications.

Roles and Responsibilities:

  • Design and develop ETL pipelines to ingest data into Hadoop from different data sources (files, mainframe, relational sources, NoSQL, etc.) using Informatica BDM.
  • Parse unstructured and semi-structured data such as JSON and XML using Informatica Data Processor.
  • Analyze existing Informatica PowerCenter jobs, then redesign and redevelop them in BDM.
  • Design and develop efficient mappings and workflows to load data into data marts.
  • Perform gap analysis between various legacy applications to migrate them to newer platforms/data marts.
  • Write efficient queries in Hive or Impala and PostgreSQL to extract data on an ad hoc basis for data analysis.
  • Identify performance bottlenecks in ETL jobs and tune their performance by enhancing or redesigning them.
  • Work with Hadoop administrators and PostgreSQL DBAs to partition Hive tables, refresh metadata, and perform various other activities to enhance the performance of data loading and extraction.
  • Tune the performance of ETL mappings and queries.
  • Write simple to moderately complex shell scripts to preprocess files, schedule ETL jobs, etc.
  • Identify manual processes and queries in the Data and BI areas, and design and develop ETL jobs to automate them.
  • Participate in daily scrums; work with vendor partners, the QA team, and business users at various stages of the development cycle.

Skills Required:

  • 7+ years of experience designing and developing ETL jobs (Informatica or another ETL tool)
  • 3+ years of experience working on the Informatica BDM platform
  • Experience with the various execution modes in BDM, such as Blaze, Spark, Hive, and Native
  • 3+ years of experience working on the Hadoop platform, writing Hive or Impala queries
  • 5+ years of experience working on relational databases (Oracle, Teradata, PostgreSQL, etc.) and writing SQL queries
  • Deep knowledge of performance tuning for ETL jobs, Hadoop jobs, and SQL, including partitioning, indexing, and various other techniques
  • Experience writing shell scripts
  • Experience with Spark jobs (Python or Scala) is an asset
  • 1+ years of experience working with AWS technologies for data pipelines and data warehouses
  • 5+ years of experience building ETLs to load data warehouses and data marts
  • Awareness of the Kimball and Inmon data warehouse methodologies
  • Knowledge of the full Informatica product line, such as IDQ, MDM, IDD, BDM, Data Catalog, and PowerCenter, is nice to have
  • Must have experience working with the Agile Scrum methodology, and should have used Jira, Bitbucket, Git, and Jenkins to deploy code from one environment to another
  • Experience working in a diverse, multicultural environment with different vendors, onsite/offshore vendor teams, etc.
  • P&C insurance industry knowledge is an added asset
  • Certifications in the Informatica product suite as a developer are an asset

Additional Information

Aviva Canada is committed to providing accommodations for people with disabilities during all phases of the hiring process including the application process. If you require an accommodation because of a disability, we will work with you to meet your needs. Applicants need to make their needs known in advance. If you are selected for an interview and require an accommodation, you are encouraged to advise the Talent Acquisition Partner who will consult with you to determine an appropriate accommodation.
