Amazon

Data Architect

  • Seattle (King)
  • Bachelor's Degree
  • Architecture / Town planning

Job description

The AWS Tech360 team is customer obsessed (or obsessed with customer metadata). We are a start-up environment that will be the authoritative source of customer metadata and the solutions team for the applications that put AWS' customer strategies into action. We provide actionable insights and improve the customer experience by enabling the world's most efficient and effective selling organization. Our group is looking for a Data Architect to lead and design the architecture of our next-generation data transformation, reporting, analytics, and warehouse technologies used to acquire, enrich, transform, and store customer metadata.
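The acquire, enrich, transform, and store flow described above can be sketched in plain Python. This is a hypothetical illustration only: the record fields, the enrichment rule, and the in-memory "warehouse" are assumptions, not the team's actual stack.

```python
# Minimal sketch of an acquire -> enrich -> transform -> store pipeline
# for customer metadata. All stage logic and field names are illustrative.

def acquire():
    # Acquire: raw customer metadata records, e.g. pulled from a source system.
    return [
        {"customer_id": "c1", "region": "us-west-2", "monthly_spend": 1200.0},
        {"customer_id": "c2", "region": "eu-west-1", "monthly_spend": 300.0},
    ]

def enrich(record):
    # Enrich: derive a segment attribute from existing fields.
    record["segment"] = "enterprise" if record["monthly_spend"] >= 1000 else "smb"
    return record

def transform(records):
    # Transform: aggregate spend per segment for reporting.
    totals = {}
    for r in records:
        totals[r["segment"]] = totals.get(r["segment"], 0.0) + r["monthly_spend"]
    return totals

def store(totals, warehouse):
    # Store: write the aggregates into a (here, in-memory) warehouse table.
    warehouse["spend_by_segment"] = totals
    return warehouse

warehouse = store(transform([enrich(r) for r in acquire()]), {})
print(warehouse)  # {'spend_by_segment': {'enterprise': 1200.0, 'smb': 300.0}}
```

In a production setting each stage would map onto managed services (for example, object storage for raw records and a columnar warehouse for aggregates) rather than Python dicts.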

This role focuses on internal solutions including HPC, batch data processing, Big Data, and business intelligence, with a specific emphasis on designing data solutions that help our sales and marketing teams leverage data to develop business insights.

Responsibilities include:
· Expertise – Learn and effectively use AWS services such as Amazon Elastic Compute Cloud (EC2), AWS Data Pipeline, Amazon S3, Amazon DynamoDB, Amazon Relational Database Service (RDS), Amazon Elastic MapReduce (EMR), and Amazon Redshift.
· Design – Build short proof-of-concept models that demonstrate how AWS services can support new distributed computing solutions, including the migration of existing applications and the development of new applications on AWS cloud services.
· Push the envelope – Cloud computing is reducing the historical “IT constraint” on businesses. Imagine bold possibilities and find innovative new ways to satisfy business needs through Big Data / Business Intelligence cloud computing.
Amazon aims to be the most customer-centric company on earth. Amazon Web Services (AWS) provides a highly reliable, scalable, low-cost infrastructure platform in the cloud that powers critical applications for hundreds of thousands of businesses in 190 countries around the world.

Desired profile

BASIC QUALIFICATIONS

· BA/BS degree or equivalent experience; Computer Science or Math background preferred.
· 7+ years of experience in IT platform implementation in a highly technical and analytical role.
· 5+ years of experience in Big Data platform implementation, including 3+ years of hands-on experience implementing and performance-tuning Hadoop/Spark.
· Understanding of database and analytical technologies in the industry including MPP and NoSQL databases, Data Warehouse design, BI reporting and Dashboard development.
· Track record of thought leadership and innovation around Big Data.
· Experience with analytic solutions applied to the Marketing or Risk needs of enterprises.
· Highly technical and analytical, possessing 5 or more years of IT platform implementation experience.
· Deep understanding of Apache Hadoop 1.x/2.x and the Hadoop ecosystem. Experience with tools such as Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, and Avro.
· Familiarity with SQL-on-Hadoop technologies such as Hive, Pig, Impala, Spark SQL, and/or Presto.
· Experience developing software code in one or more programming languages (Java, JavaScript, Python, etc).
· Current hands-on implementation experience required.
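As context for the Hadoop/Spark experience asked for above, the MapReduce model those platforms implement can be sketched in pure Python. This is a toy word-count, not production Hadoop code: map emits key–value pairs, shuffle groups them by key, and reduce aggregates each group.

```python
from collections import defaultdict

# Toy word-count illustrating the MapReduce model behind Hadoop.

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in the input.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group all emitted values by key.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: aggregate each key's values into a single count.
    return {key: sum(values) for key, values in grouped.items()}

lines = ["big data big insights", "data warehouse"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'insights': 1, 'warehouse': 1}
```

Real Hadoop or Spark jobs distribute these same three phases across a cluster, which is where the performance-tuning experience listed above comes in.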
