Job Description
Machine Learning is changing how our customers operate, from Marketing and Advertising to Supply Chains. The Professional Services team is a unique center of excellence that solves challenging data problems across AWS customers. Are you passionate about Big Data, Machine Learning, and Artificial Intelligence? Our consultants deliver proof-of-concept projects and workshops, and lead implementation projects. These engagements focus on key media customer solutions, preparing large sets of data for Machine Learning use cases.
Big Data Processing
We produce, process, and analyze terabytes to petabytes of customer data, collected from channels such as onsite, free search, paid search, social, paid social, email, and associates. We rely heavily on AWS services such as AWS Flow, S3, EC2, and EMR (Spark).
Delivering an accelerated rate of innovation to our customers depends on the quality and breadth of the data we feed into the Machine Learning models.
We are looking for an outstanding individual who combines superb technical, communication, and analytical capabilities with a demonstrated ability to get the right things done quickly and effectively. This person must be comfortable working with a team of data scientists and customers to raise the bar on the data pipelines we build and maintain.
The ideal candidate for our team is a thinker and a doer: someone who loves algorithms and mathematical precision, but at the same time enjoys implementing real systems, and is motivated by the prospect of delivering spectacular business returns for customers.
* Demonstrated ability in data modeling, ETL development, and data warehousing.
* A desire to work in a collaborative, intellectually curious environment.
* Industry experience as a Data Engineer or in a related specialty (e.g., Software Engineer, Business Intelligence Engineer, Data Scientist, Business Analyst), with a track record of manipulating, processing, and extracting value from large datasets.
* Experience with a data warehouse technology (Oracle, Teradata, Netezza, Redshift, etc.) and relevant data modeling.
* Experience with Hadoop or other map/reduce "big data" systems and services.
* Coding proficiency in at least one modern programming language (e.g., Python, Java).
* Degree in Computer Science, Engineering, Mathematics, Physics, or a related field, and 3+ years of work experience.
* Experience processing large amounts of data in formats such as Parquet and ORC, in both batch and streaming modes.
* Experience processing files in various image, audio, and video media formats.
* Knowledge of security, encryption, and data governance.