Job description
Would you like to be part of a team focused on improving the supplier experience as well as helping Amazon save lots of money? Are you passionate about data? Does the prospect of building pipelines to consume petabyte scale data excite you?
As a Data Engineer with Finance Operations, you will work in a large, extremely complex, and dynamic data-driven environment. We are looking for data engineers with expertise in, and a passion for, analyzing data, designing and building predictive and decision models, and designing metrics to measure the performance of the business. You will interact with business groups that rely on the metrics and on the decisions produced by those predictive models.
Our team is focused on building a better experience for our valued supplier partners. We advocate for improvements through technology, partnering with various teams to define, design, and prioritize enhancements to the ordering, fulfillment, payment, and dispute resolution processes with vendors and sellers.
Key responsibilities of the role include:
· Interface with business customers, gather requirements, and deliver complete reporting solutions
· Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions
· Develop a deep understanding of our vast data sources and know exactly how, when, and which data to use to solve particular business problems
· Work with engineers from upstream teams to root cause identified defects
Basic qualifications include:
· Bachelor's degree or higher in an analytical field such as Computer Science, Physics, Mathematics, Statistics, or Engineering
· Demonstrated ability in data modeling, ETL development, and data warehousing
· Coding proficiency in at least one modern programming language (e.g., Python, Java, Scala)
· Previous experience with distributed or load-balanced system design and development
· Strong verbal and written communication and data presentation skills, including the ability to communicate effectively with both business and technical teams
· Experience with big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.)