Full Time Job

Sr Data Engineer

HBO

Seattle, WA 01-21-2022
 
  • Paid
  • Full Time
  • Senior (5-10 years) Experience
Job Description
The Job

The Sr. Data Engineer will build scalable data solutions to develop and grow WarnerMedia's direct-to-consumer streaming service, HBO Max.

The Daily
• This role will partner with a world-class team of data architects, data scientists, software engineers, and others to grow analytics capabilities used to inform business decisions, product optimizations, and product design.
• Take lead in driving a culture of high quality, innovation, and incremental experimentation.
• Design, implement and fully own critical portions of HBO Max's analytical data models and pipelines used to populate them.
• Lead the collaboration with stakeholders to understand requirements, design data structures using best practices, and develop batch or real-time data pipelines to ensure the timely delivery of high-quality data.
• Own, maintain, and support existing data pipelines and platforms that store petabytes of data quickly and reliably.
• Creatively explore and demonstrate the value of data for the HBO Max team. Be part of a driving force that enables and empowers better decision making through data.
• Have a knack for using flexible and scalable methodologies that can be applied to a broad set of problems across the Data Engineering organization.
• Always be thirsty for optimizations and eager to find new, improved ways to address old problems.

The Essentials
• SQL Rockstar: can write complex SQL logic, champion best practices, and drive the development of other data engineers.
• Proven Computer Science fundamentals in Algorithms and Data Structures.
• 5-7 years of experience in Python or other programming languages such as Scala or Java.
• Significant experience with Big Data technologies such as Apache Spark.
• 2-3 years of experience with MPP/Cloud Data Warehouses such as Snowflake.
• 3 years of experience with core AWS services related to data engineering.
• Education: Bachelor's or Master's degree in computer science or related field.

The Nice to Haves
• Experience with using Apache Kafka or AWS Kinesis as a message bus for storing and processing high-velocity real-time data.
• Experience with building data notebooks using Jupyter, AWS Sagemaker, or Databricks.
• Experience with building and productionizing ML Pipelines and other solutions.
• Experience with graph-based workflow orchestration engines such as Apache Airflow.
• Familiarity with marketing and media platforms/data sets.

Jobcode: Reference SBJ-rnve8o-3-16-76-43-42 in your application.