
Full Time Job

Principal Data Engineer

Peacock

New York, NY 03-19-2021
 
  • Paid
  • Full Time
  • Senior (5-10 years) Experience
Job Description

Responsibilities
Peacock is the place to stream thousands of hours of hit movies and shows, exclusive originals, live sports, news, and pop culture. We're in need of courageous leaders and creative problem-solvers who work hard and want to be at the epicenter of content, tech, and entertainment. NBCUniversal's Peacock is growing, and we're looking for another smart, passionate, and collaborative person to join our team.

We value inclusivity in our content and our people, and believe that success is only possible when we represent the world around us. So, are you ready to join our flock?

As part of the Direct-to-Consumer Decision Sciences team, the Principal Data Engineer will be responsible for creating a connected data ecosystem that unleashes the power of our streaming data. We gather data from across all customer/prospect journeys in near real-time, to allow fast feedback loops across territories; combined with our strategic data platform, this data ecosystem is at the core of being able to make intelligent customer and business decisions.

In this role, the Principal Data Engineer will lead the development and maintenance of optimized, highly available data pipelines that facilitate deeper analysis and reporting by the business, and will support ongoing operations related to the Direct-to-Consumer data ecosystem.

Responsibilities include, but are not limited to:
• Manage a high-performance team of Data Engineers and contractors
• Lead the team in designing, building, testing, scaling, and maintaining data pipelines from a variety of source systems and streams (internal, third-party, cloud-based, etc.), according to business and technical requirements.
• Deliver observable, reliable and secure software, with a focus on automation and GitOps.
• Continually improve the codebase, and actively participate in and oversee all aspects of the team's work, including agile ceremonies.
• Take an active role in story definition, assisting business stakeholders with acceptance criteria.
• Work with other Principal Engineers and Architects to share and contribute to the broader technical vision.
• Develop and champion best practices, striving towards excellence and raising the bar within the department.
• Develop solutions combining data blending, profiling, mining, statistical analysis, and machine learning to better define and curate models, test hypotheses, and deliver key insights
• Operationalize data processing systems (DevOps)

Qualifications/Requirements
• 8+ years of relevant experience in Data and/or Software Engineering
• Programming skills in one or more of the following: Python, Java, Scala, R, SQL; experience writing reusable, efficient code to automate analysis and data processes
• Experience processing structured and unstructured data into a form suitable for analysis and reporting, integrating with a variety of data metric providers across advertising, web analytics, and consumer devices
• Strong expertise in designing, implementing and maintaining highly-scalable distributed systems
• Experience with Google Cloud Platform or other Cloud Platforms
• Hands-on programming experience with the following (or similar) technologies: Apache Beam, Scio, Apache Spark, Apache Kafka, Flink, or Snowflake
• A good understanding of CI/CD pipelines and automated testing
• Experience with containerization tools such as Docker and Kubernetes
• Experience in progressive data application development, working in large-scale/distributed SQL, NoSQL, and/or Hadoop environments
• Experience building and maintaining dimensional data warehouses in support of BI tools
• Passion for quality and observability
• Understanding of application security standards
• Bachelor's degree with a specialization in Computer Science, Engineering, Physics, or another quantitative field, or equivalent industry experience

Desired Characteristics
• Experience with graph-based data workflows using Apache Airflow
• Experience building and deploying ML pipelines: training models, feature development, regression testing
• Strong Test-Driven Development background, with an understanding of the levels of testing required to continuously deliver value to production
• Experience with large-scale video assets
• Ability to work effectively across functions, disciplines, and levels
• Team-oriented and collaborative approach, with a demonstrated aptitude and enthusiasm for learning new methods, tools, practices, and skills
• Ability to recognize discordant views and take part in constructive dialogue to resolve them
• Pride and ownership in your work and confident representation of your team to other parts of NBCUniversal

Jobcode: Reference SBJ-rbqkwe-3-136-97-64-42 in your application.