Full Time Job

Data Engineer

ViacomCBS

Fort Lauderdale, FL 03-16-2021
 
  • Paid
  • Full Time
  • Entry (0-2 years) Experience
Job Description

ABOUT US:

ViacomCBS Streaming (formerly CBS Interactive) is a division of ViacomCBS that encompasses free, paid, and premium streaming services including Pluto TV, CBS All Access, CBS Sports Digital, and CBS News Digital.

Role Details:

The Data Engineer will be primarily responsible for designing and implementing data extracts from a wide variety of sources, including databases, APIs, and streaming services. They possess a deep sense of curiosity, a passion for building smarter products based on data, and the ability to communicate data structures and tools throughout the ViacomCBS Streaming organization. This individual will develop in Python and leverage a wide range of cloud technologies, notably the services and offerings of GCP, AWS, Kubernetes, Docker, Apache Airflow, Dataflow, etc. You will also work cross-functionally with the wider data, engineering, and product teams to implement data-driven plans that drive our business! Does this sound like you?

Your Day-to-Day:
• Design and develop highly scalable and reliable data engineering pipelines to process large volumes of data across many data sources in the cloud
• Identify, design and implement internal process improvements by automating manual processes and optimizing data delivery
• Take problems from inception all the way to completion - own the building, testing, deployment, and maintenance of the code that you work on
• Be part of the on-call rotation supporting our SLAs

Key Projects:
• Develop data pipelines for processing data from internal and external sources
• Migrate on-premises ETL processes to Python and Airflow
• Develop a process to capture different data points required for a reliable data quality system

QUALIFICATIONS:

What you bring to the team:

You have -
• Master's Degree in Computer Science or a related field
• 3+ years of applicable data engineering experience
• Strong proficiency in Python
• Hands-on experience with GCP Catalog Service
• Ability to write complex SQL to perform common types of analysis and aggregations
• Familiarity with Application Programming Interfaces
• Knowledge of Jira, GitHub, and Confluence
• Exceptional written and verbal communication skills
• Ability to work effectively on an Agile team and collaborate well with other team members

You might also have -
• Experience with Google Dataflow, Pub/Sub, Apache Airflow, or similar
• Experience with Docker containerization
• Familiarity with a NoSQL database such as MongoDB

FUNCTION: Data and Research
