
Full Time Job

Sr Data Engineer

Pluto TV

San Francisco, CA 09-10-2021
 
  • Paid
  • Full Time
  • Mid (2-5 years) Experience
Job Description

Job Posting Title: Senior Data Engineer

Department: CBS Digital Media Data/Data Products

Location: San Francisco CA

About us:

ViacomCBS Streaming (formerly CBS Interactive) is a division of ViacomCBS that encompasses free, paid, and premium streaming services, including Pluto TV, Paramount+, CBS Sports Digital, and CBS News Digital.

CDM Data/Data Products Overview:

We are a passionate group serving the data needs of all CBS Digital Media properties, including CBS, CBSNews, CBS Sports, and related sports properties. Our team is made up of a variety of engineering roles, including Data, Data Product, and Ingestion Engineering. We are responsible for driving the CDM data strategy and best practices around data collection, pipelines, usage, and infrastructure.

Role Details:

The Senior Data Engineer should possess a deep sense of curiosity and a passion for data. They will be expected to use their knowledge of the data to build smart data products that drive the business and provide insights into the CBSi audience.

They will have strong communication skills and the ability to convey knowledge of data structures and tools throughout the CBS Digital Media organization.

This candidate will be expected to lead projects from inception to completion and to help mentor members of the team on best practices and approaches around data.

They will use their skills in reverse engineering, analytics, and creative experimental solutions to devise data and BI solutions. This engineer supports data pipeline development which includes machine learning algorithms using disparate data sources.

Your Day-to-Day:

Work with large volumes of traffic, ad, and user-behavior data to build pipelines that enhance raw data.

Improve efficiency by staying current on the latest technologies and trends, and introduce them to members of the team.

Find answers to business questions through hands-on exploration of data sets using Jupyter, SQL, dashboards, statistical analysis, and data visualizations.

Develop prototypes to prove out strategies for data pipelines and products.

Mentor members of the team and department on best practices and approaches.

Lead initiatives to improve the quality and effectiveness of our data, working with other members of engineering, BI teams, and business units to implement changes.

Meet with members of the Product, Engineering, Marketing, and BI teams to discuss and advise on projects and implement data-driven solutions that drive the business.

Break down and communicate highly complex data problems as simple, feasible solutions.

Extract patterns from large datasets and transform data into an informational advantage.

Partner with the internal product and business intelligence teams to determine the best approach to data ingestion, structure, and storage, then work with the team to ensure these are implemented correctly.

Qualifications:

What you bring to the team:

You have:

Bachelor's degree and 4+ years of work experience in Analytics/Measurement/Data Operations fields or consulting roles with a focus on digital analytics implementations.

Experience with data management systems, both relational and NoSQL (e.g., HBase, Cassandra, MongoDB)

Proficient in Python.

Able to write SQL to perform common types of analysis.

Experience with exploratory data analysis using tools such as Jupyter/IPython notebooks, pandas, and matplotlib.

Strong problem-solving and creative-thinking skills.

Experience with a Python web framework such as Django or Flask.

Demonstrated development of ongoing technical solutions, along with developing and maintaining documentation and, at times, training affected teams.

Experience developing solutions to business requirements via hands-on discovery and exploration of data.

Exceptional written and verbal communication skills, including the ability to communicate technical concepts to non-technical audiences and to translate business requirements into data solutions.

Familiarity with Informatica and ETL concepts.

Experience building and deploying applications on a cloud platform (Google Cloud Platform preferred)

Experience with Docker and container deployment.

The ability to influence and apply data standards, policies, and procedures.

The ability to build strong commitment within the team to support the appropriate team priorities.

A commitment to staying current with new and evolving technologies via formal training and self-directed education.

You might also have:

Experience with full-stack application development.

Familiarity with Data Modeling.

Familiarity with Hadoop pipelines using Spark and Kafka.

Familiarity with version control systems (Git, SVN).

The ability to perform statistical analyses using tools such as R, or NumPy/SciPy with Python.

Jobcode: Reference SBJ-d28v85-3-142-195-24-42 in your application.