
Full Time Job

Sr. Data Engineer

Pluto TV

West Hollywood, CA 03-12-2021
 
  • Paid
  • Full Time
  • Senior (5-10 years) Experience
Job Description

About The Brand

Pluto TV, a ViacomCBS company, is the leading free streaming television service in America, delivering 250+ live and original channels and thousands of on-demand movies in partnership with major TV networks, movie studios, publishers, and digital media companies. Pluto TV is available on all mobile, web, and connected TV streaming devices, and millions of viewers tune in each month to watch premium news, TV shows, movies, sports, lifestyle, and trending digital series. Headquartered in West Hollywood, Pluto TV has offices in New York, Silicon Valley, Chicago, and Berlin.
Overview and Responsibilities

The Sr. Data Engineer for streaming will be responsible for supporting and enhancing our Kafka- and Kinesis-based streaming data collection platforms. The ideal candidate will have hands-on experience with Kafka (open source or Confluent) and AWS Kinesis, along with proven experience working on cloud platforms and deep familiarity with AWS. A strong working knowledge of Kafka components (Topics, Producers/Consumers, KStream, KTable, KSQL) is a basic prerequisite, and experience with Kinesis components (Streams, Firehose, and Data Analytics) is strongly desired. The candidate should have worked on streaming systems that handle very large volumes of high-velocity data and should have a good understanding of streaming best practices.
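For candidates new to the terminology above, the core distinction between a KStream and a KTable can be illustrated without a broker. The following is a minimal plain-Python sketch (illustration only; the real Kafka Streams API is a JVM library that runs against a Kafka cluster): a KStream treats every record as an independent event, while a KTable is a changelog view in which each key retains only its latest value.

```python
# Sketch of KStream vs. KTable semantics (illustration only; the real
# Kafka Streams API is Java/Scala and runs against a Kafka cluster).

def kstream_view(records):
    """KStream view: every record is an independent event; nothing collapses."""
    return list(records)

def ktable_view(records):
    """KTable view: a changelog; each key keeps only its most recent value."""
    table = {}
    for key, value in records:
        if value is None:          # a None value acts as a tombstone: delete the key
            table.pop(key, None)
        else:
            table[key] = value
    return table

# A toy changelog of per-user watch counts (hypothetical data).
events = [("alice", 1), ("bob", 1), ("alice", 2), ("bob", None), ("alice", 3)]

print(kstream_view(events))  # all five events survive as a stream
print(ktable_view(events))   # only the latest value per key survives
```

The same duality is what makes KSQL queries over streams and tables behave differently: aggregations over a KStream see every event, while joins against a KTable see only current state.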

This is a critical role with a wide range of responsibilities, including:

This position will be part of Pluto TV's Data Engineering department and will work with the team responsible for collecting analytical streaming data from all our client software. This is a very fast-paced environment dealing with data at very large volume and velocity. You must have strong analytical and problem-solving skills as well as strong organizational and prioritization skills.

Basic Qualifications

We believe the right individual will have the following skills and experience to succeed in this role:
• Strong knowledge of Java, Python or Scala
• Strong knowledge of Kafka (Topics, Producer/Consumer, KStream, KTable, KSQL)
• Knowledge of AWS Kinesis (Streams, Firehose, and Data Analytics) is nice to have
• Knowledge of SQL and familiarity with modern cloud databases like Snowflake
• Good experience with the following technology stacks:
• Frameworks: Spring Framework, Apache Flink, JUnit, Mockito
• Cloud Technologies: AWS (VPC, Kubernetes, EC2, Lambda, CloudWatch, etc.)
• Deployment Tools: Jenkins, Docker, Helm, Terraform
• Monitoring tools: Grafana, Prometheus
• Strong technical and troubleshooting skills
• Strong experience with data streaming systems
• Ability to communicate open project risks and dependencies clearly and effectively with the data team and other internal teams

Additional Qualifications
• 7+ years' experience in data engineering and data streaming
• Demonstrated understanding of Kafka
• Demonstrated ownership and ability to lead several projects simultaneously
• Demonstrated strong time-management, prioritization and leadership skills
• Demonstrated strong troubleshooting and problem-solving skills
• Strong interpersonal skills, with the ability to cultivate relationships and negotiate with internal clients
• Ability to meet deadlines and partner timelines
• Demonstrated strong written and oral communication skills
• Bachelor's Degree or Equivalent

Jobcode: Reference SBJ-gq68yz-3-137-174-216-42 in your application.