Los Angeles, CA US
Senior Data Engineer - Content Data Engineering
What will you do:
• Fully own critical portions of Netflix's Content data model. Collaborate with partners to understand their needs, model tables using standard data warehouse methodologies, and develop data pipelines that ensure the timely delivery of high-quality data.
• Continually acquire new data sources to develop an increasingly rich dataset that characterizes content.
• Creatively explore how to use data to continually contribute to Netflix. Translate data questions into flexible methodologies that scale to answer broad problems across the organization.
• Be a bridge between data engineering and the business, enabling insight that can empower better decision-making.
• Be comfortable outside of your comfort zone - explore new tech, make your own tool, or find a new way to address an old problem.
Who are you:
• Lazy in a productive way (find tedious work boring and would rather automate it).
• Charismatic, determined, curious, and industrious, not just hardworking.
• Thrive in a fast-paced environment, and see yourself as a partner to the business, sharing the goal of moving it forward.
• Hold strong beliefs weakly: you can deliberate on and hear all sides of a discussion, and adapt to new perspectives that emerge from it.
• Sharp communicator who can explain complex data problems in clear and concise language.
• Build code that is understandable, simple, and clean, and take pride in its beauty.
• Love freedom and hate being micromanaged. Given context, you're capable of self-direction.
• Passionate about data quality and delivering effective data to impact the business.
• Motivated to explore and learn new technologies, and able to do so without formal training.
What do you know:
• Data warehousing, data modeling, and data transformation.
• How to write complex SQL in your sleep.
• Significant experience with ETL technologies (Informatica, SSIS, etc.) is very valuable, but expect to work in a distributed environment.
• MPP/Cloud data warehouse solutions (Snowflake, Redshift, BigQuery, Vertica, Teradata, Greenplum, etc).
• Experience with sourcing and modeling data from application APIs.
• Python for scripting and automation.
• Education: Surprise us.