Senior Software Engineer, Tools and Automation - Video Platform
New York, NY US
Comprising Disney's international media businesses and the Company's various streaming services, the Direct-to-Consumer and International (DTCI) segment aligns technology, content and distribution platforms to expand the Company's global footprint and deliver world-class, personalized entertainment experiences to consumers around the world.
The Walt Disney Company's Direct-to-Consumer and International segment (DTCI) is a global, multiplatform media, technology and distribution organization for high-quality content created by Disney's Studio Entertainment and Media Networks groups.
DTCI includes Disney's international media operations and the Company's direct-to-consumer businesses globally, including the upcoming Disney-branded direct-to-consumer streaming service, the Company's ownership stake in Hulu, and the ESPN+ sports streaming service, programmed in partnership with ESPN. BAMTECH Media, developer of the ESPN+ and Disney-branded streaming platforms, oversees all consumer-facing digital technology and products across the Company as part of the Direct-to-Consumer and International segment.
Data is critical to Disney Streaming Services' success. To gain and maintain our competitive advantage, we use our data assets to analyze user behavior, engagement and acquisition. To facilitate data analytics, we are building a scalable data analytics platform.
We are seeking a Sr. Manager of Data Architecture to build and manage a diverse team of Data Modelers and Data Architects who design and implement scalable data infrastructure supporting our broader Data Science and Analytics organization. This team partners with product, analytics and engineering teams to architect, build and maintain our growing data warehousing, analytical insights and BI reporting environments. This leader will also build and grow this core function of our data team in support of our Disney Plus product vision. Collaborating across disciplines, you will identify new data sources and design and implement table structures, data products, ETL strategy, automation frameworks and scalable data pipelines.
• Build strong relationships with stakeholders including data scientists, product managers, and software engineers
• Oversee a roadmap to meet cross-functional data needs, building shared goals and project plans
• Communicate the vision and path to successful execution for both peers and executive stakeholders
• Manage and mitigate project risks, priorities and trade-offs while negotiating buy-in to deliver results
• Hire and retain top talent, nurturing their growth by building short- and long-term career paths
• Lead and manage daily scrum activities with your team, diving deep to help unblock their daily tasks
• Perform design reviews and put mechanisms in place for continually raising the bar for technical skills in the organization
• Guide the design, building, and launching of new data models and data pipelines in production
• Partner with stakeholders to establish and support delivery SLAs for key data sets
• Foster a culture of high standards for data quality through expansion of frameworks, toolkits, etc.
• Own the design, development, and maintenance of ETL routines using tools such as Airflow and Jenkins
• Own data models and table designs, and publish sufficient metadata documentation to facilitate rapid data adoption
• Establish processes to instill operational excellence principles for project management, SDLC, data architectural design best practices, scalability, reliability, etc.
• 8+ years of relevant professional experience
• 3+ years in direct people management capacity
• Experience scaling and managing a team of 5+ members
• 5+ years' work experience building data warehousing environments, with a strong background in data modeling principles, including dimensional modeling and data normalization
• 5+ years' experience using analytic SQL, working with traditional relational databases and/or distributed systems such as Spark/Hadoop/Hive, BigQuery, Snowflake, or Redshift
• 2+ years of experience with programming languages (e.g., Python, R, bash)
• 2+ years of experience with workflow management tools (Airflow, Oozie, Azkaban, UC4)
• Strong project and program management experience in agile, fast-paced environments
• Experience with the Snowflake cloud data platform (or a similar ecosystem: MapReduce, YARN, HDFS, Hive, Spark, Presto, Pig, HBase)
• Familiarity with data exploration / data visualization tools like Tableau, Looker, Chartio, etc.
• Ability to think strategically, analyze, and interpret market and consumer information.
• Strong communication skills – written and verbal presentations.
• Comfortable working in a fast-paced and highly collaborative environment.
• Excellent conceptual and analytical reasoning competencies.
• Bachelor's degree in a quantitative or analytical field (e.g., economics, mathematics, or computer science)
• Advanced degree in a quantitative or analytical field preferred