Company Description
Ubisoft's 19,000 team members, working across more than 30 countries around the world, are bound by a common mission to enrich players' lives with original and memorable gaming experiences. Their commitment and talent have brought to life many acclaimed franchises such as Assassin's Creed, Far Cry, Watch Dogs, Just Dance, and Rainbow Six, with many more to come. Ubisoft is an equal opportunity employer that believes diverse backgrounds and perspectives are key to creating worlds where both players and teams can thrive and express themselves. If you are excited about solving game-changing challenges, working with cutting-edge technologies, and pushing the boundaries of entertainment, we invite you to join our journey and help us create the unknown.
Job Description
As a Data Developer within Ubisoft Montreal's UDO (Ubisoft Data Office) group, you will transform large volumes of data into organized, relevant, and actionable information.
More specifically, you will be involved in developing streaming pipelines that aggregate player telemetry into several Analytical Models critical to the decision-making of Production teams. This pipeline already plays an important role within the analytics ecosystem and is intended to be generalized to a broader functional scope within the Group. In particular, you will have the opportunity to participate in the redesign of its core, migrating from the Kafka Streams framework to Apache Flink.
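To give a concrete flavour of the work, here is a minimal sketch of the kind of Flink job this role involves: reading player telemetry from Kafka and aggregating per-player event counts over short windows. The topic name, record format, and class names are illustrative assumptions, not a description of Ubisoft's actual pipeline.

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;

// Hypothetical illustration only: aggregates raw telemetry ("playerId,eventType" records)
// into per-player event counts over one-minute windows.
public class TelemetryAggregationJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Read raw telemetry events from a hypothetical Kafka topic.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setTopics("player-telemetry")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "telemetry-source")
           // Extract the player id and count one event per record.
           .map(line -> Tuple2.of(line.split(",")[0], 1L))
           .returns(Types.TUPLE(Types.STRING, Types.LONG))
           .keyBy(t -> t.f0)
           .window(TumblingProcessingTimeWindows.of(Time.minutes(1)))
           .sum(1)
           .print(); // In a real pipeline this would feed an Analytical Model instead.

        env.execute("player-telemetry-aggregation");
    }
}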
What you'll do
• Work closely with analysts in our studios to identify the most important metrics and provide them with useful analytical content.
• Gather, process, and structure disparate data, and create metrics accessible to end users via APIs.
• Integrate datasets into dashboards and analytics platforms and products.
• Collaborate on decisions regarding the use of new tools and processes.
• Identify opportunities to improve data quality.
• Stay on top of technological advances to help develop our best practices.
Qualifications
• Experience in data engineering, or related experience with big data technologies such as Kafka, Hadoop, Hive, HBase, and Spark;
• Knowledge of programming languages with functional paradigms, such as Java or Scala;
• Strong critical thinking, communication and interpersonal skills;
• A collaborative and innovative spirit;
• Motivation to make complex information accessible and understandable to everyone;
• Willingness to step out of your comfort zone and continuously learn and try new things;
• Knowledge of infrastructure, Kubernetes, big data cloud solutions (e.g., AWS), Flink, or Kafka Streams is an asset.
Jobcode: Reference SBJ-ree651-3-237-15-145-42 in your application.