The Global Data Analytics team enables Discovery Communications to turn data into action. Using big data platforms, data warehousing and business intelligence technology, audience data, advanced analytics, data science, visualization, and self-service analytics, this team supports company efforts to increase revenue, drive ratings, and enhance consumer engagement.
We are looking for a well-rounded software developer to join our DataOps team. The right candidate will have a passion for developing tools and automation to deploy, maintain, and troubleshoot distributed services and applications in cloud and on-premises infrastructure.
Working within the Global Data Engineering and Advanced Analytics team, this role provides technical direction and implementation to build scalable infrastructure that enables digital transformation. Using modern cloud platforms, DevOps tools, and open-source technologies, you will architect and implement a new end-to-end CI/CD pipeline for data and analytics applications.
You will interface with functional areas throughout the enterprise, identifying opportunities to re-engineer technology processes, improve efficiency, and reduce costs. A strong development background is necessary, not only to build and improve internal tooling, but also to contribute improvements to selected open-source tools.
• Develop tools and frameworks for distributed systems, services and applications
• Be an expert in cloud technologies (AWS required) to drive automation and best practices
• Contribute to all phases of the development life cycle; collaborate with system architects on application infrastructure
• Develop reliable and scalable systems used for monitoring/alerting and access management of production systems
• Leverage modern tools and techniques to develop clean, efficient, and reusable code
• Identify and address design, development, and delivery performance bottlenecks in preproduction/development environments, continually improving applications
• Write documentation for both internal and external consumers, covering design artifacts, code, and fixes
• Collaborate with DevOps teams to automate software deployment, including deployment of immutable application infrastructure
• Administer AWS IAM, including roles, policies, and Active Directory integration
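To give a concrete flavor of the IAM role and policy administration above, here is a minimal, illustrative Python sketch that builds the JSON policy documents behind an EMR service role. The bucket name and function names are hypothetical examples; in practice these documents would be applied through Terraform or boto3 rather than printed.

```python
import json

def emr_trust_policy() -> str:
    """Build a trust policy allowing the Amazon EMR service to assume a role.

    elasticmapreduce.amazonaws.com is the standard EMR service principal;
    everything else here is an illustrative example.
    """
    document = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"Service": "elasticmapreduce.amazonaws.com"},
                "Action": "sts:AssumeRole",
            }
        ],
    }
    return json.dumps(document, indent=2)

def s3_read_policy(bucket: str) -> str:
    """Build a least-privilege inline policy granting read access to one bucket."""
    document = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket}",       # the bucket itself (for ListBucket)
                    f"arn:aws:s3:::{bucket}/*",     # objects within it (for GetObject)
                ],
            }
        ],
    }
    return json.dumps(document, indent=2)

if __name__ == "__main__":
    print(emr_trust_policy())
    print(s3_read_policy("example-analytics-bucket"))
```

Keeping policy documents as generated artifacts like this, rather than hand-edited JSON, is what makes them reviewable and repeatable across environments.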
• Bachelor's degree in Computer Science, Information Technology, Information Systems, or a similar field
• Excellent knowledge of Linux internals, virtualization, DevOps tools, and cloud technologies
• Hands-on experience building systems and tools using cloud-native technologies in AWS
• Strong foundation in Infrastructure as Code and configuration management using tools like Terraform, SaltStack, Ansible, Chef or Puppet
• Minimum of 5 years of experience in enterprise solution development
• Strong experience developing in Bash and Python
• Advanced knowledge of AWS big data and analytics platform services, such as EMR, Redshift, Data Pipeline, and Kinesis, including system design, build, and deployment
• Advanced knowledge of HashiCorp Terraform, including best practices for enterprise, working with custom providers, module registry/module sharing, advanced HCL and interpolation functions, and running Terraform in a shared environment
• Experience in automated deployment, installation and configuration of applications on Linux systems, including the development and improvement of the tools for doing so
• Advanced knowledge of Linux system administration on systems such as Red Hat/CentOS and Debian, including networking, init/systemd, process monitoring/tracing, and system resource monitoring/troubleshooting
• Extensive experience working with version control and repository management systems such as Git
• Experience working with containerization approaches such as Docker
• Experience working with Ansible, or other configuration management frameworks, for automation and configuration
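The configuration management experience above centers on one idea: idempotency, meaning applying the same desired state twice changes nothing the second time. As an illustrative sketch (not any particular tool's implementation), here is the kind of primitive that modules like Ansible's lineinfile are built on, in Python:

```python
from pathlib import Path

def ensure_line(path: Path, line: str) -> bool:
    """Ensure `line` is present in the file at `path` (idempotent).

    Returns True if the file was changed, False if it was already in the
    desired state -- the same changed/unchanged reporting contract that
    configuration management tools use.
    """
    existing = path.read_text().splitlines() if path.exists() else []
    if line in existing:
        return False  # already converged; no write, no change
    existing.append(line)
    path.write_text("\n".join(existing) + "\n")
    return True

if __name__ == "__main__":
    import os
    import tempfile

    fd, name = tempfile.mkstemp()
    os.close(fd)
    cfg = Path(name)
    print(ensure_line(cfg, "PermitRootLogin no"))  # True: first run changes the file
    print(ensure_line(cfg, "PermitRootLogin no"))  # False: second run is a no-op
    os.unlink(name)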
Jobcode: Reference SBJ-d89nvm-52-23-219-12-42 in your application.