
Full Time Job

Data Engineer

Entercom

Los Angeles, CA 03-02-2021
 
  • Paid
  • Full Time
Job Description
Location: This opportunity is available remotely; however, the majority of the team sits in Philadelphia and Denver. The Data team is an elite team of data, marketing strategy, and technology specialists with expertise in the ever-changing universe of precision marketing, ad technologies, and data architecture and management. The Data Engineer helps to define, execute, and support the delivery of the technical data architecture for Entercom's data team. The work will improve the quality, reliability, accuracy, and consistency of our data through project-specific data pipelines and validation tools and through the team's overall data model. The candidate will have a hands-on role with product specialists and will be responsible for working with them to understand data requirements and then implement them. As part of the Data team, you will help build data pipelines that empower the organization to make more informed decisions. The successful candidate will have well-rounded experience in developing and delivering solutions that create value for the organization and its clients.

Responsibilities
• Collaborate with product teams, data analysts and data scientists to help design and build data-forward solutions
• Design, build and deploy streaming and batch data pipelines capable of processing and storing petabytes of data quickly and reliably
• Integrate with a variety of data and metrics providers spanning advertising, mobile/web analytics, and consumer devices
• Build and maintain dimensional data warehouses in support of business intelligence and optimization product tools
• Develop data catalogs and data validations to ensure clarity and correctness of key business metrics
• Drive and maintain a culture of quality, innovation and experimentation
• Coach data engineers on best practices and technical concepts for building large-scale data platforms

Qualifications:
Technical Skills
• Proficient in Python and advanced SQL (window functions, summary functions, joins, etc.), along with data engineering libraries such as pandas and PySpark
• AWS – Redshift, Glue, Lambda, EC2, VPC, Step Functions
• Snowflake experience is good to have – basic administration and concepts such as user management, data ingestion tools, and capabilities
• Terraform (at least basic know-how), Docker (at least basic know-how), and Jenkins (good to have)
• Version control tools such as Git and GitHub

Functional Experience
• Experience developing data pipelines that move data in a heterogeneous environment
• Experience leveraging AWS and Snowflake tools to process large datasets
• Experience setting up reusable deployment frameworks and driving process improvements, with a security-conscious approach
• Big data and data warehouse concepts – schema design, columnar data stores, Spark performance tuning
• API concepts and development; parsing and processing complex data structures and file formats such as JSON and XML

Behavioral:
• Team player with a humble, helpful mentality
• Strives to share knowledge and win as a team
• Constant learner

Required Education
• Bachelor's degree in Computer Science or a related field or equivalent work experience


Jobcode: Reference SBJ-rbq29o-3-142-197-198-42 in your application.