Senior Data Engineer


Job Description

City: Glendale, CA

Onsite/Hybrid/Remote: Onsite, 3-4 days a week

Duration: 18 months

Rate Range: $107/hr on W2 depending on experience (no C2C or 1099 or sub-contract)

Work Authorization: GC, USC, all valid EADs except H-1B

Must haves:

1) Airflow and Spark

2) Snowflake or Databricks

3) Data Modeling

Additional Notes: A Netflix background or other content-related experience is ideal.

Description:

As a Senior Data Engineer at The Studios, you will play a pivotal role in the transformation of data into actionable insights. Collaborate with our dynamic team of technologists to develop cutting-edge data solutions that drive innovation and fuel business growth. Your responsibilities will include managing complex data structures and delivering scalable and efficient data solutions. Your expertise in data engineering will be crucial in optimizing our data-driven decision-making processes. If you’re passionate about leveraging data to make a tangible impact, we welcome you to join us in shaping the future of our organization.

Key Responsibilities:

● Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines

● Build tools and services to support data discovery, lineage, governance, and privacy

● Collaborate with other software/data engineers and cross-functional teams

● Build and maintain APIs to expose data to downstream applications

● Develop real-time streaming data pipelines

● Tech stack includes Airflow, Spark, Databricks, Delta Lake, Snowflake, Graph Database, Kafka

● Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform

● Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more

● Ensure high operational efficiency and quality of Core Data platform datasets so that our solutions meet SLAs and project reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)

● Be an active participant in and advocate of agile/scrum ceremonies to collaborate and improve processes for our team

● Maintain detailed documentation of your work and changes to support data quality and data governance requirements

● Address urgent production issues in a timely manner during non-standard working hours

● Experience working with studio production data is a plus

Basic Qualifications:

● 7+ years of data engineering experience developing large data pipelines

● Proficiency in at least one major programming language (e.g. Python, Java, Kotlin)

● Strong SQL skills and ability to create queries to analyze and extract complex datasets

● Hands-on production environment experience with distributed processing systems such as Spark

● Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines

● Deep understanding of AWS or other cloud providers, as well as infrastructure as code

● Familiarity with Data Modeling techniques and Data Warehousing standard methodologies and practices

● Strong algorithmic problem-solving expertise

● Excellent written and verbal communication

● Advanced understanding of OLTP vs. OLAP environments

● Willingness and ability to learn and pick up new skill sets

● Self-starting problem solver with an eye for detail and excellent analytical and communication skills

● Strong background in at least one of the following: distributed data processing, software engineering of data services, or data modeling

● Graph Database experience a plus

● Real-time event streaming experience a plus

● Familiar with Scrum and Agile methodologies

● Bachelor’s Degree in Computer Science or Information Systems, or equivalent industry experience