Data Engineer

<strong>Everyone at LRW is responsible for pioneering new ways, opening new doors, and contributing to a culture of the most brilliant and beautiful minds in the business.</strong>

Los Angeles, CA

LRW, a Material Company

<p>Technologists and creators. Strategists and designers. Researchers and storytellers. LRW is now a Material Company, powered by sophisticated analytics, deep human understanding, and design thinking. This page contains job postings for LRW and several other Material companies. Want to join us?</p>



Overview: LRW is swimming in data from many sources. The marketing and data science team requires an experienced, all-purpose data engineer to help build data processing infrastructure and ETL pipelines that enable efficient access to that data across multiple platforms.
Responsibilities: In this role, you will: <ul> <li>Work with a variety of stakeholders to design, implement, and maintain data lake and data warehouse architecture that consolidates data from various APIs, SQL, and NoSQL sources and makes it accessible for different use cases</li> <li>Design and implement processes to shape and deliver data in accordance with business needs and various use cases</li> <li>Employ the required languages and tools to stitch together a coherent system oriented toward improving data reliability, efficiency, and quality</li> </ul>
Requirements: ABOUT YOU <ul> <li>You are curious, self-motivated, responsive, and articulate</li> <li>You are interested in the core business of the company and seek to identify the business and usage implications of various solutions</li> <li>You are a tenacious problem-solver who seeks to identify core bottlenecks from both a technological and a process-oriented standpoint</li> <li>You have a deep understanding of data processing concepts and data modeling principles (e.g., the difference between OLTP, data warehouse, and data lake)</li> <li>You have advanced knowledge of SQL, Python, and Bash</li> <li>You have experience working with: <ul> <li>Cloud technologies: AWS (RDS, EC2, S3, Athena, Lambda, EMR, ECS), GCP (BigQuery, DataProc), Snowflake</li> <li>Data platforms: PostgreSQL, MongoDB, and data in different file formats (Parquet, JSON, CSV, Excel)</li> <li>Execution platform: Apache Airflow</li> <li>Distributed processing: HDFS, Spark, Hive</li> </ul> </li> <li>Familiarity with deploying solutions through Docker or REST APIs</li> <li>Experience with R and JavaScript is a plus</li> </ul>