Data Engineer
Description
Evolus (NASDAQ: EOLS) is a performance beauty company with a customer-centric approach focused on delivering breakthrough products. We are seeking an experienced and driven
Data Engineer to join our Information Technology team, reporting to the Executive Director, Data Engineering. We are looking for an enthusiastic person with strong data engineering and analytical skills to join our team and support data and analytics initiatives across Evolus's global business functions. The Data Engineer's role is to integrate data from a variety of internal and external sources into a common warehouse data model. This is a technical role that involves building and maintaining ELT data pipelines, recommending and implementing appropriate data models, and working comfortably in a DataOps environment. We are looking for someone with a consultative mindset who can interact with the business and analytics teams and help drive value for the business. The data ecosystem is an evolving space, and we expect and encourage innovation and thought leadership.
If you join our team, you will be working on some of the most exciting opportunities and challenges we face, with a team that values growth, recognition, and camaraderie. If you are looking for an opportunity to exhibit your knowledge and technical abilities in a unique environment, then look no further! In this role, you will be challenged to drive the success of Evolus in an effort to build a brand like no other.
Responsibilities
- Collaborate with team members to collect business requirements, define successful analytics outcomes, and design data models
- Design and develop the Snowflake data warehouse using dbt or another ELT tool to extend the Enterprise Dimensional Model
- Contribute to planning and prioritization discussions
- Break down and architect the most complex data engineering problems to deliver insights that meet, and ideally exceed, business needs
- Own and deliver solutions - from ingestion of sources to data products for end user consumption, from conceptual iteration to production support
- Deliver and ensure sustained performance of all data engineering pipelines and remediate where required
- Own source code management, documentation (technical and end user), and release planning for data engineering products; lean in to DataOps, DevOps, and CI/CD to deliver reliable, tested, and scalable functionality through automation
- Identify and proactively manage risks to the data engineering platform
- Office location – Newport Beach. Hybrid schedule: Monday and Friday remote; Tuesday - Thursday onsite
- Other duties as assigned
Requirements
- Bachelor’s degree required
- 6+ years of experience in enterprise data solutions
- 4+ years in cloud-based data warehousing with strong SQL experience
- Experience building data pipelines using Python and data orchestration tools like Apache Airflow
- Experience with data extraction/transformation/orchestration tools such as Fivetran, dbt, Dataform, Airflow, Prefect, Kafka, Stitch, and Matillion
- Deep understanding of data analysis, data modeling for visualization, and reporting
- Experience with DataOps, version control (Git or Azure DevOps), and CI/CD pipelines
- Demonstrated experience with one or more of the following business subject areas: healthcare, marketing, finance, sales, product, customer success or engineering
- Experience performing root cause analysis for production issues and identifying opportunities for improvement
- Passionate about writing clean, documented, and well-formed code and performing code reviews
- Keen attention to detail in planning, organization, and execution of tasks, while still seeing the big picture and understanding how all the pieces fit together and affect one another
- Excellent communication and interpersonal skills
Preferred Qualifications
- Experience with Snowflake
#LI-PB1 #LI-HYBRID