Data Engineer

Evolus
Full-time
Hybrid (Newport Beach, CA)
United States
$114,000–$142,000 USD per year
IT

Role Summary

Evolus is hiring a Data Engineer to join the Information Technology team and integrate data from internal and external sources into a common warehouse data model. The role builds and maintains ELT data pipelines, recommends and implements data models, and operates in a DataOps environment, collaborating with business and analytics teams to drive value. This is a hybrid position based in Newport Beach, CA, with both on-site and remote days.

Responsibilities

  • Collaborate with team members to collect business requirements, define successful analytics outcomes, and design data models
  • Design and develop the Snowflake data warehouse using dbt or other ELT tools to extend the Enterprise Dimensional Model
  • Contribute to planning and prioritization discussions
  • Architect solutions to complex data engineering problems, delivering insights that meet or exceed business needs
  • Own and deliver end-to-end solutions, from data ingestion to data products for end-user consumption, including production support
  • Ensure sustained performance of data engineering pipelines and remediate as required
  • Own source code management, documentation, and release planning; participate in DataOps, DevOps, and CI/CD for reliable, scalable functionality
  • Identify and manage risks to the data engineering platform
  • Office location: Newport Beach. Hybrid schedule: Monday and Friday remote; Tuesday through Thursday onsite
  • Other duties as assigned

Qualifications

  • Bachelor’s degree required
  • 6+ years of experience in enterprise data solutions
  • 4+ years in cloud-based data warehousing with strong SQL and Snowflake experience
  • Experience building data pipelines using Python and data orchestration tools like Apache Airflow
  • Experience with data extraction, transformation, and orchestration tools such as Fivetran, dbt, Datafold, Prefect, Kafka, Stitch, and Matillion
  • Deep understanding of data analysis, data modeling for visualization, and reporting
  • Experience with DataOps, version control in Git or Azure DevOps, and CI/CD pipelines
  • Experience in one or more of the following domains: healthcare, marketing, finance, sales, product, customer success, or engineering
  • Experience performing root cause analysis for production issues and identifying opportunities for improvement
  • Strong coding practices with clean, documented code and ability to perform code reviews
  • Attention to detail in planning and execution while keeping the overall integration in view
  • Excellent communication and interpersonal skills

Skills

  • Snowflake
  • dbt
  • Python
  • Apache Airflow
  • Data modeling
  • DataOps / DevOps / CI/CD
  • Data visualization and reporting
  • Strong SQL

Education

  • Bachelor’s degree required