Data Engineer

Evolus
Full-time
Remote-friendly (Newport Beach, CA)
United States
$114,000 - $142,000 USD yearly

Role Summary

Data Engineer at Evolus. Join the IT team to support data and analytics initiatives across global functions. You will build and maintain ELT data pipelines, integrate data into a common warehouse data model, and work with business and analytics teams to drive value.

Responsibilities

  • Collaborate with team members to collect business requirements, define successful analytics outcomes, and design data models
  • Design and develop the Snowflake data warehouse using dbt or other ELT tools to extend the Enterprise Dimensional Model
  • Contribute to planning and prioritization discussions
  • Break down and architect complex data engineering problems to deliver insights that meet business needs
  • Own and deliver solutions end to end, from ingestion of sources to data products for end-user consumption, and from conceptual iteration through production support
  • Deliver and ensure sustained performance of all data engineering pipelines and remediate where required
  • Own source code management, documentation (technical and end user), and release planning for data engineering products; lean into DataOps, DevOps, and CI/CD to deliver reliable, tested, and scalable functionality through automation
  • Identify and proactively manage risks to the data engineering platform
  • Office location: Newport Beach, CA. Hybrid schedule: remote Monday and Friday; onsite Tuesday through Thursday
  • Other duties as assigned

Qualifications

  • Bachelor’s degree required
  • 6+ years of experience in enterprise data solutions
  • 4+ years in cloud-based data warehousing with strong SQL and Snowflake experience
  • Experience building data pipelines using Python and data orchestration tools like Apache Airflow
  • Experience with data extraction, transformation, and orchestration tools such as Fivetran, dbt, Datafold, Prefect, Kafka, Stitch, and Matillion
  • Deep understanding of data analysis, data modeling for visualization, and reporting
  • Experience with DataOps practices, Git or Azure DevOps, and CI/CD pipelines
  • Demonstrated experience in one or more business domains (healthcare, marketing, finance, sales, product, customer success, or engineering)
  • Experience performing root cause analysis for production issues and identifying opportunities for improvement
  • Proficiency in writing clean, documented, well-formed code and performing code reviews
  • Attention to detail in planning, organization, and execution while understanding the bigger picture
  • Excellent communication and interpersonal skills

Skills

  • SQL
  • Snowflake
  • Python
  • Apache Airflow
  • dbt
  • DataOps
  • CI/CD
  • Git/Azure DevOps