
Data Analytics Manager

Amgen
Remote friendly (Tampa, FL)
United States
IT

Role Summary

The Data Analytics Manager designs, develops, and optimizes advanced data pipelines, data integration frameworks, and metadata-driven architectures in Prophecy and Databricks to support the business intelligence reporting portfolios of the Strategic Insights & Analytics (SIA) and Financial Insights & Technology (FIT) teams, enabling seamless data access for Power BI and Tableau dashboards used by hundreds of internal customers. The role plays a key part in financial data integration and analysis for the global organization.

Responsibilities

  • Develop a strong understanding of Amgen’s financial data and systems to support reporting, integrations, and business data requests.
  • Design, develop, and maintain ETL/ELT pipelines in Databricks and Prophecy using PySpark, Python, and SQL, including ingestion and transformation of structured and unstructured data from sources such as Databricks, Oracle, AWS, SQL Server, APIs, logs, and third-party platforms.
  • Build scalable data pipelines and maintain underlying data infrastructure to migrate and deploy data across systems, with an understanding of Finance, HR, and related domains.
  • Build and maintain GitLab CI/CD pipelines and repositories to support code reviews, automate deployments, and uphold coding standards.
  • Use JIRA, Confluence, and Agile/DevOps tools to manage sprints, backlogs, and documentation.
  • Design and implement solutions that enable unified data access, governance, and interoperability across hybrid cloud environments.
  • Ensure data integrity and accuracy through rigorous validation, quality checks, and monitoring.
  • Evaluate and implement new tools, frameworks, and best practices to improve data processing efficiency and modernize engineering workflows.
  • Automate manual tasks, develop reusable data engineering components, collaborate with cross-functional teams to translate business needs into scalable technical solutions, and maintain documentation for processes.
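The validation and reusable-component responsibilities above can be sketched as a small, self-contained quality-check step. This is an illustrative sketch only; the check names, fields, and thresholds are assumptions for the example, not part of the role description:

```python
# Hypothetical sketch of a reusable row-level data-quality component.
# Field names ("id", "amount") and checks are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Check:
    name: str
    predicate: Callable[[dict], bool]

def validate(rows: list[dict], checks: list[Check]) -> tuple[list[dict], list[dict]]:
    """Split rows into (valid, rejected); rejected rows record which checks failed."""
    valid, rejected = [], []
    for row in rows:
        failed = [c.name for c in checks if not c.predicate(row)]
        if failed:
            rejected.append({**row, "_failed_checks": failed})
        else:
            valid.append(row)
    return valid, rejected

checks = [
    Check("non_null_id", lambda r: r.get("id") is not None),
    Check("amount_positive", lambda r: (r.get("amount") or 0) > 0),
]

rows = [
    {"id": 1, "amount": 120.0},
    {"id": None, "amount": 50.0},
    {"id": 3, "amount": -5.0},
]

valid, rejected = validate(rows, checks)
```

In a Databricks/PySpark pipeline the same pattern would typically be expressed as DataFrame filters or `expect`-style rules, with rejected rows routed to a quarantine table for monitoring.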

Qualifications

  • Required: Doctorate degree, OR
  • Required: Master’s degree and 2 years of relevant experience in Analytics, Computer Science, IT, or a related field, OR
  • Required: Bachelor’s degree and 4 years of relevant experience in Analytics, Computer Science, IT, or a related field, OR
  • Required: Associate’s degree and 8 years of relevant experience in Analytics, Computer Science, IT, or a related field, OR
  • Required: High school diploma / GED and 10 years of relevant experience in Analytics, Computer Science, IT, or a related field
  • Preferred: Databricks certification
  • Preferred: Strong verbal and written communication skills
  • Preferred: Ability to work effectively with global, virtual teams
  • Preferred: High degree of initiative and self-motivation
  • Preferred: Ability to manage multiple priorities successfully
  • Preferred: Team-oriented, with a focus on achieving team goals
  • Preferred: Ability to learn quickly, be organized, and detail-oriented
  • Preferred: Hands-on experience building ETL pipelines with data engineering technologies such as Databricks, GitLab, SQL, PySpark, SparkSQL, AWS, and Python
  • Preferred: Proficiency in workflow orchestration and performance tuning for big data processing
  • Preferred: Strong understanding of Prophecy or Alteryx
  • Preferred: Experience with data analysis and data visualization solutions such as Tableau, Power BI
  • Preferred: Proficiency with MS Office, especially Excel, Word, and PowerPoint
  • Preferred: Ability to quickly learn, adapt, and apply new technologies
  • Preferred: Strong problem-solving and analytical skills
  • Preferred: Excellent communication and teamwork skills
  • Preferred: Experience with Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices

Skills

  • Databricks, Prophecy, Alteryx
  • PySpark, SparkSQL, SQL, Python
  • ETL/ELT design and implementation
  • Data governance, data quality, and metadata management
  • Tableau, Power BI
  • GitLab CI/CD, DevOps practices
  • Agile methodologies (SAFe, Scrum)
  • Hybrid cloud architectures and data interoperability
  • Strong communication and cross-functional collaboration

Education

  • Doctorate degree
  • Master’s degree
  • Bachelor’s degree
  • Associate’s degree
  • High school diploma / GED