Role Summary
Palantir Data Engineer (International Commercial) responsible for maintaining and developing data platforms to deliver reliable data pipelines, data models, and data integrations. The role translates business requirements into technical specifications and assets for reporting and analytics, while maintaining code quality in Python/PySpark and ensuring platform standards within a matrixed organization.
Responsibilities
- Ensure the Foundry data platform functions without outages or downtime.
- Validate and enforce data quality standards and data modelling conventions so that pipelines are efficient, scalable, and reliable.
- Build new data models and maintain data pipelines according to business requirements.
- Translate business requirements into technical specifications and data assets for reporting and analytical needs.
- Maintain code quality in Python/PySpark and ensure adherence to Foundry platform standards.
- Ensure data integration connectivity between the platform and its upstream and downstream sources.
- Monitor platform compute resources and bandwidth usage, and provision additional resources as needed.
Qualifications
- Required: 6 years of experience as a Data Engineer, Data Scientist, and/or ETL/Big Data Developer.
- Required: 4 years of experience with each of the following: (a) PySpark, Python, and SQL; (b) Data Modeling; (c) Git, Jira, and Confluence; (d) AWS.
- Required: Coding in Python, SQL, and Spark, and building on Amazon Web Services.
- Required: Designing and implementing DevOps principles and Git version control.
- Required: Data modeling and schema design for optimized querying, aggregation, analysis, and visualization.
- Required: Scripting for automation and system management using Bash or PowerShell.
- Required: Managing data infrastructure using AWS or Azure.
- Required: Working in a matrixed organization, providing technical solutions to a nontechnical audience by reporting to peers, business leaders, and executives orally and in writing.
- Required: Applying optimization techniques, user experience design, and automated testing using JMeter and Tosca.
Education
- Required: Bachelor's degree in Computer Science, Data Science, Computer Engineering, Information Technology, or a related field.