My client is searching for a Sr. Data Platform Engineer to join their team in Boise, Idaho. This organization builds advanced AI-driven platforms that help large-scale industrial and manufacturing environments optimize complex processes and deliver measurable operational improvements. Their technology supports highly data-intensive workflows across hybrid and cloud-native ecosystems, enabling teams to operate at enterprise scale with reliability and precision.
Key Responsibilities
- Design and implement secure, scalable data connectors for cloud and on-prem data sources (object storage, cloud warehouses, relational databases, APIs, and streaming systems).
- Build and maintain data governance capabilities, including access controls, audit logging, lineage tracking, metadata services, and data quality enforcement.
- Architect and develop core backend data platform components such as S3-compatible object storage layers, table/metadata services, and efficient serialization and transformation systems.
- Engineer lakehouse-compatible services supporting ACID transactions, schema evolution, time travel, and optimization to enable interoperability with modern analytics platforms.
- Develop platform abstractions and APIs that support common compute frameworks (e.g., Spark, Pandas) and data/ML workloads.
- Optimize data pipelines for performance, reliability, and throughput across multi-cloud and hybrid environments while meeting defined SLOs.
- Collaborate cross-functionally with platform engineers, data engineers, data scientists, and product teams to translate requirements into scalable solutions.
- Enhance platform observability through metrics, logging, and tracing, and contribute to high-availability and disaster recovery strategies.
Details
- Full-Time, Permanent Position
- $170k - $190k + Benefits
- 5 Days On-Site | Boise, Idaho
Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field from a top-20 university.
- 6+ years of experience in data platform engineering, data infrastructure, or distributed systems.
- Proven experience building and operating large-scale data pipelines, ETL frameworks, or analytics platforms.
- Strong proficiency in one or more backend languages such as Python, Go, Java, or Scala.
- Deep understanding of data modeling, lakehouse architectures, and modern table formats.
- Experience designing data connectors, APIs, and integrations with external systems.
- Solid knowledge of distributed systems, databases, and major cloud platforms (AWS, Azure, GCP).
Oscar Associates Limited (US) is acting as an Employment Agency in relation to this vacancy.