Sr Data Platform Engineer

San Francisco, California

US$170,000 - US$220,000 per year

Full-time

Ref: RG1_1768414010

Oscar is working with a leading organization providing AI solutions for semiconductor manufacturing process optimization that is looking for an experienced Sr Data Platform Engineer to join their team.

As the Sr Data Platform Engineer, you will design and build the core data infrastructure that underpins our AI- and analytics-driven semiconductor manufacturing platform. Your mission is to evolve our data platform from a functional foundation into a robust, enterprise-grade system that is secure, scalable, and highly performant.

Key Responsibilities:

  • Design and build secure, scalable data connectors for cloud and on-prem sources, including object stores, cloud data warehouses, relational databases, APIs, and streaming/event systems.

  • Architect and implement data governance capabilities such as access controls, audit logging, lineage tracking, metadata/catalog services, data quality checks, and policy enforcement.

  • Develop core backend data platform components, including S3-compatible object storage layers (MinIO-like), table and metadata services, and efficient data serialization and transformation pipelines.

  • Build Delta Lake-compatible services supporting ACID transactions, schema evolution, time travel, and compaction, enabling interoperability with Databricks, Snowflake, BigQuery, and other lakehouse platforms.

  • Create platform abstractions and APIs that enable Spark, Pandas, and distributed compute frameworks to run reliably and efficiently on the platform.

  • Optimize performance, throughput, and reliability of data pipelines across multi-cloud and hybrid environments while meeting defined SLOs.

  • Partner with data engineers, data scientists, platform engineers, and product teams to translate platform requirements into scalable, production-ready data solutions.

  • Improve observability across data services through metrics, logging, and tracing, and implement high-availability and disaster recovery strategies for critical data components.

Qualifications:

  • 6+ years of experience building data platforms, data infrastructure, or distributed systems at scale.

  • Hands-on experience developing and operating large-scale data pipelines, ETL frameworks, or data warehouses (e.g., Databricks, Snowflake, BigQuery, Airflow).

  • Strong programming skills in one or more of Python, Go, Java, or Scala.

  • Deep understanding of data modeling, lakehouse architectures, and open table formats such as Delta Lake, Apache Iceberg, or similar technologies.

  • Experience building data connectors, backend APIs, and integrations with external data systems.

  • Solid knowledge of distributed systems, databases, and cloud platforms (AWS, Azure, GCP).

  • Strong analytical skills with the ability to communicate complex technical concepts clearly and collaborate effectively across teams.

Recap:

  • Location: San Francisco, CA (Onsite)
  • Type: Full-time, Permanent
  • Salary: $170k - $220k per year, dependent on relevant experience

If you think you're a good fit for the role, we'd love to hear from you!

Oscar Associates Limited (US) is acting as an Employment Agency in relation to this vacancy.

Apply today.
