Senior Data Platform Engineer (Remote - US)

Jobgether
United States
Remote
Full-time
Posted 11 days ago

Job Description

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Senior Data Platform Engineer in the United States.

In this role, you will be part of a team building a robust, scalable data platform that empowers data scientists, machine learning engineers, and analytics teams to extract insights at scale. You will design, implement, and maintain infrastructure and tooling for real-time and batch data processing, ensuring high reliability and accessibility. You will collaborate with cross-functional teams to promote best practices in data engineering, DataOps, and event-driven architectures. This position offers an opportunity to shape the organization’s data ecosystem, contribute to architectural decisions, and influence how analytics and machine learning capabilities evolve. The environment is dynamic, collaborative, and highly technical, offering exposure to cutting-edge data technologies in a distributed, remote-first setting.

Accountabilities

  • Design, build, and maintain scalable, reliable data infrastructure to support analytics, machine learning, and business intelligence workloads.
  • Develop and manage streaming data systems using technologies such as Kafka, Spark, AWS Kinesis, and Flink.
  • Implement infrastructure-as-code using Terraform or CloudFormation across cloud environments (AWS, GCP, or Azure).
  • Collaborate with Data Scientists, Machine Learning Engineers, and Analytics teams to ensure data quality, accessibility, and reliability.
  • Promote best practices for DataOps, observability, and system reliability within the data platform.
  • Guide teams in adopting an event-driven Data Mesh architecture to standardize data flows and ensure consistency.
  • Contribute to architectural decisions and provide technical leadership across platform initiatives.

Requirements

  • 5+ years of experience in software engineering with strong proficiency in Python.
  • 3+ years of experience building and maintaining streaming data pipelines (Kafka, Spark, AWS Kinesis, Flink).
  • Hands-on experience deploying infrastructure-as-code using Terraform or CloudFormation.
  • Background in platform or software engineering, ideally supporting internal developer or data teams.
  • Advanced degree in Engineering, Computer Science, or a related technical field.
  • Proven ability to work in a distributed, remote team environment.
  • Strong problem-solving skills and the ability to influence architectural decisions across teams.

Preferred

  • Experience building and maintaining data lakes (Delta Lake, Apache Iceberg).
  • Collaboration experience with Machine Learning Engineers or Data Scientists on Feature Store design.
  • Familiarity with SRE principles, DevOps, and DataOps best practices.
