AWS Data Architect (Remote - India)

Jobgether
India
Remote
Full-time
Posted 18 days ago

Job Description

This position is posted by Jobgether on behalf of a partner company. We are currently looking for an AWS Data Architect in India.

We are seeking a seasoned AWS Data Architect to drive the design, development, and optimization of enterprise-scale data platforms and analytics solutions. This role involves building robust data pipelines, cloud-based data lakes, lakehouses, and ML systems while ensuring governance, security, and high-quality data practices across the organization. The ideal candidate will provide technical leadership, mentor data engineering teams, and collaborate across multiple cloud environments, including AWS and Azure. This position is perfect for someone passionate about leveraging advanced cloud technologies to solve complex data challenges and deliver actionable insights at scale.

Accountabilities:

· Define and lead enterprise data architecture and cloud data strategy, focusing on AWS.
· Design, implement, and optimize scalable data lakes, lakehouses, data warehouses, streaming, and analytics platforms.
· Architect and maintain ETL/ELT pipelines, real-time data workflows, and ML systems.
· Drive multi-cloud (AWS + Azure) and hybrid data integration initiatives.
· Implement frameworks for data governance, security, metadata management, lineage, and data quality.
· Mentor and guide data engineering and data science teams on best practices and architectural standards.
· Collaborate with cross-functional teams to ensure data initiatives align with business objectives.

Requirements

· 10+ years of experience in data architecture or engineering, with 5+ years on cloud platforms, primarily AWS.
· Strong expertise in the AWS data stack, including S3, Glue, Redshift, Athena, EMR, Kinesis, and Lake Formation.
· Solid experience with data modeling approaches: dimensional modeling, Data Vault, and lakehouse architecture.
· Hands-on proficiency in Python, PySpark, SQL, and distributed systems such as Spark and Kafka.
· Experience with ML platforms such as SageMaker, MLOps practices, and deploying machine learning pipelines.
· Multi-cloud experience with AWS and Azure, including data migration and modernization.
· Mandatory certifications: AWS Certified Data Analytics Specialty, AWS Certified Machine Learning Specialty, Azure Data Engineer Associate (DP-203).
· Nice to have: additional AWS certifications (SA Pro, DevOps Pro), Azure DP-100, Databricks, Terraform, CloudFormation, and experience with data mesh/fabric and governance platforms such as Alation or Collibra.
