**Big Data Cloud Architect**

**Role Overview**
We are seeking an experienced Big Data Cloud Architect with 8–12 years of experience designing and implementing scalable big data solutions on Azure and AWS. The ideal candidate will have deep hands-on expertise in cloud-based big data technologies, a strong background in the software development life cycle (SDLC), and experience in microservices and backend development.
---
**Key Responsibilities**
- Design, build, and maintain scalable big data architectures on Azure and AWS.
- Select and integrate big data tools and frameworks (e.g., Hadoop, Spark, Kafka, Azure Data Factory).
- Lead data migration from legacy systems to cloud-based solutions.
- Develop and optimize ETL pipelines and data processing workflows.
- Ensure data infrastructure meets performance, scalability, and security requirements.
- Collaborate with development teams to implement microservices and backend solutions for big data applications.
- Oversee the end-to-end SDLC for big data projects, from planning to deployment.
- Mentor junior engineers and contribute to architectural best practices.
- Prepare architecture documentation and technical reports.
---
**Required Skills & Qualifications**
- Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field.
- 8–12 years of experience in big data and cloud architecture.
- Proven hands-on expertise with Azure and AWS big data services (e.g., Azure Synapse, Azure Data Factory, Amazon Redshift, S3, AWS Glue).
- Strong programming skills in Python, Java, or Scala.
- Solid understanding of SDLC and agile methodologies.
- Experience in designing and deploying microservices, preferably for backend data systems.
- Knowledge of data storage, database management (relational and NoSQL), and data security best practices.
- Excellent problem-solving, communication, and team leadership skills.
---
**Preferred Qualifications**
- Certifications in AWS and/or Azure cloud platforms.
- Experience with Infrastructure as Code (e.g., Terraform, CloudFormation).
- Exposure to containerization (Docker, Kubernetes) and CI/CD pipelines.
- Familiarity with data analytics, machine learning, and event-driven architectures.