Azure, PySpark, SQL, Databricks Professional
- Experience: 5+ years
- Location: Gurgaon / Pune (onsite, work from office)
- Contract Duration: 6 months to 1 year
- Primary Skills Required: Azure, PySpark, SQL, Databricks
Key Responsibilities:
- Design and develop scalable data pipelines.
- Work with Azure-based data platforms.
- Process and transform large datasets using PySpark.
- Build and optimize SQL queries and data workflows in Databricks.
- Note: Candidates must clearly list the primary skills above in their resume.
- Interested candidates may share their updated resume.