Azure Databricks Developer
About Us:
Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company's consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group's 170-year heritage of building sustainable communities.
Key Responsibilities
1. Cloud Platform Operations
Manage and optimize Azure workloads: ADLS, VNets, Key Vault, ADF, Synapse, Fabric, and Databricks.
Configure and maintain Databricks clusters, jobs, DLT pipelines, Delta Lake storage, and Unity Catalog policies.
Operationalize Fabric Lakehouses, Pipelines, Warehouses, and Semantic Models for production workloads.
Ensure robust platform governance across environments (DEV, QA, UAT, PROD).
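The cluster and job configuration work above is usually expressed as JSON specs submitted to the Databricks Clusters/Jobs APIs. A minimal Python sketch of an autoscaling job-cluster spec with a basic guardrail check; the cluster name, runtime label, and node sizes are illustrative assumptions, not a prescribed standard:

```python
# Hypothetical autoscaling job-cluster spec, in the JSON shape the
# Databricks Clusters API accepts (name, runtime, and sizes are illustrative).
cluster_spec = {
    "cluster_name": "etl-prod-autoscale",    # naming convention is an assumption
    "spark_version": "15.4.x-scala2.12",     # a Databricks LTS runtime label
    "node_type_id": "Standard_DS3_v2",       # Azure VM SKU
    "autoscale": {"min_workers": 2, "max_workers": 8},
    "autotermination_minutes": 30,           # idle-cluster cost control
}

def validate_autoscale(spec: dict) -> bool:
    """Guardrail before deployment: autoscale bounds present and sane."""
    auto = spec.get("autoscale", {})
    return 0 < auto.get("min_workers", 0) <= auto.get("max_workers", 0)
```

A check like this would typically run as a quality gate before the spec is applied to any environment.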
2. Infrastructure as Code & CI/CD
Build and maintain Terraform/Bicep templates for environment provisioning and configuration.
Develop end-to-end CI/CD pipelines for Databricks, Fabric, and Azure components (ADO/GitHub).
Automate deployment of notebooks, workflows, access policies, networking components, and Fabric artifacts.
Enforce version control, release governance, and quality gates.
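Consistent environment provisioning of the kind described above usually hinges on deterministic resource naming that IaC templates and pipelines can share. A small Python sketch of such a helper; the `resource-project-env` pattern is an assumed convention, not one stated in this posting:

```python
# Environments the pipelines promote through (per the posting).
ENVIRONMENTS = ("dev", "qa", "uat", "prod")

def resource_name(resource: str, project: str, env: str) -> str:
    """Compose a deterministic resource name such as 'adf-sales-prod'.

    The 'resource-project-env' pattern is a hypothetical convention;
    Terraform/Bicep modules would take the same inputs as variables.
    """
    if env not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {env}")
    return f"{resource}-{project}-{env}"
```

Centralizing the convention in one function (or one Terraform module) keeps DEV through PROD consistent and makes drift easy to detect in review.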
3. FinOps, Cost Management & Capacity Planning
Implement FinOps dashboards, alerts, budgets, and spend governance practices.
Perform Databricks and Fabric cost optimization: cluster sizing, autoscaling, idle-cluster management, and job tuning.
Conduct capacity planning for compute, storage, Fabric engines, and Databricks workloads.
Develop cost-saving recommendations and automated consumption monitoring.
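The automated consumption monitoring mentioned above often reduces to comparing month-to-date spend against budgets and flagging workloads near their limit. A minimal Python sketch; the 80% alert threshold and the workload names are assumptions for illustration:

```python
def budget_alerts(spend: dict, budgets: dict, threshold: float = 0.8) -> list:
    """Return workloads whose spend has reached the given fraction of
    their budget -- candidates for an alert or a FinOps review.

    Workloads with no budget defined are skipped (treated as unlimited).
    """
    return sorted(
        workload
        for workload, cost in spend.items()
        if cost >= budgets.get(workload, float("inf")) * threshold
    )
```

In practice the `spend` input would come from cost-export data (e.g. Azure Cost Management), and the output would feed a dashboard or alert rule.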
4. Environment Management, Security & Governance
Provision and manage Azure data environments with consistent policies and naming standards.
Configure RBAC, ACLs, Unity Catalog grants, service principals, network security, Managed Identities.
Implement governance standards for data access, lineage, audit logging, compliance, and risk mitigation.
Ensure secure connectivity using Private Endpoints, VNET integration, and enterprise IAM controls.
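Unity Catalog grants like those described above are applied with SQL GRANT statements, and governance pipelines often generate them from an access-policy definition rather than writing them by hand. A small Python sketch of such a generator; the restriction to three securable types and the principal names are simplifying assumptions:

```python
def grant_statement(privilege: str, securable: str, name: str, principal: str) -> str:
    """Render a Unity Catalog GRANT statement for batch application
    from a governance pipeline.

    Only a subset of securable types is handled here (a sketch);
    Unity Catalog supports more.
    """
    allowed = {"CATALOG", "SCHEMA", "TABLE"}
    kind = securable.upper()
    if kind not in allowed:
        raise ValueError(f"unsupported securable: {securable}")
    return f"GRANT {privilege} ON {kind} {name} TO `{principal}`;"
```

Generating grants from a reviewed policy file keeps access changes version-controlled and auditable, in line with the lineage and audit-logging requirements above.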
Required Skills & Experience
6-12 years in cloud data engineering, SRE, or platform engineering roles.
Strong hands-on expertise with:
- Azure Data Services (ADLS, ADF, Synapse, Key Vault, VNets)
- Azure Databricks (clusters, jobs, Delta Lake, DLT, Unity Catalog)
- Microsoft Fabric (Lakehouse, Pipelines, Warehouse, Dataflows)
- Unity Catalog governance (catalogs, schemas, access policies, lineage)
Strong scripting and automation experience: Python, PowerShell, Bash, SQL, PySpark.
Experience with Terraform/Bicep for IaC.
Strong knowledge of Azure DevOps or GitHub Actions CI/CD pipelines.
Proven FinOps experience with cost governance and optimization across cloud workloads.
Experience in SRE practices: SLIs, SLOs, operational readiness, and automated recovery.
Preferred Qualifications
Certifications in Azure Data Engineer, Azure DevOps Engineer, Databricks Data Engineer, FinOps Practitioner.
Experience in highly regulated environments (BFSI, Healthcare, Retail).
Understanding of zero-trust security models.