EWB Senior Lead Data Engineer - R01562639

Bangalore, Karnataka, India | April 2, 2026 | Lever
Senior Lead Data Engineer


Primary Duties and Responsibilities:

  • 10+ years of SQL Server ETL development experience using SSIS, Azure Data Factory, and Azure Databricks; strong ETL, data warehouse, and T-SQL skills.
  • Understanding of data warehouse and data mart architecture, and Kimball methodology.
  • 8+ years of strong hands-on experience building large-scale databases
  • Design, build and launch efficient & reliable data pipelines to move and transform data
  • Broad understanding and experience of data modeling, data management, ETL and BI
  • Experience in building enterprise data warehousing with table partitioning, encryption and data compression
  • Experience with cloud migration
  • Perform ongoing monitoring, automation and refinement of reports and BI solutions.
  • Work with BI Management to identify and resolve data-related issues.
  • Work with business departmental teams to help them define and document business requirements for new metrics and reports.
  • Participate in process reviews and enhancements.
  • Provide analysis and issue resolution on business reported concerns.
  • Participate in design and delivery of cubes, dashboards and various self-service business intelligence solutions.

    Other:

     

  • Enforces established documentation guidelines and best practices.
  • Maintains a high degree of business functional knowledge and current automated technologies and tools.
  • Adheres to all policies and procedures concerning confidential information, including but not limited to internal-use and restricted information.

    Qualifications:

    Required

  • 7+ years of experience working as a Data Engineer on enterprise DWH and BI projects, both on-premises and in the cloud (Azure), with a minimum of 3 years of hands-on experience with Microsoft technologies (SQL Server, Synapse, SSIS, Data Factory, Databricks)
  • Hands-on experience with a scripting language such as Python, C#, or equivalent
  • Advanced-level experience with MS SQL Server, MySQL, or Oracle
  • Experience with big data technologies (NoSQL databases, Hive, HBase, Pig, or Spark)
  • Hands-on ETL experience using SSIS and Data Factory
  • Detailed knowledge of and experience with Master Data Management, ETL, data quality, metadata management, data profiling, micro-batches, and streaming data loads
  • Experience using one or more scalable (MPP) relational databases, such as Synapse or Snowflake, for advanced analytics
  • Working knowledge of application and system availability, scalability, and distributed data platforms
  • Strong experience with data warehousing technologies and ETL/ELT implementations
  • Knowledge of and experience with the full SDLC
  • Experience with Lean/Agile development methodologies
  • Experience providing technical leadership and guiding other engineers on data engineering best practices

     

    Desired

  • Experience defining new architectures and the ability to drive an independent project from an architectural standpoint
  • Familiarity with data modeling and ETL tools such as Erwin and Talend
  • Ability to support users with troubleshooting and query tuning (partition and bucket recommendations, etc.)
  • Experience designing and implementing ETL/ELT frameworks for complex warehouses/marts; knowledge of large data sets and experience with performance tuning and troubleshooting
  • Hands-on development mentality, with a willingness to troubleshoot and solve complex problems
  • CI/CD exposure
  • Excellent verbal and written communication skills

    Job requirements

     

    Summary:

     

    This position designs and constructs East West Bank's enterprise-wide data and information architecture to support business intelligence and data warehouse efforts, and is responsible for the design, implementation, and maintenance of the data warehouse and data marts. ETL development skills using Microsoft SSIS, expertise in relational structures and dimensional data modeling, and strong T-SQL skills are a must. Responsibilities include physical database design and creation, workload management and priority scheduling, data and user security, SQL performance tuning, enforcement of standards, data replication, and documentation. The role helps the Bank adopt and organize data, improves existing systems (and potentially assesses new technologies) to make information easier to manage and deliver, and combines knowledge of technology and business management to deliver state-of-the-art business intelligence solutions and improve efficiency.

     
