External Advertising Title: Sr. Data Engineer
Job Description
We are hiring a Data Engineer for the Bengaluru location.
Role - Data Engineer
Experience - 6+ Years
Work Mode - Hybrid
About the Role:
Applied Systems, Inc., a worldwide leader in insurance technology, is currently searching for an experienced Data Engineer with proven experience in Google Cloud Platform (GCP) to join our team. We are on an exciting journey to scale our Data and Analytics capabilities to enable better data-driven decision making across the business. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines, optimizing data processes, and ensuring the seamless integration of data from various sources. If you are passionate about data and excited to work on cutting-edge technologies, we want to hear from you!
What You Will Do
- Design, develop, and operationalize robust, scalable data pipelines with automated data quality checks to support business needs, and take ownership of live data pipelines.
- Understand and implement data life cycles, data lineage, custom metadata, and data governance.
- Implement BigQuery SQL procedures, functions, and similar objects with actionable logging.
- Optimize ETL processes and data workflows for efficiency and scalability while meeting performance constraints.
- Lead the design and build of data pipelines from development through production, spanning data ingestion to consumption, using GCP services, Python, BigQuery, DBT, SQL, Apache Airflow, and Celigo.
- Parse JSON, XML, and text data using Dataflow, Python, or batch jobs.
- Advocate for and practice software engineering best practices, leveraging reusable components in implementations.
- Design, develop, and maintain robust, scalable data models and schemas to support analytics and reporting requirements.
- Optimize data processing performance and ensure high availability and scalability of data systems and solutions.
- Implement monitoring and alerting mechanisms to proactively identify and resolve issues.
- Ensure data quality and consistency through rigorous testing and validation processes in development and production tiers.
- Troubleshoot and resolve data-related issues promptly.
- Create, update, and maintain technical documentation of the data processes, pipelines, and models.
- Stay updated with industry trends and technologies to continuously improve our data engineering practices.
What You Will Need to Succeed
- 4+ years of experience in Data/ETL Engineering and Data/ETL Architecture and pipeline development, with a minimum of 1 year of working experience as a Google Cloud Platform (GCP) developer.
- Proven experience in building and maintaining a scalable Data Warehouse in a cloud-based data platform, preferably in Google’s BigQuery.
- Experience with the primary managed data services within GCP, including DataProc, Dataflow, and BigQuery.
- Proficiency in SQL, DBT, and Python (Apache Airflow, Composer), and hands-on experience with ETL tools like Talend, Fivetran, or similar.
- Proficiency in Git for version control.
- Experience with DBT for data transformation.
- Proven experience in designing, building, and maintaining high-quality ETL processes, automated data quality checks, and reusable ETL components.
- Familiarity with Data Lake and Data Warehousing concepts and Data Modelling techniques.
- Familiarity with concepts like star schema, snowflake schema, standardization, normalization, fact and dimension tables.
- Strong problem-solving skills and the ability to work independently as well as collaboratively.
- Excellent communication skills and the ability to articulate technical concepts to non-technical stakeholders.
- Bachelor's degree in Computer Science, MIS, or CIS, or equivalent experience.
Please send your updated CV to [email protected] / [email protected]