Databricks Architect
Data Engineer Profile Requirements (minimum work experience: 5 years)
- Experience in data integration (batch, Kafka events, etc.), data processing (ETL/ELT), and on-premises-to-cloud migration (Azure)
- Knowledge of Azure Synapse, Databricks, Azure Data Lake Storage Gen2, Azure APIs, Azure Functions, triggers, events, SHIR, etc.
- Strong Python programming skills (core Python, PySpark) and container development experience (Kubernetes)
- Strong SQL skills (T-SQL and PL/SQL), with hands-on experience building complex SQL queries and creating ETL loads
- Strong knowledge of different RDBMS (Oracle, PostgreSQL, etc.)
- Strong knowledge of DWH concepts and architecture, logical and physical design, and data modeling (3NF, star schema, etc.)
- Strong knowledge of Lakehouse concepts (Delta Lake)
- Knowledge of Agile concepts and hands-on experience with Confluence & Jira
- Experience with data modeling tools (knowledge of erwin and FET)
- Experience working in a multi-cultural team
- Upper-intermediate English