Data Architect - Azure, Databricks
Position Summary
We are seeking a highly skilled Data Architect with deep expertise in Azure, Databricks, SQL, and data modeling. The ideal candidate will have extensive experience with both traditional data warehouse architectures and modern data platform paradigms (data lake and lakehouse) built on Azure Synapse and Databricks. This role requires a proven track record of designing and implementing enterprise-scale data solutions, integrating multiple data sources, and enabling analytics at scale.
Key Responsibilities
Architecture & Design
- Lead the architecture, design, and implementation of enterprise data warehouses, data lakes, and lakehouses on Azure and Databricks.
- Define and enforce data modeling standards, best practices, and guidelines across transactional and analytical workloads.
- Architect end-to-end modern data platforms leveraging Azure Synapse, Databricks, Delta Lake, and related Azure services (ADF, ADLS, Purview, etc.).
Solution Delivery
- Design scalable ETL/ELT pipelines and orchestrate workflows for ingestion, transformation, and consumption.
- Partner with business stakeholders, product teams, and data engineers to deliver high-quality, business-driven data solutions.
- Ensure solutions are secure, performant, and compliant with enterprise governance and regulatory standards.
Strategy & Leadership
- Define the data strategy and reference architecture for migrating legacy/traditional data platforms (Teradata, Oracle, SQL Server, etc.) to modern cloud architectures.
- Provide technical leadership and mentoring to engineering teams.
- Collaborate with cloud, analytics, and business teams to align data architecture with organizational goals.
Required Skills & Experience
Core Expertise
- Proven experience as a Data Architect or Senior Data Engineer/Lead designing and implementing enterprise data warehouses and modern data platforms.
- Hands-on expertise with Azure services: Azure Synapse Analytics (dedicated SQL pools, serverless SQL pools), Azure Data Factory (ADF), Azure Data Lake Storage (ADLS Gen2), and Azure Purview.
- Strong expertise with Databricks: Delta Lake, Lakehouse, Spark (PySpark/Scala), MLflow.
- Very strong SQL development and optimization skills.
- Strong data modeling skills (3NF, dimensional, Data Vault, canonical models, etc.).
Complementary Skills
- Knowledge of traditional data platforms (Oracle, Teradata, SQL Server, Informatica, etc.) along with modern ELT/ETL frameworks.
- Solid understanding of data governance, metadata management, data quality, and security.
- Experience building real-time/streaming data pipelines (Kafka, Azure Event Hubs, etc.) is a plus.
Soft Skills
- Excellent communication skills and the ability to engage with business SMEs and senior stakeholders.
- Strong analytical and problem-solving skills with an enterprise-scale mindset.
- Experience in mentoring and guiding teams on modern data practices.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
- 15+ years of experience in data engineering/architecture, including at least 5 years in the Azure and Databricks ecosystem.
- Proven track record in enterprise-scale data warehouse and lakehouse implementations.
- Certifications in Azure Data Engineer/Architect or Databricks preferred.