Senior Azure Data Engineer
GBLI provides specialty property and casualty insurance for small to middle-market businesses – and we’re on a mission to be best-in-class while achieving steady, profitable growth. Our guiding principles include the core belief that our people are number one. We also strongly emphasize a customer-centric mentality and disciplined underwriting practices. Our work environment is flexible, friendly, and collaborative, with plenty of opportunities to take charge of your career.
What GBLI offers you:
- Generous paid time off (PTO)
- Professional development opportunities (including a mentorship program)
- Educational assistance program, which covers up to $5,250 in educational costs per year
- Comprehensive health insurance plan (with vision and dental)
- No-cost health insurance plan available
- Life insurance
- 401(k) retirement plan with up to 6% company match and immediate vesting
- Healthcare and dependent care flexible spending accounts
- Short-term and long-term disability
- Company-sponsored social events
- Various committees to get involved in, which include our Diversity, Equity, and Inclusion Committee, Charitable Giving Committee, and Employee Wellness Committee
What you will do:
- Build and maintain a set of managed data pipelines, consisting of a series of stages through which data flows, for our data warehouse (DWH) and related data stores and data marts. These pipelines must be created, maintained, and optimized as workloads move from development to production for specific use cases (a minimal sketch of such a pipeline follows this list).
- Develop and maintain data documentation so that new and existing databases, data flows, and data stores can be added to an operations manual.
- Work with traditional and agile software life cycle methodologies, including partnering with users and business analysts to define requirements and create design documents.
- Provide regular status updates.
- Work independently and thrive in a collaborative environment.
- Apply analytical skills and critical thinking to understand the current environment and business needs, and offer innovative ideas to improve processes.
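To make the pipeline responsibility above concrete, here is a minimal sketch of a staged pipeline in PySpark, as it might run on Azure Databricks. The storage paths, column names, and transformations are illustrative assumptions, not a description of GBLI’s actual environment:

```python
# Minimal staged-pipeline sketch in PySpark (hypothetical paths, columns, and names).
# Stage 1 ingests raw files, Stage 2 conforms them, Stage 3 loads a DWH-facing table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy_pipeline_sketch").getOrCreate()

# Stage 1 - raw ingestion: read landed CSV files from the data lake.
raw = spark.read.option("header", True).csv(
    "abfss://landing@examplelake.dfs.core.windows.net/policies/"
)

# Stage 2 - staging: cast types, parse dates, and deduplicate on the business key.
staged = (
    raw.withColumn("premium", F.col("premium").cast("decimal(18,2)"))
       .withColumn("effective_date", F.to_date("effective_date"))
       .dropDuplicates(["policy_id"])
)

# Stage 3 - curated load: write a warehouse-facing Delta table
# (Delta Lake is assumed to be available, as it is on Databricks).
(staged.write.format("delta").mode("overwrite")
       .save("abfss://curated@examplelake.dfs.core.windows.net/dwh/dim_policy/"))
```

In practice each stage would be a separately monitored activity (for example, an ADF pipeline invoking a Databricks notebook per stage), which is what makes the pipeline "managed" as it moves from development to production.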
What we’re looking for:
- Relevant third-level qualification (bachelor’s or master’s degree) in a related discipline (Data Analytics, Data Science, Computer Science, Technology) or equivalent workplace experience.
- Minimum 7 years of experience working on Data and Analytics projects.
- Minimum 5 years of experience working on cloud platforms (Azure, AWS, Google Cloud Platform, etc.); cloud certifications (e.g., Azure Data Engineer Associate) preferred.
- Minimum 3 years of experience working on Data projects in an Agile environment.
- Insurance / Financial industry experience desirable.
- Minimum 7 years of experience with ETL tools in a highly available DWH setting.
- Minimum 7 years of DWH development using star/snowflake schema methodologies (see the sketch after this list).
- Minimum 7 years working with relational databases, preferably Microsoft SQL Server (Oracle, Db2, Teradata, Netezza, etc. also accepted), including proven database design and delivery.
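For context on the star/snowflake requirement above: a star schema keeps measures in fact tables that reference surrogate-keyed dimension tables. A minimal sketch in pandas, with hypothetical column names, of deriving one dimension and one fact table from a flat extract:

```python
# Star-schema sketch in pandas (hypothetical data and column names).
import pandas as pd

extract = pd.DataFrame({
    "policy_id": [101, 102, 103],
    "state": ["PA", "PA", "TX"],
    "line_of_business": ["property", "casualty", "property"],
    "premium": [1200.00, 950.00, 2100.00],
})

# Dimension: one row per distinct attribute combination, plus a surrogate key.
dim_lob = (extract[["state", "line_of_business"]]
           .drop_duplicates()
           .reset_index(drop=True))
dim_lob["lob_key"] = dim_lob.index + 1

# Fact: measures and foreign keys only; descriptive attributes live in the dimension.
fact_premium = (extract.merge(dim_lob, on=["state", "line_of_business"])
                       [["policy_id", "lob_key", "premium"]])
```

A snowflake schema would further normalize the dimension (for example, splitting state into its own table); the trade-off is less redundancy versus more joins at query time.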
Required Technical Skills:
- Expertise in SQL coding, including stored procedures, extended stored procedures, T-SQL, and PL/SQL.
- Hands-on experience with Microsoft SQL Server, Visual Studio, and the Microsoft BI stack (SSIS, SSRS, SSAS).
- Architect-level experience with Azure storage services: ADLS Gen2, Blob Storage, and containers.
- Architect-level experience with Azure data ingestion tools: Azure Data Factory (ADF), Azure Event Hubs, Azure IoT Hub, and Azure Event Grid.
- Architect-level experience with Azure data preparation and training tools: Azure Databricks (ADB), Azure Stream Analytics, Azure Synapse, and Azure PowerShell.
- Experience in complex data warehouse and data lakehouse design.
- Developer experience with Azure database services: Azure SQL Database, SQL Server on Azure, Azure Synapse Analytics dedicated SQL pools, and SQL elastic pools.
- Experience using Azure DevOps: Repos, CI/CD Pipelines, Test Plans.
- Strong experience with NoSQL implementations (MongoDB, Cassandra, Cosmos DB).
- Well versed in programming languages such as Python and SQL.
- Experience with any of the following message/file formats: Parquet, Avro, ORC, Protobuf (a minimal Parquet example appears after this list).
- Experience with version control systems like Git.
- Strong experience building BI reports and data visualizations using Microsoft Power BI or another high-scale reporting service (MicroStrategy and Tableau acceptable) a plus.
- Strong ability to design, build, and manage data pipelines encompassing data transformation, data models, schemas, metadata, and workload management.
- DWH performance optimization, including intelligent sampling and caching, a plus.
- Experience with related technologies such as JSON, XSLT, XQuery, and XPath a plus.
- Strong organizational skills and mindfulness.
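As a small illustration of the columnar file formats named in the list above, here is a sketch of a Parquet round trip using pyarrow; the schema and file name are assumptions for the example:

```python
# Parquet round-trip sketch with pyarrow (hypothetical schema and file name).
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({
    "claim_id": pa.array([1, 2, 3], type=pa.int64()),
    "amount": pa.array([500.00, 1250.50, 80.00], type=pa.float64()),
})

pq.write_table(table, "claims.parquet")       # columnar layout, compressed on disk
round_trip = pq.read_table("claims.parquet")  # schema and types are preserved
assert round_trip.equals(table)
```

By contrast, Avro and Protobuf are row-oriented and schema-driven, which is why they tend to appear on the ingestion/messaging side while Parquet and ORC dominate analytical storage.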
Global Indemnity Group celebrates and supports differences. We are committed to creating a diverse and inclusive environment for our employees, customers, and the communities we serve. Qualified applicants will receive consideration for employment without regard to age, race, religion, national origin, gender, sexual orientation, gender identity, protected veteran status, or disability.