Senior Azure Data Engineer

Information Technology | Remote (Home Office), United States


GBLI | Global Indemnity provides specialty property and casualty insurance for small to middle-market businesses – and we’re on a mission to be best in class while achieving steady, profitable growth. Our guiding principles include the core belief that our people are number one. We also strongly emphasize a customer-centric mentality and disciplined underwriting practices. Our work environment is flexible, friendly, and collaborative, with plenty of opportunities to take charge of your career.

What GBLI offers you:

  • Generous paid time off (PTO)
  • Professional development opportunities (including a mentorship program)
  • Educational assistance program, which covers up to $5,250 in educational costs per year
  • Comprehensive health insurance plan (with vision and dental)
  • No-cost health insurance plan available
  • Life insurance
  • 401(k) retirement plan with up to 6% company match and immediate vesting
  • Healthcare and dependent care flexible spending accounts
  • Short-term and long-term disability
  • Company-sponsored social events
  • Various committees to get involved in, which include our Diversity, Equity, and Inclusion Committee, Charitable Giving Committee, and Employee Wellness Committee

What you will do: 
  • Build and maintain managed data pipelines (series of stages through which data flows) for our data warehouse (DWH) and related data stores and data marts. Create, maintain, and optimize these pipelines as workloads move from development to production, supporting both long-term strategic goals and short-term tactical plans for creating, managing, and maintaining corporate data systems and software.
  • Ensure data applications are created to the highest standards and meet all requirements by implementing and maintaining unit tests as well as automated regression, integration, and performance tests.
  • Help maintain and extend software development coding/data standards – including but not limited to naming standards, documentation standards, and design pattern recommendations.
  • Develop and maintain data documentation so that new and existing databases, data flows, and data stores can be added to an operations manual.
  • Work with traditional and agile software life cycle methodologies including working with users and Business Analysts to define requirements and create design documents.
  • Conduct code reviews to ensure developers are following recommended coding practices and adhering to coding standards when releasing changes.
  • Adhere to and maintain source control and help develop branching and merging strategies to control code promotion and effectively segregate development, test, and production environments.
  • Create and maintain CI/CD pipelines with Azure DevOps.
  • Liaise across App Delivery and IT, including network administrators, systems analysts, testers, and software engineers, to assist in resolving problems with software products or company software systems.
  • Contribute to recommending, scheduling, and performing data landscape software improvements and upgrades, and support other team members in decision making.
  • Provide regular status updates and guidance to teammates.
  • Work independently as well as thrive in a collaborative environment.
  • Leverage analytical skills and critical thinking to understand the current environment and business needs, and provide innovative ideas to improve processes.
Requirements and Qualifications:
  • Relevant third-level qualification (bachelor's or master's degree) in a related discipline (Data Analytics, Data Science, Computer Science, Technology) or equivalent workplace experience.
  • Minimum 7 years of experience working on Data and Analytics projects.
  • Minimum 5 years of experience working on Cloud Platforms (Azure, AWS, Google Cloud Platform etc.) and any Cloud certifications (e.g., Azure Data Engineer Associate Certification) preferred.
  • Minimum 3 years of experience working on Data projects in an Agile environment.
  • Insurance / Financial industry experience desirable.
  • Minimum 7 years of experience with ETL tools in a highly available DWH setting.
  • Minimum 7 years of DWH development using star/snowflake schema methodologies.
  • Minimum 7 years working with relational databases, preferably Microsoft SQL Server (Oracle, DB2, Teradata, Netezza, etc. accepted), including proven database design and delivery.

Required Technical Skills:
  • Expertise in SQL coding, including T-SQL, PL/SQL, stored procedures, and extended stored procedures.
  • Hands-on experience using Microsoft SQL Server, Visual Studio, and the Microsoft BI stack (SSIS, SSRS, SSAS).
  • Architect-level experience leveraging Azure storage: ADLS Gen2, blobs, and containers.
  • Architect-level experience leveraging Azure data ingestion tools: Azure Data Factory (ADF), Azure Event Hubs, Azure IoT Hub, and Azure Event Grid.
  • Architect-level experience using Azure data preparation and training tools: Azure Databricks (ADB), Azure Stream Analytics, Azure Synapse, and Azure PowerShell.
  • Experience in complex data warehouse and Data Lake House design.
  • Developer experience using Azure databases: Azure SQL Database, SQL Server on Azure, Azure Synapse Analytics, dedicated SQL pools, SQL elastic pools, etc.
  • Experience using Azure DevOps: Repos, CI/CD Pipelines, Test Plans.
  • Strong experience with NoSQL implementation (MongoDB, Cassandra, Cosmos DB).
  • Well versed in programming languages such as Python and SQL.
  • Experience with any of the following message/file formats: Parquet, Avro, ORC, Protobuf.
  • Experience with version control systems like Git.
  • Strong experience building BI reports and data visualizations using Microsoft Power BI or another high-scale reporting service (MicroStrategy or Tableau acceptable) a plus.
  • Strong ability to design, build and manage data pipelines for data structures encompassing data transformation, data models, schemas, metadata, and workload management.
  • Experience with DWH performance optimization, including intelligent sampling and caching, a plus.
  • Experience with related technologies such as JSON, XSLT, XQuery, and XPath a plus.
  • Strong organizational skills and mindfulness.

Global Indemnity Group celebrates and supports differences. We are committed to creating a diverse and inclusive environment for our employees, customers, and the communities we serve. Qualified applicants will receive consideration for employment without regard to age, race, religion, national origin, gender, sexual orientation, gender identity, protected veteran status, or disability.