Intermediate Data Engineer, Business Intelligence
The Business Intelligence Team at Kal Tire has a mission to enable everyone at Kal Tire to make data-driven decisions based on actionable insight.
The Intermediate Data Engineer plays a key role in bringing this vision to life by collaborating with fellow data engineers and analysts to build solutions that add business value, improve business efficiency and enable users. The Intermediate Data Engineer also participates actively in our analytics community, coaching and supporting our users to enable our distributed analytics and data governance functions.
- Analyze requirements to recommend, design and implement solutions for extracting and loading data at scale.
- Apply big-data techniques and tools to cleanse, transform and load data.
- Enrich and curate data according to established standards and processes.
- Leverage automation, machine learning or artificial intelligence to reduce manual effort and time to deliver.
- Implement new technology to keep ahead of the demand for data and analytics.
- Extend and evolve our data platform in collaboration with the BI team and user community.
Platform Enablement and Governance
- Recommend and implement improvements to our architecture and processes.
- Plan and implement features that enhance the consistency, scalability, usability and security of our platform and assets.
- Support client groups through the data flow architecture, request, and deployment processes.
- Actively contribute to the data community by posting articles, best practices and lessons learned, and by hosting interactive and pre-recorded learning sessions.
Support and Operations
- Troubleshoot, analyze, resolve and prevent data load failures, support requests and incidents. Escalate issues to Senior Engineers and Support Partners as required.
- Roll out code changes, bug fixes and feature enhancements via continuous integration and deployment pipelines.
- Provide on-call support as required.
EXPERIENCE
- 3-5+ years’ experience supporting and developing data, reporting and analytics platforms at a medium-to-large organization.
- Experience with Microsoft Azure including resources, resource groups, identity and permissions, cost analysis, automation accounts, log analytics, alerts, etc.
- Extensive experience with the Azure data platform. The ideal candidate has experience with most of the following: Data Factory, Databricks, Synapse Analytics, Azure Data Lake, Power BI, and Delta Lake.
- Experience using Azure DevOps, including deployment pipelines and continuous integration and continuous delivery/deployment (CI/CD) techniques.
- Experience working in an Agile development environment; the ideal candidate has used Azure DevOps to plan and participate in intake, backlog management, sprint activities and project completion.
KNOWLEDGE, SKILLS AND ABILITIES
- Knowledge of SQL, Python, Scala, Spark, DAX, Azure Automation Accounts, Logic Apps, Power Apps and the Power Platform.
- Working knowledge of Git, including creating branches, commits, pull requests and merges.
- Awareness of and exposure to new and up-and-coming technologies in the BI/Analytics/Azure space.
- Knowledge of incremental loading patterns and a strong ability to extend and improve those patterns.
- Ability to quickly understand business problems and apply new, often unfamiliar tools and techniques to solve them.
- Able to understand and empathize with customer needs and challenges.
- Able to coach, educate and inspire clients to enable their own analytic programs.
- Comfortable balancing the needs of the customer with the needs of the team.
- Ability to self-teach and learn new technologies, software languages and APIs independently and with minimal support.
- Post-secondary graduate with a degree, diploma or certificate in computer science or equivalent combination of education, training and experience.
- Evening and occasional weekend work may be required.
- Ability to participate in on-call after-hours support rotation will be required.