Senior Data Engineer
Description
Company Overview:
Lean Tech is a rapidly expanding organization based in Medellín, Colombia. We pride ourselves on having one of the most influential networks in software development and IT services for the entertainment, financial, and logistics sectors. Our growth plans create numerous opportunities for professionals to advance their careers and experience substantial development. Joining our team means engaging with expansive engineering teams across Latin America and the United States, contributing to cutting-edge developments in multiple industries.
Position Title: Senior Data Engineer
Location: Remote - Colombia
What you will be doing:
We are seeking a technical professional for a core role centered on the development, maintenance, and operational support of our data platform. This position is integral to ensuring the reliability and performance of data pipelines built on a Snowflake and Azure Data Factory (ADF) stack. A unique aspect of this role is its dual focus, with approximately 80% of responsibilities dedicated to reactive operational support and incident resolution, and 20% to the proactive enhancement of data pipelines. The successful candidate will leverage advanced proficiency in SQL and Python to maintain the stability of the data ecosystem, directly impacting the platform's operational integrity. Your responsibilities include:
- Diagnose and resolve production incidents, including Control-M job failures and Azure Data Factory (ADF) pipeline defects, to ensure the operational stability and reliability of the Snowflake data platform.
- Conduct comprehensive Root Cause Analysis (RCA) for production issues and meticulously document all incident management activities within the ServiceNow framework.
- Manage and maintain the Control-M orchestration layer, ensuring the proper execution and dependency management of ADF data pipelines.
- Develop and implement infrastructure automation scripts and API integrations using Python to enhance platform efficiency and reduce manual intervention.
- Identify and refactor inefficient code within data pipelines, participating in peer code reviews through Azure Pipelines to uphold high standards of code quality and performance.
Requirements & Qualifications
To excel in this role, you should possess:
- A minimum of 4-5 years of professional experience in data engineering or ETL development.
- Advanced, hands-on expertise with modern data platforms, specifically Snowflake and Azure Data Factory (ADF).
- Advanced proficiency in SQL for complex data querying, manipulation, and analysis.
- Advanced proficiency in Python, with a specific focus on its application for infrastructure automation and API integration.
- Advanced proficiency in both written and verbal English communication.
- Experience with Control-M for job scheduling and orchestration.
Desired Skills:
- Experience with CI/CD practices using Azure Pipelines.
- Familiarity with IT Service Management (ITSM) frameworks and tools, particularly ServiceNow.
- Familiarity with the Azure DevOps suite for managing the software development lifecycle.
- Previous experience working within a consulting or client-facing environment.
Why you will love Lean Tech:
- Join a powerful tech workforce and help us change the world through technology
- Professional development opportunities with international customers
- Collaborative work environment
- Career paths and mentorship programs that will help you reach the next level
Join Lean Tech and contribute to shaping the data landscape within a dynamic and growing organization. Your skills will be honed, and your contributions will be vital to our continued success. Lean Tech is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.