Senior Data Engineer
Description
Company Overview
Lean Tech is a rapidly growing technology company based in Medellín, Colombia, with a strong presence across Latin America and the United States. Recognized for its influential network within the software development and IT services industries, Lean Tech delivers solutions for the entertainment, financial, and logistics sectors. The company is dedicated to fostering a culture that emphasizes professional growth, collaboration, and technical excellence, providing numerous career advancement opportunities for its employees.
Lean Tech is distinguished by its modern engineering teams and its commitment to leveraging advanced technologies—particularly within the Microsoft Azure ecosystem—to deliver innovative data and analytics solutions. The organization encourages participation in Agile methodologies and values expertise in areas such as ETL/ELT pipeline design, data integration, data quality assurance, and automation. With strengths in tools including Azure Fabric, Azure Data Factory, Azure Data Lake, SQL Server, and Python, Lean Tech empowers its teams to drive impactful projects that support advanced analytics and machine learning initiatives. As an equal opportunity employer, Lean Tech is committed to diversity and inclusion, creating an environment where every team member can thrive.
Position Overview
This role is instrumental in designing, building, and optimizing data pipelines within a Microsoft Azure environment to enable critical business analytics and machine learning initiatives. Responsibilities include expert-level development of ETL/ELT pipelines, advanced data integration, and direct management of Azure Data Lake, Azure Fabric, Azure Data Factory, and SQL Server resources. The position plays a key part in supporting data scientists and BI teams by ensuring that data systems are structured for advanced analytics, forecasting, and machine learning, and by integrating APIs—particularly REST APIs—across the Microsoft ecosystem.
You will contribute to data pipeline orchestration and monitoring using tools such as Azure Data Factory (ADF), Azure Databricks, Azure Functions, and Logic Apps, while maintaining data quality through validation, monitoring, troubleshooting, and performance tuning. Familiarity with medallion architecture and modern data warehouse practices is expected, along with strong proficiency in SQL and Python, particularly using the Azure SDK for Python. Automation of data collection, processing, and reporting is required to drive efficiency and data integrity.
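To give a flavor of the day-to-day automation this paragraph describes, here is a minimal, illustrative sketch of landing a raw extract in the data lake with the Azure SDK for Python; the storage account URL, container, and file paths are hypothetical placeholders, not Lean Tech resources.

```python
# Illustrative only: a minimal ingestion helper using the Azure SDK for Python
# (azure-identity and azure-storage-file-datalake). Account URL, container, and
# paths are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://examplelake.dfs.core.windows.net"  # hypothetical ADLS Gen2 account
BRONZE_CONTAINER = "bronze"                               # raw/landing layer

def upload_raw_extract(local_path: str, lake_path: str) -> None:
    """Land a raw source extract in the bronze layer of ADLS Gen2."""
    service = DataLakeServiceClient(account_url=ACCOUNT_URL,
                                    credential=DefaultAzureCredential())
    file_client = (service
                   .get_file_system_client(BRONZE_CONTAINER)
                   .get_file_client(lake_path))
    with open(local_path, "rb") as data:
        file_client.upload_data(data, overwrite=True)  # replace any prior run's file

if __name__ == "__main__":
    upload_raw_extract("orders.csv", "sales/orders/2024/01/01/orders.csv")
```

In practice, steps like this are typically wrapped in Azure Data Factory pipelines, Azure Functions, or Databricks jobs rather than run by hand.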
The role supports ongoing data platform re-architecture efforts, documentation of data models and pipelines, and effective management of data security. You will collaborate within Agile teams, participate in ceremonies such as sprint planning and daily stand-ups, and utilize Agile tools like Jira and Confluence.
This position offers the opportunity to make a significant impact on Lean Tech's data landscape while collaborating across diverse teams in a dynamic, growth-focused organization. Unique challenges include working at scale with Azure cloud services, supporting enterprise data systems modernization, and fostering continuous improvement in data engineering practices.
What You Will Be Doing
- Design, build, and optimize advanced ETL/ELT pipelines and data integration workflows to enable robust, reliable, and scalable data ingestion, transformation, and delivery across the Microsoft Azure ecosystem.
- Leverage Azure Data Factory, Azure Fabric, Azure Data Lake, and SQL Server to modernize, migrate, and maintain data systems in alignment with evolving business requirements.
- Implement and maintain medallion architecture (bronze/silver/gold layers) to structure, standardize, and organize data for downstream analytical and business use cases (an illustrative sketch follows this list).
- Establish data quality assurance processes by validating, monitoring, troubleshooting, and performance-tuning data pipelines with Azure orchestration and monitoring tools such as ADF, Azure Databricks, Azure Functions, and Logic Apps.
- Integrate and manage REST APIs and data flows within the Microsoft environment to support seamless data orchestration and automation initiatives.
- Develop and utilize advanced SQL and Python (including Azure SDK for Python) to process, automate, and enhance data infrastructure and analytics workflows.
- Document data models, pipeline architecture, and integrations to support transparency, knowledge sharing, and ongoing platform scalability.
- Contribute to ongoing re-architecture initiatives to ensure the data platform’s scalability, modernization, and alignment with business strategy.
- Participate in Agile ceremonies and utilize Agile tools (Jira, Confluence) to collaborate effectively with Data Scientists, BI teams, and stakeholders.
- Implement systems and best practices to monitor data quality, ensure data integrity, and manage data security across all pipelines and platforms.
- Automate data collection, processing, and reporting tasks to improve operational efficiency and reduce manual intervention.
- Stay current with evolving Azure data and analytics technologies, modern data warehouse practices, and industry trends to continuously improve systems and workflows.
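To make the medallion-architecture responsibility above concrete, the following is an illustrative bronze-to-silver step, assuming a Spark runtime such as Azure Databricks with Delta Lake output; the paths, columns, and formats are hypothetical and will differ per project.

```python
# Illustrative bronze-to-silver step in the medallion pattern, assuming a Spark
# runtime (e.g. Azure Databricks) with Delta Lake available. Paths, columns,
# and formats are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

BRONZE = "abfss://bronze@examplelake.dfs.core.windows.net/sales/orders/"  # raw, as ingested
SILVER = "abfss://silver@examplelake.dfs.core.windows.net/sales/orders/"  # cleaned, conformed

raw = spark.read.json(BRONZE)

silver = (
    raw
    .withColumn("order_ts", F.to_timestamp("order_ts"))           # enforce types
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())                        # drop invalid rows
    .dropDuplicates(["order_id"])                                  # deduplicate on the key
)

silver.write.format("delta").mode("overwrite").save(SILVER)
```

A subsequent gold layer would typically aggregate this silver data into business-level tables for BI reporting and machine-learning features.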
Required Skills & Experience
- Bachelor’s degree in Computer Science, Engineering, or a related field
- 5+ years of experience in data engineering or backend systems with a strong data focus
- Advanced expertise in designing, building, and optimizing ETL/ELT pipelines for data integration
- Strong proficiency in SQL and Python for data processing and automation; experience with the Azure SDK for Python
- Hands-on experience with the Microsoft Azure ecosystem, including Azure Fabric, Azure Data Factory, Azure Data Lake, and SQL Server
- Practical experience with data pipeline orchestration, monitoring, and data quality assurance using tools such as Azure Data Factory, Azure Databricks, Azure Functions, and Logic Apps
- Working familiarity with medallion architecture (bronze/silver/gold) and modern data warehouse practices
- Applied knowledge of API integration, specifically with REST APIs, within the Microsoft ecosystem
- Experience in documenting data models, pipelines, and integrations for transparency and knowledge sharing
- Proficiency in version control (Git) and Agile tools (Jira, Confluence); experience participating in Agile ceremonies such as sprint planning and daily stand-ups
- Strong analytical and problem-solving skills with the ability to optimize and improve large-scale data systems
- Experience in implementing and maintaining data infrastructure and data sets, including data quality monitoring and ensuring data integrity and security
- Experience automating data collection, processing, and reporting tasks to enhance efficiency and reduce manual errors
Nice to Have
- Familiarity with Azure Machine Learning (AzureML) or similar platforms for supporting advanced analytics and AI initiatives
- Exposure to big data technologies such as Spark or Hadoop within the cloud ecosystem
- Experience with additional cloud platforms (e.g., AWS, Google Cloud Platform) to complement Azure skills
- Certifications such as Microsoft Certified: Azure Data Engineer Associate or equivalent
- Competence with containerization or orchestration tools like Docker or Kubernetes
- Knowledge of DevOps practices applied to data engineering workflows
- Experience with data visualization or BI tools (e.g., Power BI, Tableau) for facilitating collaboration between teams
- Strong communication skills for engaging with cross-functional stakeholders
- Ability to adapt quickly to emerging technologies and industry best practices
- Background in mentoring or supporting the professional development of team members
Soft Skills
- Strong collaboration skills, demonstrated by effective teamwork with Data Scientists, BI teams, and stakeholders to align business objectives with technical solutions.
- Clear and concise communication abilities, essential for documenting complex data models, pipelines, and integrations, and for participating in Agile ceremonies such as sprint planning and daily stand-ups.
- Adaptability and willingness to embrace new tools and technologies within the Microsoft Azure ecosystem, with a continuous improvement mindset in a dynamic work environment.
- Strong problem-solving approach, applied to troubleshooting, performance tuning, and ensuring data quality and integrity in large-scale data systems.
- Proven organizational and time management skills, necessary for balancing multiple priorities and automating data collection, processing, and reporting tasks efficiently.
- Commitment to accountability and knowledge sharing, fostering transparency within cross-functional engineering initiatives.
Why You Will Love Working with Lean Tech
Join a powerful tech workforce and help us change the world through technology
- Professional development opportunities with international customers
- Collaborative work environment
- Career paths and mentorship programs that support your advancement to new levels
Join Lean Tech and contribute to shaping the data landscape within a dynamic and growing organization. Your skills will be honed, and your contributions will play a vital role in our continued success. Lean Tech is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.