Mid+ Data Engineer
Description
Company Overview:
Lean Tech is a rapidly expanding organization based in Medellín, Colombia. We pride ourselves on having one of the most influential networks in software development and IT services for the entertainment, financial, and logistics sectors. Our corporate projections offer a multitude of opportunities for professionals to elevate their careers and experience substantial growth. Joining our team means engaging with expansive engineering teams across Latin America and the United States, contributing to cutting-edge developments in multiple industries.
We are seeking a Mid+ Data Engineer to support our data infrastructure using AWS (S3, RDS PostgreSQL), Airflow, Snowflake, and Power BI, and to design transactional databases that meet business needs.
Position Title: Mid+ Data Engineer
Location: Colombia
What you will be doing:
In this role, you will support our data infrastructure by working with both technical and business teams. The position requires expertise in AWS (with a focus on S3 and RDS PostgreSQL), data pipeline orchestration with Airflow, cloud data warehousing with Snowflake, ETL processes, Power BI dashboard and report creation, and data visualization. You will also collaborate closely with cross-functional teams to design transactional databases that support our growing business needs. Your responsibilities will include:
- Design and develop scalable data pipelines and workflows using Airflow to orchestrate data ingestion, transformation, and delivery.
- Leverage AWS services (S3, RDS PostgreSQL, etc.) to build robust, highly available data solutions.
- Collaborate with stakeholders to understand data requirements, ensuring that data models and solutions align with business objectives.
- Develop and maintain transactional databases in PostgreSQL on AWS RDS, ensuring optimal performance and reliability.
- Work closely with application teams to design and optimize efficient, normalized transactional data structures.
- Design, develop, and optimize ETL processes to extract data from various sources, transform it, and load it into target databases or data warehouses.
- Ensure ETL pipelines are efficient, fault-tolerant, and capable of handling large data volumes.
- Maintain and monitor data quality across ETL processes to ensure reliable and accurate data flow.
- Design data models that support efficient analytics and align with business requirements.
- Ensure data quality and consistency across multiple data pipelines and systems.
- Develop and optimize data schemas to support reporting and data science initiatives.
- Build Power BI dashboards and reports to facilitate decision-making, ensuring clear visual representation of key data metrics.
- Implement best practices for performance tuning, capacity planning, and cost optimization across AWS, Snowflake, and other data platforms.
- Monitor and troubleshoot data pipelines, proactively resolving bottlenecks and ensuring data availability.
- Collaborate with cross-functional teams (Data Science, Product, IT, etc.) to ensure seamless data integration and architecture alignment.
- Participate in architecture reviews, technical roadmaps, and strategic planning discussions.
Requirements & Qualifications
To excel in this role, you should possess:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 4+ years of experience in Data Engineering or a similar role.
- ETL: Experience in designing, building, and maintaining ETL pipelines.
- AWS: Deep experience with S3, RDS (PostgreSQL), IAM, EC2, Lambda, and other related services.
- Databases: Strong proficiency in PostgreSQL; ability to design normalized transactional database structures and optimize complex SQL queries.
- Orchestration: Practical experience with Airflow for automating complex data pipelines.
- Cloud Data Warehouse: Experience designing, developing, and optimizing data models in Snowflake.
- BI Tools: Working knowledge of Power BI or other visualization tools for analytics and reporting.
- Programming: Proficiency in Python, SQL, and shell scripting.
Nice to have
- Version Control: Hands-on expertise with Liquibase (or similar) for database version control.
- CI/CD & DevOps: Experience with modern software development practices (Git, CI/CD pipelines) is a plus.
Soft Skills
- Proactive, go-getter mentality.
- Trustworthy and dependable.
- Excellent interpersonal skills to collaborate across teams.
- Able to adapt and thrive in a dynamic environment.
Why Join Us?
At Lean Tech, we are more than just a company—we are warriors working together to change the world through technology. If you're ready to join a growing, dynamic team that values innovation and continuous improvement, this is the place for you.