Senior Data Engineer
Description
Company Overview:
Global Technology Services is a rapidly expanding organization based in Medellín, Colombia. We pride ourselves on having one of the most influential networks in software development, AI solutions, and IT services for the entertainment, financial, and logistics sectors. Our corporate growth plans offer professionals many opportunities to advance their careers. Joining our team means working with large engineering teams across Latin America, the Philippines, and the United States, contributing to cutting-edge developments in multiple industries.
Currently, we are seeking a Senior Data Engineer with strong English skills to join our team. Here are the challenges our next warrior will face and the requirements we look for:
Position Title: Senior Data Engineer
Location: LATAM
What you will be doing:
As a Senior Data Engineer, you will play a key role in designing, building, and evolving a modern data platform powered by Snowflake. You will work closely with technical leadership to develop scalable data pipelines, ingestion frameworks, and integrations with multiple external systems, including APIs for real-time and batch data processing. A significant portion of your work will involve writing and optimizing SQL-based transformations, while also leveraging Python for data processing, automation, and integrations.
This role goes beyond execution: you will actively participate in shaping data architecture decisions, contributing to how data models, pipelines, and systems are designed and implemented. You will collaborate directly with both technical and business stakeholders to translate evolving requirements into effective data solutions, often working in an environment where not all requirements are fully defined. Additionally, you will support monitoring, troubleshooting, and improving existing data processes, ensuring reliability, performance, and data integrity across the platform, while contributing to governance, logging, and auditing practices.
Key Responsibilities
- Design, build, and maintain scalable data pipelines and ingestion processes within a Snowflake-based data warehouse environment
- Write, optimize, and maintain SQL queries, transformations, and data models to support business and operational needs
- Develop and support integrations with external vendors and internal systems through APIs and other data exchange methods
- Use Python to support data processing, automation, and integration-related workflows
- Collaborate closely with technical leadership to help define architecture, data flows, and implementation strategies
- Participate in technical discussions and provide recommendations on how to improve pipeline design, scalability, and performance
- Translate evolving business needs into effective technical solutions, even when requirements are not fully documented
- Monitor, troubleshoot, and improve existing data pipelines and processes to ensure reliability and data integrity
- Contribute to governance, logging, auditing, and control practices across the data environment
- Work as a hands-on contributor while helping bring structure, ownership, and technical direction to the data function
Required Skills & Experience
- 5+ years of experience writing and optimizing SQL, including complex queries, transformations, and data modeling
- Hands-on experience working with Snowflake as a data warehouse solution
- Experience building and maintaining data pipelines and ETL or ELT processes
- Solid experience with Python for data processing, integrations, or automation
- Familiarity with Power BI or similar BI and reporting tools
- Experience working with APIs and external data integrations, including data ingestion and transformation workflows
- Strong understanding of data architecture concepts, including pipeline design, data modeling, and performance optimization
- Experience working in fast-paced, evolving environments with limited predefined requirements
- Ability to collaborate closely with both technical and business stakeholders
- Advanced English communication skills, B2 level or higher
Nice to Have Skills
- Experience with Azure services, particularly Blob Storage or Functions
- Experience working with custom ingestion frameworks or building internal data processing layers
- Exposure to data governance, auditing, and compliance practices
- Experience in financial services, mortgage, or other regulated environments
- Familiarity with modern cloud-based data ecosystems and best practices
Soft Skills
- Strong problem-solving mindset with the ability to work in ambiguous environments
- Proactive and self-directed, capable of taking ownership without constant direction
- Comfortable contributing ideas and respectfully challenging approaches in technical discussions
- Strong communication skills, able to explain technical concepts clearly to different audiences
- Collaborative and team-oriented, with the ability to work closely with leadership
- Adaptable and flexible in a fast-moving, evolving business environment
- Detail-oriented while maintaining awareness of broader business objectives