Data Engineer - ETL/Python

Technology | Argentina, Colombia, Uruguay


This is an excellent opportunity to join a multi-national company that provides IT solutions around the globe.
We believe our clients should be able to focus on growing their business while we take care of everything that happens under the hood.

We are looking for a Data Integration Engineer to rock it!
You will work on special projects with a brilliant team, helping our clients succeed.

The data engineering team builds and maintains data ingestion pipelines for various types of data, including transactional data, work orders, and telemetry. The existing pipelines are written in Jython and Python, with Python as the go-forward stack. In this role, you will be responsible for creating, maintaining, and testing Python-based data pipelines. You will provide insight and input as the architecture scales, focusing on high-volume data concerns such as memory utilization, batching, and the use of queueing and messaging systems.
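To give a flavor of the high-volume concerns mentioned above, here is a minimal, hypothetical sketch of memory-conscious batching in a Python ingestion script. All names (`batched`, `ingest`) are illustrative, not part of any actual codebase:

```python
from itertools import islice
from typing import Iterable, Iterator, List

def batched(records: Iterable[dict], batch_size: int = 500) -> Iterator[List[dict]]:
    """Yield fixed-size batches so the full dataset never sits in memory."""
    it = iter(records)
    while batch := list(islice(it, batch_size)):
        yield batch

def ingest(records: Iterable[dict]) -> int:
    """Hypothetical pipeline stage: process each batch downstream, return row count."""
    total = 0
    for batch in batched(records, batch_size=500):
        # In a real pipeline, each batch would be published to a queue
        # (e.g. Kafka) or bulk-inserted into the target store.
        total += len(batch)
    return total
```

Because `batched` is a generator, the script can stream arbitrarily large inputs with bounded memory, which is the shape of problem this role deals with daily.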

Responsibilities:
Coordinating with development and data teams to determine data ingest requirements.
Writing scalable code in Python.
Testing and debugging ETL jobs (StreamSets), data scripts, and related applications.
Assessing and prioritizing client feature requests.
Integrating data storage solutions.
Coordinating with application developers.
Reworking existing databases, data storage, and data ingest processes to improve functionality.
Developing Python ingestion scripts.


Requirements:
3+ years of experience as a Python developer focused on data engineering.
Expert knowledge of Python, ML, and data-related frameworks and libraries.
A proven ability to manage and debug high-volume data structures.

Nice to Have:
Bachelor's degree in Computer Science, Computer Engineering, or related field.
Experience with Kafka or similar messaging and queueing systems.
Experience with Jython.
Experience with data ingest and management tools such as Azure Data Factory, AWS Glue, Azure Event Hubs, and data connectors.