Senior Data Engineer

Remote, Brazil


Description

Company Overview
Lean Tech is a rapidly expanding organization based in Medellín, Colombia. We pride ourselves on having one of the most influential networks in software development and IT services for the entertainment, financial, and logistics sectors. Our growth trajectory offers professionals a multitude of opportunities to elevate their careers and achieve substantial professional growth. Joining our team means engaging with expansive engineering teams across Latin America and the United States, contributing to cutting-edge developments in multiple industries.
Currently, we are seeking a Senior Data Engineer to join our team. Here are the challenges that our next warrior will face and the requirements we look for: 
Position Title: Senior Data Engineer
Seniority: Senior
Location: Remote, Brazil

Position Overview
We are seeking an experienced Senior Data Engineer to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data pipelines, working with big data technologies, and leveraging AWS services for data processing, storage, analytics, and messaging. Experience with microservices architecture is essential for this role.
 
Key Responsibilities
  • Design, implement, and maintain scalable ETL pipelines using AWS services such as Glue, EMR, and Lambda, ensuring data reliability and efficiency.
  • Optimize and manage data models for PostgreSQL in RDS and Amazon Redshift, ensuring performance and reliability.
  • Implement robust data quality checks and real-time monitoring systems to maintain data integrity across all platforms.
  • Design and implement event-driven architectures utilizing SNS and SQS for real-time data processing and integration.
  • Collaborate with data scientists and developers to understand data requirements, providing efficient access solutions and optimized queries for PostgreSQL databases.
  • Develop and maintain comprehensive data documentation, including data dictionaries and architectural diagrams.
  • Implement and secure data lakes on Amazon S3, ensuring seamless integration with other AWS services.
  • Optimize existing data workflows to enhance performance, scalability, and cost-efficiency, including RDS configurations.
  • Establish and manage CI/CD pipelines using GitHub Actions for data workflows, including database schema changes.
  • Mentor junior data engineers, promoting best practices in data engineering and microservices architectures.
  • Provide technical guidance in designing and implementing data-centric microservices using .NET 6 and above, ensuring scalability and performance.
  • Utilize technologies like Kafka and Kinesis for real-time data processing and streaming.
  • Plan and execute database migrations using AWS DMS, ensuring minimal downtime and data integrity.
  • Stay current with emerging data engineering trends and technologies, fostering innovation within the team.

Required Skills & Experience
 
  • Bachelor's degree in Computer Science, Data Science, Engineering, or a related field
  • Minimum 5 years of experience in data engineering, with a focus on AWS and .NET technologies
  • Advanced proficiency in .NET development (version 6 and above) and Entity Framework Core
  • Extensive experience with AWS data services, including RDS, Glue, EMR, Redshift, Athena, S3, SNS, SQS, and DMS
  • Advanced skills in managing and optimizing PostgreSQL databases within AWS RDS, including performance tuning and query optimization
  • Expertise in designing and implementing microservices architectures, particularly for data-intensive applications
  • Proficiency in developing robust and scalable APIs using .NET and working with API gateways
  • Experience with event-driven architectures and message queuing systems, especially SNS and SQS
  • Strong knowledge of data modeling and designing efficient schemas for various data stores, including PostgreSQL databases and data warehouses
  • Intermediate proficiency with Git and CI/CD practices for data workflows, including database schema management using GitHub Actions
  • Knowledge of data governance, security, and compliance best practices, particularly with AWS services
  • Strong skills in problem-solving and optimizing complex data workflows
  • Strong communication skills to effectively translate technical concepts to non-technical stakeholders
  • Experience with container technologies such as Docker and orchestration platforms like Kubernetes is beneficial

Nice to Have Skills
  • Experience with real-time data processing and streaming technologies, such as Kafka or Kinesis
  • Familiarity with machine learning workflows and MLOps practices
  • Understanding of domain-driven design (DDD) principles as applied to microservices
  • Knowledge of data visualization tools, such as Tableau or Power BI, and their integration with AWS services
  • Relevant certifications, such as AWS Certified Data Analytics - Specialty or AWS Certified Database - Specialty

Why You Will Love Working with Us
  • Join a powerful tech workforce and help us change the world through technology
  • Professional development opportunities with international customers
  • Collaborative work environment
  • Career path and mentorship programs that will help you reach the next level
Join Lean Tech and contribute to shaping the data landscape within a dynamic and growing organization. Your skills will be honed, and your contributions will play a vital role in our continued success. Lean Tech is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.