Data Warehouse Developer
Description
Why NESS
We know that people are our greatest asset. Our staff's professionalism, innovation, teamwork, and dedication to excellence have helped us become one of the world's leading technology companies. It is these qualities that are vital to our continued success. As a Ness employee, you will be working on products and platforms for some of the most innovative software companies in the world.
You'll gain knowledge working alongside other highly skilled professionals who will help accelerate your career progression.
You'll also benefit from an array of advantages, including access to training and certifications, bonuses and benefits, social activities, and attractive compensation.
Requirements and responsibilities
What you'll do:
- Design, develop, and maintain scalable data warehouse solutions (Amazon Redshift & RDS) to meet business and analytics needs.
- Build and optimize ETL/ELT pipelines for data extraction, transformation, and integration using AWS services like Lambda, Glue, S3, and Kinesis.
- Collaborate with software engineering teams to design robust dimensional data models, data marts, and data lake solutions.
- Develop and optimize stored procedures for data transformation and processing within the data warehouse.
- Ensure high data quality and performance by leveraging continuous integration tools and best practices.
- Implement cloud-based infrastructure, focusing on scalable, secure, and cost-efficient AWS architectures.
- Monitor and troubleshoot data workflows to ensure stability, reliability, and performance.
- Stay updated with the latest technologies and methodologies to improve efficiency and productivity.
Key skillsets:
- 5+ years of professional experience in data engineering with hands-on expertise in data warehouse technologies, especially Amazon Redshift or similar.
- Proficiency with AWS cloud services (e.g., Lambda, S3, Kinesis, Glue, RDS, Redshift).
- Strong knowledge of dimensional data modeling and experience designing data marts and OLAP systems.
- Experience with stored procedures and their use for data transformations and optimizations within databases.
- Experience with serverless computing and event-driven architectures using AWS Lambda.
- Strong knowledge of ETL/ELT pipeline development and data integration best practices.
- Experience with CI/CD systems (e.g., Jenkins) and version control tools (e.g., Git, Bitbucket).
- Proficiency with infrastructure-as-code tools like Terraform.
- Excellent problem-solving, analytical, and communication skills.
- Strong experience with SQL and Python (or Java) for data processing.
Nice to have:
- Experience with Machine Learning Operations (MLOps) and integrating ML models into production data pipelines.
- Familiarity with data streaming technologies like Apache Kafka or AWS Kinesis.
- Knowledge of DevOps principles and cloud infrastructure management.
Not checking every single requirement?
If this role sounds good to you, even if you don't meet every single bullet point in the job description, we encourage you to apply anyway. For many of the candidates who applied, we found a role that was a great fit for their skills.
Let's meet and you may just be the right candidate for one of our roles.
At Ness Digital Engineering we are committed to building a work culture based on diversity, inclusion, and authenticity.