Senior Data Engineer

Data Engineering | Remote, Romania


Description

Position at Ness Romania SRL

Why Ness

We know that people are our greatest asset. Our staff's professionalism, innovation, teamwork, and dedication to excellence have helped us become one of the world's leading technology companies. It is these qualities that are vital to our continued success. As a Ness employee, you will be working on products and platforms for some of the most innovative software companies in the world.

You'll gain knowledge working alongside other highly skilled professionals that will help accelerate your career progression.

You'll also benefit from an array of advantages, such as access to training and certifications, bonuses and allowances, social activities, and attractive compensation.

Requirements and responsibilities

What you'll do:

    • Develop ETL (Extract-Transform-Load) code for assigned projects;
    • Create builds/deployments and related artifacts for new development, enhancements, and break-fixes;
    • Produce process and data flow diagrams;
    • Write technical design documentation;
    • Follow coding standards;
    • Manage and use Git repositories;
    • Follow CI/CD procedures;
    • Request and perform code reviews;
    • Create and execute unit tests.

What you'll bring:

    • 5+ years of ETL / data engineering experience within DWH, data lake, or BI reporting systems.
    • 2+ years of experience developing parametrized, reusable Data Factory pipelines, with hands-on use of most ADF control flow activities (Get Metadata, Lookup, Copy, ForEach, Databricks, Stored Procedure, If, Web, etc.).
    • 2+ years of experience building a Lakehouse using Azure Databricks and open-source Delta Lake.
    • 2+ years of experience creating pipelines for ingestion, transformation, and custom requirements using Azure Data Factory and Azure Databricks.
    • Experience tuning SQL queries in Databricks, Azure SQL DB, or Azure Synapse (nice to have).
    • Experience creating generic pipelines and Data Flow mappings using Azure Data Factory.
    • Experience creating Python/SQL notebooks to transform and load data using Databricks.

Key skillsets:

    • Azure Cloud: Azure Databricks, Azure Data Lake Storage, Azure Data Factory;
    • Azure Data Engineer certification;
    • Nice to have: Azure Synapse; ETL tools such as SSIS (Visual Studio 2015), Informatica PowerCenter 10.1.1/9.1, or DataStage.

Not checking every single requirement?

If this role sounds good to you, we encourage you to apply even if you don't meet every single bullet point in the job description. For many of the candidates who applied, we found a role that was a very good fit for their skills.

Let's meet, and you may just be the right candidate for one of our roles.

At Ness Digital Engineering, we are committed to building a work culture based on diversity, inclusion, and authenticity.