Data Engineer

Engineering | Remote, United States | Indianapolis, IN


Description

Position at Jobvite

Who We Are: 

Jobvite offers a marketing-inspired approach to recruiting by combining the power of data with the human touch. We enable recruitment teams to personalize the candidate experience with a full-scale talent platform that accelerates and simplifies the entire talent acquisition process, from the moment the search begins to the celebration of the first promotion and beyond. We make it possible to automate communication through bot-powered texts and to anticipate the employee journey with internal mobility and referral tools driven by human insight and assisted by AI. With its comprehensive, talent acquisition-focused suite, Jobvite has been serving customers since 2006, including Dollar Shave Club, Dunkin' Brands, OpenTable, LinkedIn, Zappos, Universal Music Group, Wayfair, Zillow, and Feeding America.

What you'll do:

  • Translate business requirements into technical specifications.
  • Participate in all design reviews and requirements sessions, as required.
  • Understand database design, programming concepts, data modeling, and framework management.
  • Communicate ideas to both technical and non-technical people at all levels of the organization.
  • Create or update technical documentation for transition to support teams, including data flows and transformations.
  • Design, develop, and test data pipeline solutions and automate data loading processes.
  • Develop and implement an efficient migration process to move data pipeline objects from development to test and production environments.
  • Analyze data requirements, complex source data, and application data models, and determine the best methods for integrating data to support internal and external analytical needs.
  • Design and implement data models to support reporting, dashboarding, and integration needs.
  • Develop automated data audit, testing, and validation processes.
  • Stay up to date on ever-evolving technologies and processes for managing data pipelines.

What you'll bring: 
  • 5+ years of data and/or software engineering experience.
  • Expert SQL skills and experience with database ETL/ELT.
  • Development experience with Java, Ruby, and/or Python.
  • Experience with public cloud solutions (e.g., AWS, Azure, GCP).
  • Experience developing streaming data flows (e.g., with Kafka, Beam, AWS SQS, Spark Streaming).
  • Experience leading development efforts: gathering requirements, analyzing solutions, and executing.
  • Additional experience needed:
    • Designing and developing complex data flows.
    • Creating data models based on complex business entities.
    • Large-scale, complex data migration efforts.
    • Loading data from internal/external APIs.
    • Public cloud analytical database solutions (e.g., Snowflake, Redshift).
    • Shell scripting.
  • Excellent analytical skills and flexibility/adaptability when working as part of a team.
  • Preferred experience:
    • Experience with machine learning.
    • Experience with BI and reporting tools (e.g., Looker, Tableau, Power BI, Qlik).
    • Experience delivering solutions based on Agile principles.
    • Experience with Kubernetes/containers.

* This role can also be remote, based anywhere in the United States.
