Senior Data Engineer

Engineering | Indianapolis, IN / Remote, United States


Description

Position at Jobvite

Who We Are: 

Jobvite offers a marketing-inspired approach to recruiting by combining the power of data and the human touch. We enable recruitment teams to personalize the candidate experience with a full-scale talent platform that accelerates and simplifies the entire talent acquisition process, from the moment the search begins to the celebration of the first promotion and beyond. We make it possible to automate communication through bot-powered texts and anticipate the employee journey with internal mobility and referral tools driven by human insight and assisted by AI. With a comprehensive, talent acquisition-focused suite, Jobvite has been serving customers since 2006, including Dollar Shave Club, Dunkin' Brands, OpenTable, LinkedIn, Zappos, Universal Music Group, Wayfair, Zillow, and Feeding America.

What you'll do:

  • Translate business requirements into technical specifications.
  • Participate in all design reviews and requirement sessions, as required.
  • Understand database design, programming concepts, data modeling, and framework management.
  • Communicate ideas to both technical and non-technical people in all levels of the organization.
  • Create or update technical documentation for transition to support teams, including data flows and transformations.
  • Design, develop, and test data pipeline solutions and automate data loading processes.
  • Develop and implement an efficient migration process to move data pipeline objects from development to test and production environments.
  • Analyze data requirements, complex source data, and application data models, and determine the best methods for integrating data to support internal and external analytical needs.
  • Design and implement data models to support reporting, dashboarding, and integration needs.
  • Develop automated data audit, testing, and validation processes.
  • Stay up to date on ever-evolving technologies and processes for managing data pipelines.

What you'll bring: 
  • 5+ years of data and/or software engineering experience.
  • Expert SQL skills.
  • Experience with designing and developing complex data flows.
  • Experience in creating data models based on complex business entities.
  • Experience with large scale, complex data migration efforts.
  • Experience with loading data from internal/external APIs.
  • Experience developing streaming data flows (e.g., Kafka, Beam, AWS SQS, Spark Streaming).
  • Experience with public cloud solutioning (e.g., AWS, Azure, GCP).
  • Experience with public cloud analytical database solutions (e.g., Snowflake, Redshift).
  • Excellent analytical skills.
  • Experience with shell scripting.
  • Flexibility and adaptability, especially when working as part of a team.
  • Preferred Experience:
    • Experience with Java, Ruby, or Python.
    • Experience with machine learning.
    • Experience with BI and reporting tools (e.g., Looker, Tableau, Power BI, Qlik).
    • Experience delivering solutions based on Agile principles.
    • Experience with Kubernetes and containers.
