Senior Data Engineer

Corporate Opportunities | Folsom, California


Description

Salary Range: $102,900.00 - $128,000.00
Exact compensation may vary based on skill, experience, and location.
Why SAFE?
SAFE offers so much more than just full medical, vision, dental, 401k matching, HSA, and FSA! Learn more about how we support our workforce!
  • Professional Development Opportunities: Offering training programs, workshops, and mentorship.
  • Recognition and Appreciation: Regularly acknowledging employee achievements and contributions.
  • Flexible Work Arrangements: Providing options for remote work and flexible scheduling.
  • Positive Company Culture: Fostering an inclusive, collaborative, and supportive work environment.
  • Career Growth: Clear paths for career advancement and internal promotions.
  • Work-Life Balance: Encouraging a healthy balance between professional and personal life.
  • Employee Empowerment: Allowing employees to make decisions and have autonomy in their roles.
  • Space of Belonging: ERGs, YOUnity Council, and a focus on diversity, equity, inclusion, and belonging.
  • Wellness Programs: Promoting physical and mental health through wellness initiatives and resources.
  • Strong Leadership: Having leaders who inspire, support, and guide their teams effectively.
  • Sense of Purpose: Creating a sense of mission and aligning company goals with employees' personal values.
ESSENTIAL FUNCTIONS AND BASIC DUTIES
  • Enforce data engineering standards, principles, and practices.
  • Design, build, deploy, automate, and maintain end-to-end data pipelines for new and existing data sources and targets utilizing modern ETL/ELT tools and practices, including stream processing technologies where appropriate.
  • Demonstrate problem-solving ability that enables the team to resolve issues in a timely and effective manner.
  • Drive and complete project deliverables within the data engineering & management area according to project plans.
  • Utilize in-depth technical expertise regarding data models, data analysis and design, master data management, metadata management, reference data management, data warehousing, business intelligence, and data quality improvement.
  • Influence internal clients to leverage standard capabilities and make data-driven decisions.
  • Work with internal technical resources to optimize the data Lakehouse through hardware or software upgrades or enhancements.
  • Design and implement data models that balance performance, flexibility, and ease of use, considering both analytical and operational needs.
  • Enable and support self-service analytics by designing intuitive data models and views, collaborating with the Business Intelligence team to ensure data is easily accessible and interpretable for business partners.
  • Work with vendors to troubleshoot and resolve system problems, providing on-call support as required.
  • Manage and automate the deployment of upgrades, patches, and new features across the data infrastructure, ensuring minimal disruption to data services and maintaining system integrity.
  • Conduct code reviews and approvals for data pipelines developed and implemented by the team.
  • Ensure compliance across all data Lakehouse administration activities.
  • Design and manage implementation of data models to meet user specifications, while adhering to prescribed standards.
  • Manage and collect business metadata and data integration points.
  • Coordinate with business analysts and prepare data design for systems; analyze user requirements; prepare technical design specifications to address user needs.
  • Develop and implement comprehensive testing strategies, including automated unit, integration, and end-to-end tests, to ensure the accuracy, reliability, and performance of data pipelines and procedures.
  • Provide technical support and coordination during Lakehouse design, testing, and movement to production.
  • Enforce standards and procedures to ensure data is managed consistently and properly integrated within the Lakehouse.
  • Create and maintain thorough, up-to-date documentation for all data engineering projects, processes, and systems, adhering to organizational standards and leveraging modern documentation tools and practices.
  • Implement business rules via coding, stored procedures, middleware, or other technologies, ensuring scalability and maintainability of implemented solutions.
  • Analyze processes in specialty areas to isolate and correct problems and improve workflow.
  • Implement and maintain robust data quality assurance processes, including automated checks and balances, to ensure the integrity, accuracy, and reliability of data across all stages of processing and storage.
  • Maintain an awareness of data management and business intelligence trends, products, technical advances, and productivity tools that apply to the SAFE environment through vendor and third-party classes, self-study, and publications.
  • Establish, document, and enforce coding standards, best practices, and architectural guidelines for the data engineering team, promoting consistency, efficiency, and maintainability in all data solutions.
  • Complete other duties, as assigned.
 
QUALIFICATIONS
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. 
 
Education/Experience:
Bachelor's degree in Computer Science, Data Science, Engineering, or a related technical field. Master's degree is preferred but not required. Minimum of 7 years of professional experience in data engineering, with at least 3 years in a senior or lead role. Equivalent combination of education and experience may be considered.
Demonstrated experience with:
  • Designing and implementing large-scale data pipelines and ETL processes
  • Working with cloud-based data platforms (e.g., AWS, Azure, GCP)
  • Implementing and maintaining data lakes and data warehouses
  • Using modern big data technologies (e.g., Spark, Hadoop, Kafka)
  • Applying data governance and security practices in a regulated environment
Commitment to continuous learning and staying updated with the latest data engineering technologies and best practices.
 
Certificates, licenses, registrations:
Certifications in the following areas are preferred but not required:
  • Data Engineering Certifications:
    • Google Certified Professional Data Engineer
    • AWS Certified Data Analytics – Specialty
    • Azure Data Engineer Associate
    • Cloudera Certified Professional (CCP) Data Engineer
  • Modern Data Warehouse Architecture Certifications:
    • Databricks Certified Professional Data Engineer
    • SnowPro Advanced: Data Engineer Certification
    • Microsoft Certified: Azure Data Engineer Associate
  • Advanced SQL Developer Certifications:
    • Microsoft Certified: Azure Database Administrator Associate
    • Oracle PL/SQL Developer Certified Associate
    • Databricks Certified Associate Developer for Apache Spark
    • Google Cloud Certified - Professional Cloud Database Engineer
    • AWS Certified Database – Specialty
  • Advanced Programming and ETL Development Certifications:
    • Python Institute PCPP – Certified Professional in Python Programming
    • RStudio Certified Professional Data Scientist
    • Informatica Certified Professional or Talend Certified Developer
    • AWS Certified Developer - Associate
  • Data Governance and Security Certifications:
    • ISACA Certified Information Systems Auditor (CISA)
    • Certified in Data Protection (CIPT)
    • Certified Information Systems Security Professional (CISSP)
  • Agile and Project Management Certifications (beneficial for cross-functional collaboration):
    • PMI Agile Certified Practitioner (PMI-ACP)
    • Certified Scrum Master (CSM)
 
Technical Skills:        
  • Programming Languages: 
    • Advanced proficiency in Python and SQL 
    • Proficiency in Java or Scala 
    • Familiarity with R for statistical computing 
  • Cloud Platforms: 
    • Strong experience with at least one major cloud platform (AWS, Azure, or GCP) 
    • Understanding of cloud-native architectures and services 
  • Data Warehousing and Lakes: 
    • Experience with modern data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery) 
    • Familiarity with data lake architectures and technologies (e.g., Delta Lake, Apache Hudi) 
  • ETL/ELT and Data Pipelines: 
    • Proficiency in designing and implementing scalable data pipelines 
    • Experience with ETL/ELT tools (e.g., Apache Airflow, AWS Glue, Databricks) 
  • Database Systems: 
    • Strong knowledge of both relational (e.g., PostgreSQL, Oracle) and NoSQL (e.g., MongoDB, Cassandra) databases 
    • Experience with database optimization and performance tuning 
  • Data Modeling: 
    • Proficiency in dimensional modeling and data warehouse design 
    • Experience with data modeling tools 
  • Version Control and CI/CD: 
    • Proficiency with Git and GitHub/GitLab 
    • Experience with CI/CD pipelines for data projects 
  • Container Technologies: 
    • Familiarity with Docker and container orchestration (e.g., Kubernetes) 
  • Data Governance and Security: 
    • Understanding of data governance principles and practices 
    • Knowledge of data security and privacy best practices 
  • Machine Learning Operations (MLOps): 
    • Familiarity with MLOps practices and tools 
  • Agile Methodologies: 
    • Experience working in Agile environments (e.g., Scrum, Kanban) 
    • Proficiency with project management tools (e.g., Jira, Confluence) 
  • Data Visualization: 
    • Basic proficiency with data visualization tools (e.g., Power BI, Tableau) 
 
 
Other Skills and Abilities:
  • Ability to effectively manage multiple projects and service requests.
  • Ability to work across the organization and build relationships with other team members, business partners, and stakeholders.
  • Excellent organizational, analytical, and problem-solving skills.
  • Must be a self-starter, able to work without constant supervision.
  • Strong written and verbal communication skills.
  • Strong investigation, remediation, and reporting intuition.
  • Demonstrated ability to work in a team and collaborative environment.
  • Ability to build relationships with other leaders, business partners, and stakeholders.
  • Must be customer-service oriented to provide the highest level of customer satisfaction.
 
 
 
 
INTENT AND FUNCTION OF JOB DESCRIPTIONS
 
This is not necessarily an all-inclusive list of job-related responsibilities, duties, skills, efforts, requirements or working conditions.  All descriptions have been reviewed to ensure that only essential functions and basic duties have been included.  Peripheral tasks, only incidentally related to each position, have been excluded.  Requirements, skills, and abilities included have been determined to be the minimal standards required to successfully perform the positions.  While this is intended to be an accurate reflection of the current job, management reserves the right to revise the job or to require that other or different tasks be performed as assigned.
 
In accordance with the Americans with Disabilities Act, it is possible that requirements may be modified to reasonably accommodate disabled individuals.  However, no accommodations will be made which may pose serious health or safety risks to the employee or others or which impose undue hardships on the organization.
 
Job descriptions are not intended as and do not create employment contracts.  The organization maintains its status as an at-will employer.  Employees can be terminated for any reason not prohibited by law.