Data Engineer

Corporate Opportunities Folsom, California


Description

Salary Range: $89,400.00 - $111,800.00

THIS POSITION REQUIRES A LOCAL CANDIDATE - HYBRID WORK ARRANGEMENT
Why SAFE?
SAFE offers so much more than just full medical, vision, dental, 401k matching, HSA, and FSA! Learn more about how we support our workforce!
  • Professional Development Opportunities: Offering training programs, workshops, and mentorship.
  • Recognition and Appreciation: Regularly acknowledging employee achievements and contributions.
  • Flexible Work Arrangements: Providing options for remote work and flexible scheduling.
  • Positive Company Culture: Fostering an inclusive, collaborative, and supportive work environment.
  • Career Growth: Clear paths for career advancement and internal promotions.
  • Work-Life Balance: Encouraging a healthy balance between professional and personal life.
  • Employee Empowerment: Allowing employees to make decisions and have autonomy in their roles.
  • Space of Belonging: ERGs, YOUnity Council, and a focus on diversity, equity, inclusion, and belonging.
  • Wellness Programs: Promoting physical and mental health through wellness initiatives and resources.
  • Strong Leadership: Having leaders who inspire, support, and guide their teams effectively.
  • Sense of Purpose: Creating a sense of mission and aligning company goals with employees' personal values.
POSITION PURPOSE
The Data Engineer designs, develops, implements, and maintains robust data pipelines and data management processes to support the organization's data infrastructure. This role is crucial in ensuring efficient data movement from source to destination, applying necessary transformations and integrations along the way, while adhering to business-defined cadences.
 
Key responsibilities include:
  • Designing and implementing scalable ETL/ELT processes
  • Ensuring data quality, governance, compliance, and security throughout the data lifecycle
  • Developing and optimizing data models to support analytics and reporting needs
  • Collaborating with cross-functional teams to understand and meet data requirements
  • Continuously improving data processes for efficiency and scalability
 
The Data Engineer plays a pivotal role in enabling data-driven decision making across the organization by providing reliable, timely, and accurate data. This position requires a strong understanding of data architecture principles, proficiency in data processing technologies, and the ability to work in an agile, fast-paced environment. This requires that the Data Engineer stay current with the latest advancements in data engineering and actively contribute to the continuous improvement of the data infrastructure.
 
ESSENTIAL FUNCTIONS AND BASIC DUTIES
  • Implement and adhere to standards, data engineering principles and practices, ensuring the application of quality, governance, compliance, and security guidelines throughout all data processes.
  • Gather and generate data pipeline requirements, translate them into specifications, and create data flows.
  • Guide and develop end-to-end data flow pipelines per requirements, utilizing modern programming languages (e.g., Python), advanced SQL, and ETL/ELT tools and practices.
  • Implement automated testing frameworks for continuous integration and deployment of data pipelines.
  • Design and implement data quality checks and monitoring processes throughout the data pipeline to ensure data integrity and reliability.
  • Perform comprehensive quality assurance, including stress testing, unit testing, and peer code reviews, to validate the accuracy and efficiency of data flows and pipelines.
  • Automate and schedule data pipelines, and administer data pipeline change management.
  • Demonstrate strong problem-solving abilities by diagnosing, troubleshooting, debugging, and fixing data pipelines, while continuously monitoring, supporting, and maintaining operational stability of existing data infrastructure. 
  • Create and maintain comprehensive documentation both within and outside the code, including data flow diagrams, data dictionaries, schemas, and architectural diagrams, while effectively managing code repositories and version control in a collaborative environment.
  • Capture and integrate accurate technical metadata, data catalog entries, and data lineage.
  • Collaborate with data scientists and data analysts to develop and deploy machine learning pipelines, ensuring smooth integration with existing data infrastructure.
  • Participate in data architecture design discussions, contributing insights on scalability, performance, and best practices.
  • Collaborate with data governance and security teams to implement and maintain data policies throughout the data lifecycle.
  • Act as a liaison between various stakeholders including architects, senior data engineers, data analysts, data stewards, and business analysts, providing technical guidance and fostering coordination among internal and external teams.
  • Stay abreast of technological advancements, emerging products, and productivity tools relevant to SAFE's environment, while effectively communicating these insights to collaborate with team members and stakeholders.
  • Complete other duties, as assigned. 
 
QUALIFICATIONS
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. 
 
Education/Certification:
Master’s degree in computer science, engineering, computer applications, information technology, or data science and a minimum of three (3) years of related data engineering and data management experience; or bachelor’s degree in a related field and a minimum of five (5) years of related data engineering and data management experience; or an equivalent combination of education and experience.
Relevant degree fields include, but are not limited to: Computer Science, Data Science, Information Systems, Software Engineering, or closely related technical disciplines.
Demonstrated commitment to ongoing professional development and staying current with emerging data technologies and practices is essential.
 
Certificates, licenses, registrations: 
 
Certifications in the following areas are preferred:
  • Certified Data Engineer or equivalent cloud data engineering certification
  • Certified programmer/developer (e.g., Python, Java, R, ETL development)
  • Certified SQL developer (e.g., Microsoft SQL Server, Oracle PL/SQL, Spark SQL)
 
Technical Skills:        
  • Programming Languages 
    • Advanced object-oriented programming in languages such as Python, JavaScript, and R
    • Proficiency in Java or Scala  
    • Familiarity with R for statistical computing 
  • Cloud Platforms 
    • Strong experience with at least one major cloud platform (AWS, Azure, or GCP)  
    • Understanding of cloud-native architectures and services 
  • Data Warehouse and Lakes 
    • Experience with modern data warehousing solutions (e.g., Snowflake, Databricks, Google BigQuery)
    • Familiarity with data lake architectures and technologies (e.g., Delta Lake, Medallion Architecture) 
  • ETL/ELT and Data Pipelines 
    • Proficiency in designing and implementing scalable data pipelines 
    • Familiarity with orchestration tools like Apache Airflow
  • Database Systems 
    • Strong knowledge of both relational (e.g., SQL Server, Oracle) and NoSQL (e.g., MongoDB, Cassandra) databases  
    • Advanced SQL skills (e.g., Microsoft SQL Server T-SQL, Oracle PL/SQL)
    • Experience with database optimization and performance tuning 
  • Data Modeling 
    • Proficiency in dimensional modeling and data warehouse design 
    • Experience with data modeling tools 
  • API Design, Development, and Integration 
    • Understanding of RESTful and GraphQL API design principles  
    • Experience with API development and management tools  
    • Proficiency in consuming and integrating data from external APIs  
    • Experience in handling API authentication, rate limiting, and efficient data retrieval  
    • Skill in transforming API data into formats suitable for data pipelines and analytics 
  • Container Technologies 
    • Familiarity with Docker and container orchestration (e.g., Kubernetes) 
    • Service-oriented API design, best practices, and security
  • Agile Methodologies 
    • Experience working in Agile environments (e.g., Scrum, Kanban)  
    • Proficiency with project management tools (e.g., Jira, Confluence) 
  • Version Control and Continuous Integration/Continuous Deployment (CI/CD) 
    • Familiarity with CI/CD and DevOps tools like Jenkins
    • Proficiency with Git and GitHub/GitLab  
 
Other Skills and Abilities:
  • Communication and Collaboration:
    • Strong written and verbal communication skills
    • Ability to explain complex technical concepts to non-technical stakeholders
    • Demonstrated ability to work effectively in a team and collaborative environment
    • Skill in building relationships with team members, business partners, and stakeholders across the organization
  • Problem-Solving and Analytical Thinking:
    • Excellent analytical and problem-solving skills
    • Ability to approach complex data challenges with creative solutions
    • Strong attention to detail and commitment to data accuracy
  • Project and Time Management:
    • Ability to effectively manage multiple projects and service requests concurrently
    • Skill in prioritizing tasks and meeting deadlines in a fast-paced environment
  • Leadership and Mentoring:
    • Ability to work independently and mentor other developers
    • Demonstrated leadership in driving data-related initiatives
  • Business Acumen:
    • Understanding of business processes and ability to translate business requirements into technical solutions
    • Skill in data storytelling and presenting insights to business stakeholders
  • Adaptability and Learning:
    • Willingness to adapt to new technologies and methodologies
    • Commitment to continuous learning and staying updated with the latest trends in data engineering
  • Customer-Centric Approach:
    • Strong focus on internal customer satisfaction and meeting end-user needs
    • Ability to anticipate and proactively address data-related challenges
  • Ethical Data Handling:
    • Understanding of data privacy principles and ethical considerations in data management
    • Commitment to maintaining data security and confidentiality
 
 
 
INTENT AND FUNCTION OF JOB DESCRIPTIONS
 
This is not necessarily an all-inclusive list of job-related responsibilities, duties, skills, efforts, requirements or working conditions.  All descriptions have been reviewed to ensure that only essential functions and basic duties have been included.  Peripheral tasks, only incidentally related to each position, have been excluded.  Requirements, skills, and abilities included have been determined to be the minimal standards required to successfully perform the positions.  While this is intended to be an accurate reflection of the current job, management reserves the right to revise the job or to require that other or different tasks be performed as assigned.
 
In accordance with the Americans with Disabilities Act, it is possible that requirements may be modified to reasonably accommodate disabled individuals.  However, no accommodations will be made which may pose serious health or safety risks to the employee or others or which impose undue hardships on the organization.
 
Job descriptions are not intended as and do not create employment contracts.  The organization maintains its status as an at-will employer.  Employees can be terminated for any reason not prohibited by law.