Senior Software Engineer

Data & Analytics | Roseville, California, United States


Description

Position at Clear Capital

Our Data Team consists of experts in Business Intelligence, Data Engineering, Software Engineering, Data Management, Data Quality, and Systems Administration. We develop, maintain, and support Clear Capital’s operational and analytical data stores, which power our valuation platforms, machine learning outputs, and various other data products. We are searching for a highly skilled, test-driven software engineer experienced with big data tools and serverless architecture. This role includes analyzing and modifying existing software as well as designing new applications and data pipelines that run at scale. You will rally with our existing highly skilled team to design, build, test, and deploy great software to production.

Who We Are Looking For

Individuals must possess excellent problem-solving skills, communicate clearly and confidently with technical and business stakeholders, and contribute to the growth of our self-serve data platform.

  • Minimum of five (5) years of experience with AWS cloud technologies (e.g., EMR, S3, EC2, Redshift, QuickSight, Snowflake, Athena, DynamoDB, Elastic, RDS, Lambda, API Gateway, SQS, Kinesis, Data Pipeline, Glue)
  • Minimum of five (5) years of experience authoring data pipelines supporting data services, analytics, and modeling (e.g., with Apache Airflow, Step Functions, or your own work and projects)
  • Minimum of five (5) years of experience deploying CI/CD pipelines (Bitbucket Pipelines, GitHub Actions)
  • Minimum of five (5) years of experience with relational database technologies (PostgreSQL, MySQL)
  • Minimum of five (5) years of experience working with RESTful or SOAP APIs
  • Minimum of three (3) years of experience coding in Python
  • Minimum of one (1) year of experience with Apache Spark
  • Minimum of one (1) year of experience working in an Agile Scrum environment
  • Minimum of one (1) year of experience with Service-Oriented Architecture

What Your Role Will Be

  • Architect and develop ETL data processors at scale with AWS Glue and PySpark (see the first sketch after this list)
  • Develop serverless Python APIs on AWS Lambda, accessed through API Gateway (see the second sketch after this list)
  • Aid in the modeling of schemas stored in S3 and accessed by AWS Athena and Spark
  • Develop change data capture routines with Delta Lake to serve correct, up-to-date data
  • Create and maintain internal frontend tools written in React
  • Aid in the creation of state machine logic with AWS Step Functions
  • Write and maintain useful documentation pertaining to the software
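
To make the first bullet concrete, here is a minimal PySpark sketch of the kind of ETL job it describes: read raw CSV from S3, apply a simple cleanup, and write partitioned Parquet back to S3. The bucket paths and column names are hypothetical, and the Glue-specific job wrapper (GlueContext, job bookmarks) is left out for brevity.

    # Minimal PySpark ETL sketch. Paths and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-valuations-etl").getOrCreate()

    # Read raw CSV data from a hypothetical S3 bucket.
    raw = (
        spark.read
        .option("header", "true")
        .csv("s3://example-raw-bucket/valuations/")
    )

    # Cast types, parse dates, and drop rows missing required fields.
    cleaned = (
        raw
        .withColumn("valuation_amount", F.col("valuation_amount").cast("double"))
        .withColumn("valuation_date", F.to_date("valuation_date"))
        .dropna(subset=["property_id", "valuation_amount"])
    )

    # Write the result back to S3 as Parquet, partitioned by date,
    # so it can be queried with Athena or Spark.
    (
        cleaned.write
        .mode("overwrite")
        .partitionBy("valuation_date")
        .parquet("s3://example-curated-bucket/valuations/")
    )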
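
The second bullet can be sketched just as briefly: a Python Lambda handler behind an API Gateway Lambda proxy integration. The path parameter and response payload are hypothetical; a real handler would query DynamoDB, Athena, or RDS rather than return a stub.

    # Minimal AWS Lambda handler sketch for an API Gateway proxy integration.
    # The path parameter and response payload are hypothetical.
    import json

    def handler(event, context):
        # API Gateway's Lambda proxy integration passes path parameters here.
        property_id = (event.get("pathParameters") or {}).get("property_id")
        if not property_id:
            return {
                "statusCode": 400,
                "headers": {"Content-Type": "application/json"},
                "body": json.dumps({"error": "property_id is required"}),
            }

        # A real service would look the property up in DynamoDB, Athena, or RDS.
        payload = {"property_id": property_id, "status": "ok"}
        return {
            "statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps(payload),
        }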

About Us

Clear Capital is the premier provider of real estate valuation, analytics, and technology solutions. Powered by more than 45 years’ worth of information on nearly every U.S. metro, neighborhood, and property, Clear Capital’s solutions are trusted by community credit unions and billion-dollar financial institutions alike. Clear Capital is headquartered in Reno-Tahoe with a team of more than 500 nationwide, dedicated to going wherever it leads and doing whatever it takes.

To all recruitment agencies: Clear Capital does not accept agency resumes. Please do not forward resumes to our jobs alias, Clear Capital employees, or any other company location. Clear Capital is not responsible for any fees related to unsolicited resumes.