Data Engineer - 2415

Software Development/Engineering Brno, Czechia


Description

Why YOU want this position:  
Enverus is the leading energy SaaS company delivering highly technical insights and predictive/prescriptive analytics that empower customers to make decisions that increase profit. Enverus’ innovative technologies drive production and investment strategies, enable best practices for energy and commodity trading and risk management, and reduce costs through automated processes across critical business functions. Enverus is a strategic partner to more than 6,000 customers in 50 countries.   
 
We're looking for a backend-focused engineer to help build data-driven applications for Power and Renewables. If you are interested in building APIs and complex data pipelines, with end-to-end ownership of the features you build, alongside a team of some of the best engineers in the energy industry, then you will love working at Enverus. We are looking for a developer with natural curiosity, preferably with full-stack experience as well, and a willingness to pick up additional skills on the job!
 
The Team:  
You will join the Acuity Team within the Power and Renewables business unit, which builds our project tracking, land usability analysis, and electricity price tracking and analysis tools. This team pairs gigantic datasets with an interactive SaaS UI that delivers results in real time. Come join the team and help pioneer Enverus' mission to provide a complete solution for power and renewable land and energy analytics!

 
 
What you will do:  
  • Create data flows and transformations with AWS Databricks (Python) (an illustrative sketch follows this list)
  • Design and query databases and big-data systems
  • Design data-heavy APIs and pipelines while maintaining responsiveness
  • Participate in end-to-end ownership of the products you work on
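
For a sense of the day-to-day work, here is a minimal, purely illustrative PySpark sketch of the kind of data transformation described above. All table and column names (raw_projects, project_id, capacity_mw, state, status, analytics.project_capacity_summary) are hypothetical placeholders, not Enverus schemas.

```python
# Illustrative only: a small PySpark data flow of the kind built on Databricks.
# Table and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("project-tracking-etl").getOrCreate()

# Read a hypothetical raw source table of renewable projects.
raw = spark.read.table("raw_projects")

# Clean the data: deduplicate projects and normalize the capacity column.
cleaned = (
    raw
    .dropDuplicates(["project_id"])
    .withColumn("capacity_mw", F.col("capacity_mw").cast("double"))
    .filter(F.col("capacity_mw").isNotNull())
)

# Aggregate capacity and project counts by state and project status.
summary = (
    cleaned
    .groupBy("state", "status")
    .agg(
        F.sum("capacity_mw").alias("total_capacity_mw"),
        F.count("project_id").alias("project_count"),
    )
)

# Write the aggregate back out for downstream APIs and dashboards.
summary.write.mode("overwrite").saveAsTable("analytics.project_capacity_summary")
```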

What you should have:
 
  • Bachelor’s degree in computer science or a related field
  • A minimum of 2 years of experience as a software developer
  • Experience working with relational databases, source control, and CI/CD
  • Experience with Python and Spark data transformations
  • Excellent communication skills and an advanced level of English
  • The ability to write clean, maintainable code following best practices

 

Our Tech Stack: AWS (provisioned with Terraform), Databricks, Python, Spark, MS SQL