Senior Data Engineer
Data Engineers here design, develop and maintain systems for data analysis, transformation, modelling and visualisation. We work directly with our data scientists to develop cutting-edge uses of the data we collect.
As a Senior Engineer, you’ll be the bridge between software engineering and data science.
- Help bring new theories and inventions to life and drive data innovation within Tenable
- Apply the latest real-time streaming technologies to process billions of transactions per day
- Work in a lean environment, building out the Tenable data platform using traditional data warehousing methodologies while incorporating the latest trends in big data
- Grow your abilities within the data ecosystem, learn from and teach other engineers, and drive a data-driven culture
- Explore and work with cutting-edge data processing technologies
- Work in a lean environment building out a state-of-the-art data science platform
- Develop and automate innovative algorithms for data analytics, machine learning and artificial intelligence
- Work with the latest and greatest tools from AWS — we're an AWS house
- Make an impact on the future of data processing, data science and cyber security
- Design and develop solutions that are core to the company's values and future
What You’ll Need:
- 5+ years of software engineering experience in Java, Python or Scala
- Strong experience designing and engineering data processing workflows, ideally with Apache Airflow
- Strong experience with big data technologies such as Spark, EMR, Hadoop, Flink, Beam, Kafka, Hive, Presto, Impala and Atlas
- Deep understanding of data storage technologies for structured and unstructured data
- Experience with Docker-based containerisation of software
- Experience with a cloud-based architecture such as AWS, GCP or Azure
- Experience using Linux as a primary development environment
- Strong drive for innovation matched with excellent communication and analytical skills
- BSc or MSc in Computer Science, Data Science or a directly related field
- Expert in Airflow, Spark, EMR, Kinesis and Kafka
- Expert in data warehouse modelling (Kimball/Inmon) and data lake design
- Strong experience with Docker and Kubernetes
- Strong experience with CI/CD technologies
- Strong AWS DevOps skills, including monitoring, deployment and infrastructure setup
- Experience working in Agile environments (Scrum/Kanban)
- Ability to switch seamlessly between the Python ecosystem and the JVM world
- Experience in a cyber threat or vulnerability-related area
If you've reached this point in the job description and still aren't sure whether you should apply… just do it! We know there are no perfect applicants. You may not tick 100% of the bullets listed above, and that's okay. If you're feeling like you won't fit in with our teams, that's not OK with us. We're One Tenable, which means that however you identify and whatever background you bring with you, we encourage you to submit an application if this is a role you can be passionate about doing every day.
We're committed to promoting Equal Employment Opportunity (EEO) at Tenable, in compliance with all equal employment opportunity laws and regulations at the international, federal, state and local levels.