Big Data Engineer
SentinelOne was founded in 2013 by an elite group of cybersecurity and defense experts.
In our R&D offices in TLV, we develop a next-gen endpoint and server protection software product that uses several layers of protection, including behavioral analysis (with ML on the collected data), anti-exploitation, traps, and more, to stop zero-day attacks that other vendors simply can't.
It also provides unparalleled threat visibility with minimal system impact.
What are we looking for?
People who look at the world differently, who explore, analyze, and lead the solution they envision.
People who are passionate about making sense of huge amounts of data, driving decisions, and creating new ways to look at and develop our product.
If you are an engineer who loves algorithms and is interested in Machine Learning and Big Data technologies, if you are passionate about developing an innovative product and making an impact, and if you want to work with the best and be part of a company on its way to huge success, come join us for the ride of your life!
What will you do?
You will develop a large-scale, cloud-based system and build infrastructure for the data pipeline. The challenge is to build an efficient, large-scale, cross-continent data pipeline for Sentinel's machine learning and data ingest streams. We are specifically looking for a top-notch software engineer who can build a robust, production-grade system and who also has a passion for big data.
What experience or knowledge should you bring?
- 3+ years of backend development experience working with one or more of the following programming languages: Java, Python, Scala.
- Experience in building large-scale distributed systems in the cloud.
- Experience implementing big data pipelines using Spark, Flink, Kafka, and Presto.
- BSc in Computer Science / Software Engineering, or equivalent experience from an elite army unit.
- Experience with AWS is an advantage.