Senior Big Data Admin - US Citizen Required

Engineering — San Jose, California; Atlanta, Georgia; Austin, Texas; New York City, New York; Seattle, Washington; Raleigh, North Carolina; Massachusetts; or any US metro area, United States


*** Must be a US citizen due to the nature of the projects assigned. ***

For over 10 years, Zscaler has been disrupting and transforming the security industry. Our 100% purpose-built cloud platform delivers the entire gateway security stack as a service through 150 global data centers, securely connecting users to their applications regardless of device, location, or network. We operate in over 185 countries, protect over 4,500 companies, and detect more than 100 million threats a day.

We work in a fast-paced, dynamic, make-it-happen culture. Our people are some of the brightest and most passionate in the industry and thrive on being the first to solve problems. We are always looking to hire highly passionate, collaborative, and humble people who want to make a difference.

Zscaler's data engineering team is seeking a Senior Big Data Administrator. As a member of our group, you'll have the rewarding opportunity to work with cutting-edge technologies to deliver and manage a platform foundational to next-generation security analytics.

Responsibilities/What You’ll Do:

  • Automate, deploy, and operate data pipelines
  • Implement facilities to monitor all aspects of the data pipelines
  • Administer and manage data in Spark and large-scale Hadoop environments, with an emphasis on automation
  • Troubleshoot and address operational issues as they arise
  • Develop tools to monitor workload on the Hadoop cluster and tune the cluster to improve data processing throughput
  • Support and improve the build, delivery, and deployment pipeline for software developed in Java, Scala, and Python

Qualifications/Your Background:

  • 5+ years of big data platform administration experience
  • Proficiency in data management and automation in Spark, Hadoop, and HDFS environments
  • Ability to interpret the various log files emitted by Hadoop and troubleshoot performance bottlenecks in the cluster
  • Strong scripting skills for automating tasks (Python/Shell)
  • Experience developing ETL pipelines
  • Experience using Spark SQL
  • Experience implementing and administering logging and monitoring tools such as Nagios and ELK
  • BS in Computer Science or a related field


  • Experience developing build and deployment automation
  • Experience managing source in git (GitHub ops, branching, merging, etc) a big plus
  • Experience in orchestration tools like  Ansible 
  • Experience with ETL tools like Airflow
  • Experience with SaaS operations
  • Experience in CI build tools such as Gradle and Jenkins 

Why Zscaler?

People who excel at Zscaler are smart, motivated, and share our values. Ask yourself: Do you want to work with the best talent in the industry? Do you want to work on disruptive technology? Do you thrive in a fluid work environment? Do you appreciate a company culture that enables individual and group success and celebrates achievement? If you said yes, we'd love to talk to you about joining our award-winning team.

Follow us on Twitter @zscaler. Additional information about Zscaler (NASDAQ: ZS) is available on our website. All qualified applicants will receive consideration for employment without regard to race, sex, color, religion, sexual orientation, gender identity, national origin, protected veteran status, or disability.