DataOps Engineer
Description
Summary
Data has never been more valuable, and never more vulnerable. As cybercriminals become more sophisticated and regulations stricter, organizations struggle to answer one key question: “Is my data safe?”
At Varonis, we see the world of cybersecurity differently. Instead of chasing threats, we believe the most practical approach is protecting data from the inside out. We’ve built the industry’s first fully autonomous Data Security Platform to help our customers dramatically reduce risk with minimal human effort.
At Varonis, we move fast. We’re an ultra-collaborative company with brilliant people who care deeply about the details. Together, we’re solving interesting and complex puzzles to keep the world’s data safe.
We work in a flexible, hybrid model, so you can choose the home-office balance that works best for you.
About the Role:
We are seeking a DataOps Engineer with data engineering experience to help design, build, and operate reliable, scalable data systems. This role combines hands-on data pipeline development with operational excellence, focusing on automation, reliability, observability, and efficient data delivery.
You will work closely with data engineers, analytics engineers, platform teams, and stakeholders to ensure data is accurate, available, performant, and production-ready.
Responsibilities
- Design, develop, and maintain batch and streaming data pipelines from multiple source systems
- Implement ETL/ELT processes to ingest, transform, and model data for analytics and downstream consumers
- Build and optimize data models, tables, and views in cloud data warehouses or lakehouses
- Enforce data quality, validation, and schema management across pipelines
- Optimize pipeline performance, scalability, and cost efficiency
- Collaborate with analytics and data science teams to support reporting, dashboards, and ML workloads
- Apply DataOps best practices including CI/CD for data pipelines, automated testing, and version control
- Monitor pipeline health, data freshness, and SLAs using observability and alerting tools
- Automate operational tasks such as deployments, backfills, schema evolution, and rollbacks
- Manage and improve production reliability of data systems, including on-call support and incident response
- Implement and maintain infrastructure and orchestration for data workflows
- Improve transparency and trust in data through metadata, lineage, and documentation
- Partner with platform teams on infrastructure as code, security, and access management
Requirements
- 3+ years of experience in Data Engineering, DataOps, or a closely related role.
- Strong hands-on experience building and maintaining production data pipelines.
- Proficiency with Python.
- Experience with cloud data platforms (Azure, AWS, or GCP).
- Familiarity with workflow orchestration tools (e.g., Airflow, Databricks Workflows, Prefect, Dagster).
- Experience with CI/CD pipelines (GitHub Actions, Azure DevOps, GitLab CI, etc.).
- Solid understanding of data reliability, monitoring, logging, and alerting.
Advantages
- Experience with Databricks, Spark, Snowflake, BigQuery, or similar platforms.
- Knowledge of streaming technologies (Kafka, Event Hubs, Spark Streaming, Flink).
- Familiarity with data quality and observability tools (e.g., Great Expectations, Monte Carlo, OpenMetadata).
- Experience with Infrastructure as Code (Terraform, ARM, CloudFormation).
- Understanding of data governance, security, and compliance practices.
We invite you to check out our Instagram Page to gain further insight into the Varonis culture!
@VaronisLife
Varonis is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, national origin, disability, veteran status, or any other legally protected characteristic.
#LI-Hybrid