Production Data Operations and Automation Engineer

Information Technology — San Jose, California; San Francisco, California; Washington, D.C.; Austin, Texas; Atlanta, Georgia; Seattle, Washington; Phoenix, Arizona; Houston, Texas; Dallas, Texas; Portland, Oregon; Sacramento, California; Detroit, Michigan; New York, New York; Boston, Massachusetts; Los Angeles, California; Philadelphia, Pennsylvania; Raleigh, North Carolina; San Diego, California; Vienna, Virginia; Irvine, California; Baltimore, Maryland

As a Production Operations Engineer, you will work closely with a team of Data, Business Intelligence, and Dev/Prod Ops engineers focused on supporting the automation of data pipelines, platform infrastructure availability, and code-based data products and platform services. Your expertise with automation tools and technologies such as Ansible, Docker, and Kubernetes; code repositories such as Bitbucket or GitLab; and programming languages and data technologies such as Python, ETL/ELT, Snowflake, Tableau, and Alteryx will be key to delivering the required automation and support. Familiarity with ServiceNow and Jira integration is a plus. You will support deployment, data flow architecture, and automation to deliver the highly available, performant, and secure environments of Splunk's Enterprise Data Warehouse.



Responsibilities:

  • Work creatively and analytically in a production operations environment, demonstrating teamwork, innovation, and excellence.
  • Troubleshoot code defects, data errors, and configuration problems to determine the root cause of production errors.
  • Build self-service capabilities for the most common infrastructure and application management tasks.
  • Apply strong experience with AWS cloud infrastructure, database languages (SQL), and scripting languages, primarily Python (others are a big plus), to automation work.
  • Automate the deployment of part or all of a CI/CD pipeline in a data warehouse environment, drawing on prior experience doing so.

Requirements:

  • 5+ years of relevant work experience in data warehouse and data lake operations and production support.
  • Experience writing and troubleshooting ETL jobs (Python, Airflow), Snowflake SQL scripts, and Business Intelligence (Tableau) reports.
  • Extensive knowledge of production deployment, CI/CD orchestration, and automation (Ansible, Kubernetes, Jenkins).
  • Experience with data technology server administration, SLA/incident management, RBAC automation and controls, and user communication and support.
  • Strong ability to use independent judgment to make sound, justifiable decisions and to take action to solve problems.
  • Strong problem-solving skills and the ability to work independently or in team settings on complex production issues. Willingness to participate in on-call support as needed.

Thank you for your interest in Splunk!