Autonomy Engineer - Perception
Location: Kitchener, Ontario
Experience: 1-5 Years Relevant Work Experience
Education: Graduate Degree in Related Field
Division: OTTO Motors
Clearpath provides self-driving vehicle technology and services to over 500 of the world’s most innovative brands. Proprietary hardware, software, and services are delivered through the company’s research and industrial divisions: Clearpath Robotics and OTTO Motors.
We employ a diverse and highly talented team who live and breathe robotics. We believe that work must have a high “cool” factor and every day should bring new knowledge. We need more passionate people on our team who are willing and able to push the boundaries of robotics into focused and practical applications.
Clearpath is automating the world and we need your help. Got what it takes?
About the Job
Our industrial solutions team is in high demand finding new ways to automate away our clients' problems, and we have more point clouds to process, networks to learn, and states to estimate than ever. We need more people on our team to bring the state of the art to practical applications. Clearpath, our partner companies, and our clients are making tremendous advances in robotics, and we want you to be a part of it!
You will stay on top of recent developments in SLAM, computer vision, perception algorithms, and sensing technology. You should be familiar with common open-source middleware and libraries such as ROS, OpenCV, and Gazebo.
Your primary responsibilities will be:
- Creating robust, innovative solutions to the problems of robotic perception
- Developing custom SLAM, computer vision, machine learning, target tracking, and perception algorithms
- Evaluating new algorithms and sensing technology
- Assisting with concept development for new products and projects
- Designing and architecting new perception systems and algorithms
Additional tasks may include:
- Collaborating with other teams, including controls and navigation planning
- Triaging and supporting live industrial systems in the field
- Mentoring and assisting with supervision of interns
- Contributing open-source code to the ROS/PCL/Gazebo communities
- Attending tradeshows and conferences
You want to work for a small company that thinks big and dreams huge. You are driven, view work as more than just a job, and are never satisfied with a project left half-done. You want to be surrounded by people like you: creative, fun-loving, and passionate about their work. You are motivated by making an impact on your workplace, and you thrive on challenging and rewarding problems. Oh, and you have some form of higher education with the common sense to back it up.
- Graduate degree in engineering/CS or a related field, with applicable background
- Practical and theoretical knowledge of state estimation, computer vision, perception, and SLAM techniques
- Strong software development skills (C++ and Python)
- Proficiency with Linux
- Excellent teamwork/communication skills
- Ability to independently develop software development plans, including timelines and test procedures
- Comfortable with abrupt changes to project deadlines, job responsibilities and the local gravity field
Bonus points for:
- ROS, TensorFlow, Gazebo, and PCL experience
- Familiarity with graph optimization techniques and libraries
- Experience writing efficient, high-performance code
- Hands-on experience with autonomous systems
- Ability to diagnose broken robots by their sounds and smells
- Experience with Git or other distributed version control systems, and with software development processes
- Understanding of sensor error modeling, particularly laser rangefinders and vision systems
Clearpath is committed to supporting a culture of diversity and accessibility across the organization. We hire the best talent regardless of race, color, creed, national origin, ancestry, disability, marital status, age, sex, veteran status or sexual orientation. If you require special accommodation to complete any portion of the application or interview process, please contact 1-800-301-3863.