Abstract

Mobile robots now deliver vast amounts of sensor data from large unstructured environments. Processing and interpreting this data poses many unique challenges in bridging the gap between prerecorded data sets and the field. This talk will present recent work applying deep learning techniques to marine robotic perception, focusing on solutions to several pervasive problems that arise when deploying such techniques on fielded robotic systems. The themes of the talk revolve around alternatives to gathering and curating data sets for training: are there ways of avoiding the labor-intensive human labeling required for supervised learning? These questions give rise to several lines of research based on self-supervision, simulation, and field data processing. We will show how these approaches, applied to depth estimation, object classification, underwater manipulation, and domain transfer problems, have great potential to change the way we train, test, and validate machine learning-based systems. Real examples from underwater vehicle deployments will be discussed.

Presenter Bio

Matthew Johnson-Roberson is an Associate Professor of Engineering in the Department of Naval Architecture & Marine Engineering and the Department of Electrical Engineering and Computer Science at the University of Michigan. He co-directs the UM Ford Center for Autonomous Vehicles (FCAV) and founded and leads the DROP (Deep Robot Optical Perception) Lab, which researches 3D reconstruction, segmentation, data mining, and visualization.

He received a PhD from the University of Sydney and held prior postdoctoral appointments with the Centre for Autonomous Systems (CAS) at the KTH Royal Institute of Technology in Stockholm and the Australian Centre for Field Robotics at the University of Sydney. He is a recipient of the NSF CAREER award (2015).

Publication Date

9-24-2021

Document Type

Presentation
