Abstract
High-resolution underwater imagery provides a detailed view of coral reefs and facilitates insight into important ecological metrics concerning their health. In recent years, anthropogenic stressors, including those related to climate change, have altered the community composition of coral reef habitats around the world. Currently, the most common method of quantifying the composition of these communities is through benthic quadrat surveys and image analysis. This requires manual annotation of images, a time-consuming task that does not scale well for large studies. Patch-based image classification using Convolutional Neural Networks (CNNs) can automate this task and provide sparse labels, but it remains computationally inefficient. This work extended the idea of automatic image annotation by using Fully Convolutional Networks (FCNs) to provide dense labels through semantic segmentation. Presented here is an improved version of the Multilevel Superpixel Segmentation (MSS) algorithm, which automatically repurposes existing sparse image labels by converting them into the dense labels necessary for training an FCN. This improved implementation, Fast-MSS, is demonstrated to perform considerably faster than the original without sacrificing accuracy. To showcase its applicability to benthic ecologists, the algorithm was independently validated by converting the sparse labels provided with the Moorea Labeled Coral (MLC) dataset into dense labels using Fast-MSS. FCNs were then trained and evaluated by comparing their predictions on the test images with the corresponding ground-truth sparse labels, setting baseline scores for the task of semantic segmentation. Lastly, this study outlined a workflow that combines the methods described above with Structure-from-Motion (SfM) photogrammetry to classify the individual elements of a 3-D reconstructed model into their respective semantic groups.
The contributions of this thesis help move the field of benthic ecology towards more efficient monitoring of coral reefs through entirely automated processes by making it easier to compute the changes in community composition using 2-D benthic habitat images and 3-D models.
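The core idea behind the sparse-to-dense conversion in MSS can be sketched in a few lines: segment the image into superpixels, then spread each annotated point's label across the superpixel that contains it, with a majority vote when several points land in the same region. The sketch below is a minimal illustration only, not the Fast-MSS implementation; the `sparse_to_dense` helper is hypothetical, and a regular grid of blocks stands in for real superpixels (which, in MSS, come from a segmentation algorithm run at multiple scales).

```python
import numpy as np

def sparse_to_dense(shape, points, labels, segments):
    """Propagate sparse point labels into a dense label map.

    segments: integer array of `shape`, assigning each pixel to a
    superpixel. Pixels in superpixels containing no annotated point
    are left as -1 (unlabeled).
    """
    dense = np.full(shape, -1, dtype=int)
    for seg_id in np.unique(segments):
        mask = segments == seg_id
        # collect labels of annotation points that fall inside this superpixel
        hits = [lab for (r, c), lab in zip(points, labels) if mask[r, c]]
        if hits:
            dense[mask] = np.bincount(hits).argmax()  # majority vote
    return dense

# Toy "superpixels": a 2x2 grid of blocks over a 4x4 image.
segments = np.repeat(np.repeat(np.arange(4).reshape(2, 2), 2, axis=0), 2, axis=1)
points = np.array([[0, 0], [0, 3], [3, 0]])  # sparse annotation points (row, col)
labels = np.array([1, 2, 1])                 # their class labels
dense = sparse_to_dense((4, 4), points, labels, segments)
```

Here the top-left and bottom-left blocks inherit label 1, the top-right block label 2, and the never-annotated bottom-right block stays -1, mirroring how an FCN training set can be densified from point annotations.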
Presenter Bio
Jordan graduated from Texas A&M University with a Bachelor of Science in Geography/GIS and a minor in Oceanography. Originally starting his schooling in the Computer Science department, he wanted to apply his technical background to the geosciences and eventually found an interest in the niche fields of hydrography and ocean mapping.
After graduating, Jordan spent some time abroad in Asia, where he traveled, taught, and worked as a research assistant. Before joining CCOM, he held an appointment in a coral reef ecology lab at the University of Hong Kong, where he studied the effect of the structural complexity of coral reefs on ecosystem functions and applications of machine learning for reconstructing paleoenvironments.
Jordan's current research interests include computer programming, reconstructing 3-D models of marine environments, and topics in machine learning.
Publication Date
10-29-2020
Document Type
Presentation
Recommended Citation
Pierce, Jordan, "Automating the Boring Stuff: A Deep Learning and Computer Vision Workflow for Coral Reef Habitat Mapping" (2020). Seminars. 318.
https://scholars.unh.edu/ccom_seminars/318