ForgeScan: A Framework for Iterative Voxel Reconstructions and Next-Best View Selection

Schellenberg, Andrew

Abstract Details

2023, Master of Science, Ohio State University, Mechanical Engineering.
This thesis investigates methods for autonomously reconstructing digital models from measurements of a physical environment. For embodied autonomous systems, understanding the current state of the surrounding space is critical for many higher-level decisions. This work is motivated by manufacturing systems performing iterative deformation processes on workpieces, but perception-informed decisions occur in all kinds of systems: an aerial drone mapping a warehouse, a delivery robot driving around an obstacle, or a bin-picking robot selecting the correct object. Despite active research on methods to best integrate measurements into voxelized reconstructions during mapping or scanning tasks, there is no common set of tools for developing and comparing approaches.

In response to this need, this work introduces ForgeScan, an open-source library that unifies voxel grid representations, their update methods, and next-best view selection algorithms with simulated or real depth sensors. Rather than forcing each project to rework the same development path, ForgeScan is designed to be minimal and adaptable. New voxel update rules are easily implemented as subclasses of an abstract voxel grid base class, and the use of C++17 variants provides datatype flexibility. Beyond a versatile voxel grid data structure, ForgeScan provides an abstract policy class for view selection algorithms and a lightweight depth camera simulator for generating synthetic data. User-defined policies can suggest camera poses, generate depth images of a mesh, and then add these measurements to one or more voxel grid implementations. ForgeScan is also designed to be flexible at runtime: while collecting data, users may interactively add new voxel grids or change which policy is running, and at any time the state of each grid's reconstruction may be saved and inspected with VTK.
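The extension pattern described above — an abstract voxel grid base class whose subclasses define the update rule, with a variant element type for datatype flexibility — can be sketched as follows. This is an illustrative sketch only, not ForgeScan's actual API: the names `VoxelGrid`, `VoxelData`, `update`, `TsdfGrid`, and `CarvingGrid` are hypothetical.

```cpp
#include <cassert>
#include <cstddef>
#include <variant>
#include <vector>

// Hypothetical per-voxel datatype: a grid may store floats (e.g. signed
// distances), doubles, or integer labels, decided at runtime via std::variant.
using VoxelData = std::variant<float, double, int>;

// Hypothetical abstract base class; concrete grids override how one
// measured distance updates one voxel.
class VoxelGrid {
public:
    explicit VoxelGrid(std::size_t n) : voxels_(n, VoxelData{0.0f}) {}
    virtual ~VoxelGrid() = default;

    // Integrate one measured distance-to-surface into voxel i.
    virtual void update(std::size_t i, float measured_distance) = 0;

    const VoxelData& at(std::size_t i) const { return voxels_.at(i); }

protected:
    std::vector<VoxelData> voxels_;
};

// One concrete update rule: a TSDF-like grid that clamps distances
// to a truncation band and stores them as floats.
class TsdfGrid : public VoxelGrid {
public:
    TsdfGrid(std::size_t n, float truncation)
        : VoxelGrid(n), trunc_(truncation) {}

    void update(std::size_t i, float d) override {
        float clamped = d > trunc_ ? trunc_ : (d < -trunc_ ? -trunc_ : d);
        voxels_[i] = clamped;  // variant now holds a float
    }

private:
    float trunc_;
};

// Another rule: simple space carving, storing integer occupancy labels.
class CarvingGrid : public VoxelGrid {
public:
    explicit CarvingGrid(std::size_t n) : VoxelGrid(n) {}

    void update(std::size_t i, float d) override {
        // A positive distance means the ray passed through this voxel:
        // carve it (0); otherwise mark it occupied (1).
        voxels_[i] = d > 0.0f ? 0 : 1;  // variant now holds an int
    }
};
```

Under this pattern, a new update rule is one subclass overriding `update`, and grids holding different element types still share one interface and one container type.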
Common voxel methodologies (space carving, truncated signed distance fields, and occupancy probability) are implemented to demonstrate the library and explore the fundamental limitations of voxels. The precision and miss rate of these methods are compared under increasing sensor uncertainty for a variety of view selection algorithms. Additionally, simple open-loop view selection policies are compared with novel algorithms. One of these new policies searches for voxels on the boundary between seen and unknown space and seeks to view as many of these voxels as possible. Another takes an assumed model of the scene and selects a set of views in which the camera's optical axis is perpendicular to as many faces as possible while remaining distant from the other views in the set.
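A greedy version of the second policy described above could be sketched as below. This is a hypothetical illustration, not the thesis's implementation: the function `selectViews` and its parameters are invented here. It scores each candidate viewing direction by how many faces it sees head-on (optical axis anti-parallel to the face normal) and rejects candidates angularly too close to views already chosen.

```cpp
#include <array>
#include <cassert>
#include <cstddef>
#include <vector>

using Vec3 = std::array<float, 3>;  // unit direction vector

float dot(const Vec3& a, const Vec3& b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Hypothetical greedy selection of k view directions. A face is counted
// for a candidate when the optical axis is nearly anti-parallel to the
// face normal (-dot > align_thresh, e.g. cos of a 15 degree tolerance).
// A candidate is inadmissible when its dot with any chosen view exceeds
// min_separation, keeping the selected set spread out.
std::vector<std::size_t> selectViews(const std::vector<Vec3>& candidates,
                                     const std::vector<Vec3>& face_normals,
                                     std::size_t k,
                                     float align_thresh,
                                     float min_separation) {
    std::vector<std::size_t> chosen;
    std::vector<bool> used(candidates.size(), false);
    while (chosen.size() < k) {
        int best = -1;
        int best_score = -1;
        for (std::size_t i = 0; i < candidates.size(); ++i) {
            if (used[i]) continue;
            // Reject candidates too close to already-chosen views.
            bool too_close = false;
            for (std::size_t j : chosen) {
                if (dot(candidates[i], candidates[j]) > min_separation) {
                    too_close = true;
                    break;
                }
            }
            if (too_close) continue;
            // Score: number of faces this candidate views nearly head-on.
            int score = 0;
            for (const Vec3& n : face_normals) {
                if (-dot(candidates[i], n) > align_thresh) ++score;
            }
            if (score > best_score) {
                best_score = score;
                best = static_cast<int>(i);
            }
        }
        if (best < 0) break;  // no admissible candidate remains
        used[static_cast<std::size_t>(best)] = true;
        chosen.push_back(static_cast<std::size_t>(best));
    }
    return chosen;
}
```

The greedy loop trades optimality for simplicity; each iteration picks the best remaining admissible view, so coverage and separation are balanced by the two thresholds rather than by a joint optimization.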
Michael Groeber (Advisor)
Andrew Gillman (Committee Member)
Haijun Su (Committee Member)
Ayonga Hereid (Committee Member)
105 p.

Recommended Citations

    APA Style (7th edition)

  • Schellenberg, A. (2023). ForgeScan: A Framework for Iterative Voxel Reconstructions and Next-Best View Selection [Master's thesis, Ohio State University]. OhioLINK Electronic Theses and Dissertations Center. http://rave.ohiolink.edu/etdc/view?acc_num=osu170138075081872

    MLA Style (8th edition)

  • Schellenberg, Andrew. ForgeScan: A Framework for Iterative Voxel Reconstructions and Next-Best View Selection. 2023. Ohio State University, Master's thesis. OhioLINK Electronic Theses and Dissertations Center, http://rave.ohiolink.edu/etdc/view?acc_num=osu170138075081872.

    Chicago Manual of Style (17th edition)

  • Schellenberg, Andrew. "ForgeScan: A Framework for Iterative Voxel Reconstructions and Next-Best View Selection." Master's thesis, Ohio State University, 2023. http://rave.ohiolink.edu/etdc/view?acc_num=osu170138075081872