Search Results (1 - 3 of 3 Results)

Howard, Shaun Michael. Deep Learning for Sensor Fusion
Master of Sciences (Engineering), Case Western Reserve University, 2017, EECS - Computer and Information Sciences
The use of multiple sensors in modern-day vehicular applications is necessary to provide a complete picture of the surroundings for advanced driver assistance systems (ADAS) and automated driving. Fusing these sensors provides increased certainty in the recognition, localization, and prediction of surroundings. A deep learning-based sensor fusion system is proposed to fuse two independent, multi-modal sensor sources. This system is shown to successfully learn the complex capabilities of an existing state-of-the-art sensor fusion system and to generalize well to new sensor fusion datasets. It achieves high precision and recall with minimal confusion after training on several million examples of labeled multi-modal sensor data. It is robust, has a sustainable training time, and responds in real time on a deep learning PC with a single NVIDIA GeForce GTX 980 Ti graphics processing unit (GPU).
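For orientation, a minimal sketch of a two-stream fusion network of the kind the keywords suggest (one recurrent stream per sensor, fused by dense layers) appears below. The input shapes, layer sizes, and the binary match/no-match output are illustrative assumptions, not the thesis's actual configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Hypothetical input shapes: 20 time steps of per-track features per sensor.
camera_in = layers.Input(shape=(20, 64), name="camera_features")
radar_in = layers.Input(shape=(20, 32), name="radar_features")

# One recurrent stream per modality (the keywords mention GRU and LSTM variants).
cam_code = layers.GRU(128)(camera_in)
rad_code = layers.GRU(128)(radar_in)

# Fuse the two streams and predict whether the tracks describe the same object.
fused = layers.concatenate([cam_code, rad_code])
hidden = layers.Dense(64, activation="relu")(fused)
match = layers.Dense(1, activation="sigmoid", name="match")(hidden)

model = Model(inputs=[camera_in, radar_in], outputs=match)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.Precision(), tf.keras.metrics.Recall()])
```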

Committee:

Wyatt Newman, Dr. (Committee Chair); M. Cenk Cavusoglu, Dr. (Committee Member); Michael Lewicki, Dr. (Committee Member)

Subjects:

Artificial Intelligence; Computer Science

Keywords:

deep learning; sensor fusion; deep neural networks; advanced driver assistance systems; automated driving; multi-stream neural networks; feedforward; multilayer perceptron; recurrent; gated recurrent unit; long short-term memory; camera; radar

Mealey, Thomas C. Binary Recurrent Unit: Using FPGA Hardware to Accelerate Inference in Long Short-Term Memory Neural Networks
Master of Science (M.S.), University of Dayton, 2018, Electrical Engineering
Long Short-Term Memory (LSTM) is a powerful neural network algorithm that has been shown to provide state-of-the-art performance on various sequence learning tasks, including natural language processing, video classification, and speech recognition. Once an LSTM model has been trained on a dataset, its utility comes from its ability to infer information from completely new data. Due to the high complexity of LSTM models, this so-called inference stage can require significant computing power and memory to keep up with a real-time workload. Many approaches have been taken to accelerate inference, from offloading computations to a GPU or other specialized hardware, to compressing model parameters to reduce the number of computations and the memory footprint required. This work takes a two-pronged approach to accelerating LSTM inference. First, a model compression scheme called binarization is identified that both reduces the storage size of model parameters and simplifies computations. This technique is applied to training LSTM models for two separate sequence learning tasks, and it is shown to provide prediction performance comparable to that of the uncompressed models. Then, a digital processor architecture, called the Binary Recurrent Unit (BRU), is proposed to accelerate inference for binarized LSTM models. Specifically targeted at FPGA implementation, this accelerator takes advantage of binary model weights and on-chip memory resources to parallelize LSTM inference computations. The BRU architecture is implemented and tested on a Xilinx Z7020 device clocked at 200 MHz. BRU inference computation time is evaluated against CPU and GPU inference implementations; BRU is shown to outperform the CPU by as much as 39X and the GPU by as much as 3.8X.
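As background on the compression scheme named here, the sketch below shows a generic BinaryConnect-style binarization of LSTM weight matrices in NumPy: weights are stored as signs plus one scaling factor per matrix, so each multiply reduces to an add or subtract in hardware. The function names and the per-matrix scaling are illustrative assumptions, not necessarily the exact scheme used in the thesis.

```python
import numpy as np

def binarize(W):
    """Binarize a weight matrix to {-1, +1} plus one scaling factor alpha."""
    alpha = np.mean(np.abs(W))        # scalar scale preserves average magnitude
    Wb = np.where(W >= 0, 1.0, -1.0)  # 1-bit weights: only signs are stored
    return Wb, alpha

def lstm_step(x, h, c, params):
    """One LSTM step using binarized input and recurrent weight matrices.

    params: four tuples (Wb, aW, Ub, aU, b), one per gate (i, f, o, g).
    """
    preacts = []
    for (Wb, aW, Ub, aU, b) in params:
        # With +/-1 weights, each matrix-vector product needs no multipliers.
        preacts.append(aW * (Wb @ x) + aU * (Ub @ h) + b)
    i, f, o = (1.0 / (1.0 + np.exp(-z)) for z in preacts[:3])  # sigmoid gates
    g = np.tanh(preacts[3])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c
```

At inference time only the sign matrices (one bit per weight) and the scalars need to be stored, which is the storage reduction the abstract describes.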

Committee:

Tarek Taha, PhD (Advisor); Eric Balster, PhD (Committee Member); Vijayan Asari, PhD (Committee Member)

Subjects:

Computer Engineering; Computer Science; Electrical Engineering; Engineering

Keywords:

FPGA; inference; hardware acceleration; digital hardware; neural networks; LSTM; long short-term memory; binarization; model compression; binary weights

Sasse, Jonathan Patrick. Distinguishing Behavior from Highly Variable Neural Recordings Using Machine Learning
Master of Sciences, Case Western Reserve University, 2018, Biology
Flexible and robust neural pathways are ubiquitous in animals. Previous work has demonstrated that variability in the feeding behavior of the marine mollusk Aplysia californica can be useful to the animal: in general, motor components relevant to feeding show higher variability within animals, even as they vary less across different animals (Cullins et al., Current Biology, 2015). This variability, though, makes interpreting neural recordings challenging, especially in an automated context. In this research, we explore the ability of a combination of artificial neural network architectures (Long Short-Term Memory [LSTM] and dense fully connected) not only to classify behaviors but to distinguish them before any observable cue. The four-channel recordings examined came from the key protractor muscle (I2) and three motor nerves that control the feeding apparatus of Aplysia californica during feeding behaviors. Each channel of the recordings had a dedicated LSTM that learned to discern bites, swallows, and white noise. The outputs of these four LSTMs were then passed to a dense, fully connected layer for a final classification using context from all channels. Surprisingly, the overall architecture appears able to discriminate bites from swallows (at an accuracy between 97% and 99%) at least half a second before the classic marker (I2 firing frequency exceeding 10 Hz) occurs. These results suggest that previously disregarded sub-threshold activity may contain high (or at least sufficient) levels of contextual information for behavioral classification, which raises exciting questions about possible implications for closed-circuit controllers and medical technology. The networks were implemented in TensorFlow through its Python interface.
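Since the abstract names TensorFlow with a Python interface, a minimal tf.keras sketch of the described topology follows: one LSTM per recording channel, merged by a dense, fully connected layer into a three-class output. The window length, unit counts, and channel names are assumptions for illustration; only the four-channel, three-class structure comes from the abstract.

```python
from tensorflow.keras import layers, Model

WINDOW = 500  # assumed number of samples per classification window

channel_inputs, channel_codes = [], []
for name in ["I2", "nerve_1", "nerve_2", "nerve_3"]:  # assumed channel names
    inp = layers.Input(shape=(WINDOW, 1), name=name)
    # One dedicated LSTM per channel, as described in the abstract.
    channel_codes.append(layers.LSTM(32, name=f"lstm_{name}")(inp))
    channel_inputs.append(inp)

# Dense layer combines context across all four channels for the final call.
merged = layers.concatenate(channel_codes)
hidden = layers.Dense(64, activation="relu")(merged)
output = layers.Dense(3, activation="softmax", name="bite_swallow_noise")(hidden)

model = Model(inputs=channel_inputs, outputs=output)
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
```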

Committee:

Hillel Chiel, PhD (Advisor); Sarah Diamond, PhD (Committee Chair); Karen Abbott, PhD (Committee Member); Peter Thomas, PhD (Committee Member)

Subjects:

Animal Sciences; Applied Mathematics; Artificial Intelligence; Biology; Computer Science; Neurobiology; Neurosciences

Keywords:

Aplysia; Machine Learning; Behaviorally relevant; Classification; LSTM; Long Short-Term Memory; Feed-forward neural network; artificial neural network; decision boundary; recurrent neural network; TensorFlow