Many bird and bat fatalities have been reported in the vicinity of wind farms. A monitoring system based on acoustics, an infrared camera, and marine radar was developed to monitor the nocturnal migration of birds and bats. The system was deployed and tested in an area of potential wind farm development that also serves as a stopover for migrating birds and bats.
Multi-sensor data fusion was developed based on acoustics, an infrared (IR) camera, and radar. The diversity of the sensor technologies complicated development: different signal processing techniques had to be developed for the various types of data. Data from the three dissimilar sensors were then fused to make inferences about the targets. This approach reduces uncertainty and provides a desired level of confidence and detailed information about migration patterns. The work is a unique multifidelity, multidisciplinary approach drawing on pattern recognition, machine learning, signal processing, bio-inspired computing, probabilistic methods, and fuzzy reasoning. The sensors were located in the western basin of Lake Erie in Ohio and were used to collect data over the 2011 and 2012 migration periods.
Acoustic data were collected using acoustic detectors (SM2 and SM2BAT). The data were preprocessed to convert the recorded files to standard WAV format. Acoustic processing was performed in two steps: feature extraction and classification. Acoustic features of bat echolocation calls were extracted using three different techniques: the Short-Time Fourier Transform (STFT), Mel Frequency Cepstral Coefficients (MFCC), and the Discrete Wavelet Transform (DWT). These features were fed into an Evolutionary Neural Network (ENN) for classification at the species level, and the feature extraction techniques were compared on classification accuracy. The technique can identify bats and will contribute to mitigation procedures for reducing bat fatalities.
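The STFT feature path can be illustrated with a minimal sketch. This is not the dissertation's implementation: the frame length, hop size, and the two per-frame features (peak frequency and spectral centroid) are illustrative choices, and the input is a synthetic downward chirp standing in for an echolocation call.

```python
import numpy as np

def stft_features(signal, fs, frame_len=256, hop=128):
    """Per-frame spectral features (peak frequency, spectral centroid)
    from a short-time Fourier transform. A simplified stand-in for the
    STFT feature extraction step; frame/hop sizes are illustrative."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    feats = []
    for i in range(n_frames):
        frame = signal[i * hop:i * hop + frame_len] * window
        mag = np.abs(np.fft.rfft(frame))
        freqs = np.fft.rfftfreq(frame_len, d=1.0 / fs)
        peak = freqs[np.argmax(mag)]                         # dominant frequency
        centroid = np.sum(freqs * mag) / (np.sum(mag) + 1e-12)
        feats.append((peak, centroid))
    return np.array(feats)

# Synthetic "echolocation-like" downward sweep, 60 kHz -> 20 kHz over 5 ms,
# sampled at 192 kHz (a typical full-spectrum detector rate)
fs = 192_000
t = np.linspace(0, 0.005, int(fs * 0.005), endpoint=False)
chirp = np.sin(2 * np.pi * (60_000 - 4e6 * t) * t)
f = stft_features(chirp, fs)   # peak frequency falls frame by frame
```

A feature matrix like `f` (one row per frame) is the kind of input a classifier such as an ENN would consume; in practice the MFCC and DWT paths would produce analogous fixed-length vectors.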
Infrared video was collected using a thermal IR camera (FLIR SR-19). Preprocessing converted the videos to individual frames. Three different background subtraction techniques were applied to detect moving objects in the IR data; thresholding with the extended Otsu method was performed for image binarization, and morphological operations were applied for noise suppression and filtering. The three techniques were compared, and the selected one (running average), followed by thresholding and filtering, was used for tracking and information extraction. An Ant-based Clustering Algorithm (ACA) following Lumer and Faieta, in three variations (standard ACA, different-speed ACA, and short-memory ACA), was applied to the extracted features, and the variations were compared in terms of the groups created for the detected avian data. Fuzzy C-Means (FCM) was also implemented to group the targets.
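The detection front end (running-average background model, then Otsu binarization) can be sketched as follows. This is a simplified illustration, not the system's code: the learning rate `alpha`, frame size, and synthetic "warm blob" are assumptions, and the classic Otsu criterion is shown rather than the extended variant used in the work.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Running-average background model: bg <- (1 - alpha) * bg + alpha * frame."""
    return (1 - alpha) * bg + alpha * frame

def otsu_threshold(img):
    """Classic Otsu's method on an 8-bit image: choose the threshold
    that maximizes between-class variance of the gray-level histogram."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    total = img.size
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0
    for t in range(256):
        w0 += hist[t]                 # pixels at or below t (class 0)
        if w0 == 0:
            continue
        w1 = total - w0               # pixels above t (class 1)
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic IR frame: dim noisy background plus one bright 6x6 "target"
rng = np.random.default_rng(0)
bg = rng.normal(30, 3, (64, 64))
frame = bg.copy()
frame[20:26, 40:46] += 120                        # warm moving object
diff8 = np.clip(np.abs(frame - update_background(bg, frame)), 0, 255).astype(np.uint8)
mask = diff8 > otsu_threshold(diff8)              # binary detection mask
```

In the full pipeline, morphological opening/closing would then clean `mask` before blob features are extracted for clustering.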
Radar data were collected using a Furuno marine radar (XANK250) with a T-bar antenna and a parabolic dish. Target detection was performed using radR, an open-source platform for recording and processing radar data. The platform was used to remove clutter and noise, detect potential targets as blips, and save the blip information. A tracking algorithm, independent of radR, was developed based on estimation and data association: estimation is performed with a Sequential Importance Sampling Particle Filter (SIS-PF) and data association with the Nearest Neighbor (NN) method.
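The data association step can be illustrated with a minimal greedy nearest-neighbor sketch. This is an assumption-laden simplification: the gate radius and coordinates are made up, the particle-filter estimation step is omitted (the `predictions` array stands in for its predicted track positions), and a production tracker would handle track initiation and deletion.

```python
import numpy as np

def nearest_neighbor_associate(predictions, blips, gate=50.0):
    """Greedy nearest-neighbor association: pair each predicted track
    position with the closest unassigned blip within a gate radius.
    Returns {track_index: blip_index}; ungated tracks get no entry."""
    pairs, used = {}, set()
    for i, p in enumerate(predictions):
        dists = np.linalg.norm(blips - p, axis=1)
        for j in np.argsort(dists):                 # try closest blip first
            if j not in used and dists[j] <= gate:
                pairs[i] = int(j)
                used.add(j)
                break
    return pairs

# Two predicted track positions (e.g., from the particle filter's mean state)
preds = np.array([[100.0, 200.0], [400.0, 50.0]])
# Three blips from one radar scan; the third is clutter far from any track
blips = np.array([[405.0, 48.0], [102.0, 198.0], [900.0, 900.0]])
pairs = nearest_neighbor_associate(preds, blips)
```

The associated blip for each track would then be passed to the SIS-PF as the measurement in its weight-update step.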
The data fusion was performed in a heterogeneous, dissimilar sensory environment. This is a challenging setting that demands substantial effort in both experimental setup and algorithmic development. The experimental setup included procuring the required equipment, installing the systems, and configuring and setting control parameters. The algorithmic development included designing algorithms and selecting the best available technique for this specific application, weighing trade-offs among time, accuracy, and cost.
Data fusion of the acoustic/IR/radar streams follows a two-level hierarchical model. Level 1 is homogeneous dissimilar fusion at the feature level: it is applied to the IR and radar data and combines the features of detected/tracked targets into a composite feature vector, formed by concatenating the individual sensors' feature vectors end to end. Level 2 is heterogeneous fusion at the decision level: it takes the Level 1 feature vector and fuses it with the acoustic data. The fusion was built from a number of fusion functions, and data alignment (temporal and spatial) and target association were implemented. A fuzzy Bayesian technique was developed for the decision-level fusion: the fuzzy inference system provides the prior probability, and Bayesian inference provides the posterior probability of the avian targets.
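The fuzzy-prior-plus-Bayesian-update idea can be sketched for a two-class (bat vs. bird) decision. Everything concrete here is assumed for illustration: the triangular membership function, the mapping from acoustic call rate to prior, and the likelihood values standing in for the IR/radar feature-level evidence; the actual system's fuzzy rules and class set are not reproduced.

```python
import numpy as np

def tri_membership(x, a, b, c):
    """Triangular fuzzy membership peaking at b with support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuse(call_rate, likelihood_bat, likelihood_bird):
    """Fuzzy inference supplies the prior P(bat) from the acoustic call
    rate (calls/min, hypothetical scale); Bayes' rule then combines it
    with per-class likelihoods derived from the IR/radar feature vector."""
    # Fuzzy stage: moderate call rates give strong "bat" membership
    p_bat = 0.1 + 0.8 * tri_membership(call_rate, 0.0, 20.0, 40.0)
    prior = np.array([p_bat, 1.0 - p_bat])
    lik = np.array([likelihood_bat, likelihood_bird])
    post = prior * lik                 # unnormalized posterior
    return post / post.sum()           # normalize over the two classes

post = fuse(call_rate=20.0, likelihood_bat=0.7, likelihood_bird=0.3)
```

The posterior over target classes is the final fused output on which activity and behavior inferences are based.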
The data fusion results were used to characterize the spring and fall 2011 migration periods in the western basin of Lake Erie in Ohio. This landscape lies in the prevailing wind and is a putative site for wind turbine construction; it is also a stopover for migrant birds and bats, whose habitats and lives the presence of wind turbines may threaten. The aim of this project is to provide an understanding of the activity and behavior of the biological targets by combining three different sensors to produce detailed and reliable information. The work can be extended to other applications in the military, industry, medicine, traffic control, etc.
Keywords: Acoustics, Evolutionary Neural Network, Infrared Camera, Radar, Data Fusion, Clustering, Classification, Supervised Learning, Unsupervised Learning, Feature Extraction, Bat Echolocation Call, Wind Turbine, Bird Mortality, Fuzzy, Bayesian, Detection, Identification