Artificial Neural Networks (ANNs) are being extensively researched for their wide range of applications. Among the most important of these is the ability of one class of ANNs, recurrent attractor networks, to function as associative memories. The most common ANN used for associative memory is the Hopfield network, a fully connected network with symmetric connections. There have been numerous attempts to improve the capacity and recall quality of recurrent networks, focused primarily on the stability of the stored attractors and on the networks' convergence properties. However, the ability of a recurrent attractor network to switch between attractors is also an interesting property if it can be harnessed for use. Such switching can serve as a model of the transitions between context-dependent functional networks thought to underlie cognitive processing.
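To make the associative-memory idea concrete, the following is a minimal sketch of a standard Hopfield network: bipolar patterns are stored with a Hebbian rule (yielding the symmetric weights mentioned above), and recall proceeds by iterating the update rule from a corrupted cue until it settles into an attractor. The network size, number of patterns, and noise level are illustrative choices, not values from the thesis.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian weight matrix for bipolar (+1/-1) patterns: symmetric, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_steps=50):
    """Synchronous updates until the state stops changing (a fixed-point attractor)."""
    s = state.copy()
    for _ in range(max_steps):
        new = np.sign(W @ s)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, s):
            break
        s = new
    return s

rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 64))  # three random bipolar patterns
W = train_hopfield(patterns)

noisy = patterns[0].copy()
flip = rng.choice(64, size=6, replace=False)  # corrupt roughly 10% of the bits
noisy[flip] *= -1
recovered = recall(W, noisy)
overlap = (recovered * patterns[0]).mean()
print(overlap)  # overlap of 1.0 means perfect recall of the stored pattern
```

Because only three patterns are stored in a 64-unit network (well below the classical capacity of about 0.138 per unit), the corrupted cue is pulled back into the stored attractor with high probability.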
In this thesis, we design and develop a stable-yet-switchable (SyS) network model that combines stability with switchability: the network is stable under random perturbations but highly sensitive to specific targeted perturbations that cause it to switch attractors. Such functionality has previously been reported in networks with scale-free (SF) connectivity. We introduce networks with two regions: a densely connected core and a larger, sparsely connected periphery. We show that these core-periphery (CP) networks provide a better combination of stability and targeted switching than scale-free networks. We develop and validate a specific approach to switching between attractors in a targeted way, and compare the CP and SF models with each other and with randomly connected homogeneous networks.
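As an illustration of the core-periphery connectivity described above, the sketch below builds an undirected adjacency matrix with a small dense core, a larger sparse periphery, and sparse core-periphery links, in the style of a stochastic block model. The connection probabilities and region sizes are hypothetical placeholders, not the parameters used in the thesis.

```python
import numpy as np

def core_periphery_adjacency(n_core, n_periph, p_cc=0.8, p_cp=0.1, p_pp=0.02, seed=0):
    """Illustrative core-periphery graph: blockwise connection probabilities
    (p_cc within the core, p_cp between regions, p_pp within the periphery)."""
    n = n_core + n_periph
    rng = np.random.default_rng(seed)
    P = np.full((n, n), p_pp)          # periphery-periphery default
    P[:n_core, :n_core] = p_cc         # dense core block
    P[:n_core, n_core:] = p_cp         # core-periphery links
    P[n_core:, :n_core] = p_cp
    A = (rng.random((n, n)) < P).astype(int)
    A = np.triu(A, 1)                  # keep the upper triangle,
    A = A + A.T                        # then symmetrize: undirected, no self-loops
    return A

A = core_periphery_adjacency(10, 90)
core_density = A[:10, :10].sum() / (10 * 9)
periph_density = A[10:, 10:].sum() / (90 * 89)
print(core_density, periph_density)
```

With these placeholder probabilities the core block comes out roughly forty times denser than the periphery, which is the structural contrast the CP model exploits: dense recurrent connectivity stabilizes attractors in the core, while the sparse periphery leaves room for targeted perturbations.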