Dynamics of Heterogeneous Hopfield Neural Network with Adaptive Activation Function Based on Memristor

This study investigates the impact of nonlinear factors on the dynamic behavior of neural networks. Specifically, activation functions and memristors are commonly used as nonlinear factors to construct chaotic systems and simulate synaptic behaviors. The Hopfield Neural Network (HNN) has garnered extensive attention due to its unique network structure and its ability to generate complex brain-like dynamics. However, current research tends to focus on neurons with fixed activation functions, whereas relatively little work has examined combinations of heterogeneous activation functions.

This paper, written by Chunhua Wang, Junhui Liang, and Quanli Deng from the College of Computer and Electronic Engineering at Hunan University and the Guangdong-Hong Kong-Macao Greater Bay Area Research Institute, was submitted on January 28, 2024, and accepted by Neural Networks on May 21, 2024.

Research Process

1. Research Model Design

1.1 Memristor Model Design

The memristor is considered the fourth fundamental circuit element and is used here to supply an adaptive parameter for the activation function. In this study, a memristor model whose conductance function is constrained between 0 and 1 was designed, defined as follows:

\[
\begin{aligned}
i &= w(\phi)v = \sin^2(\phi + 1)\,v \\
\dot{\phi} &= -a\phi + bv
\end{aligned}
\]

where w(φ) is the conductance, v, i, and φ represent voltage, current, and the state variable of the memristor, respectively, and a and b are internal parameters of the memristor. Circuit simulations of this memristor model were conducted, and the results verified its compliance with memristor characteristics.
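As a rough illustration of this model, the following Python sketch (an assumption of this summary, not code from the paper) drives the memristor with a sinusoidal voltage and integrates the state equation with explicit Euler:

```python
import numpy as np

def memristor_current(phi, v):
    """i = w(phi) * v with conductance w(phi) = sin^2(phi + 1),
    which is bounded in [0, 1] by construction."""
    return np.sin(phi + 1.0) ** 2 * v

def memristor_step(phi, v, a, b, dt):
    """One explicit-Euler step of the state equation dphi/dt = -a*phi + b*v."""
    return phi + dt * (-a * phi + b * v)

# Drive with a sinusoidal voltage; a, b, amplitude, and frequency are
# illustrative values, not taken from the paper.
a, b, dt = 1.0, 1.0, 1e-3
t = np.arange(0.0, 10.0, dt)
v = np.sin(2 * np.pi * t)
i = np.empty_like(t)
phi = 0.0
for k in range(len(t)):
    i[k] = memristor_current(phi, v[k])
    phi = memristor_step(phi, v[k], a, b, dt)
```

Plotting i against v yields the pinched hysteresis loop characteristic of memristors; since w(φ) lies in [0, 1], the current passes through zero whenever the voltage does.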

1.2 Adaptive Activation Function Model

A memristor-based adaptive Parametric Rectified Linear Unit (PReLU) activation function was designed:

\[
\text{mPReLU}(w(\phi), x) =
\begin{cases}
x, & x > 0 \\
\sin^2(\phi + 1)\,x, & x \le 0
\end{cases}
\]

The parameter value is determined by the memristor, which automatically adjusts the activation function parameters through current changes, allowing neurons to exhibit more complex nonlinear characteristics.
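A minimal Python sketch of this activation function (written for this summary, not taken from the paper) makes the memristor-controlled negative slope explicit:

```python
import numpy as np

def w_phi(phi):
    """Memristor conductance w(phi) = sin^2(phi + 1), bounded in [0, 1]."""
    return np.sin(phi + 1.0) ** 2

def mprelu(phi, x):
    """Memristor-based adaptive PReLU: the identity for x > 0, and a
    memristor-controlled slope w(phi) for x <= 0."""
    return np.where(x > 0, x, w_phi(phi) * x)
```

Because w(φ) is confined to [0, 1], the negative-slope branch interpolates between a ReLU (slope 0) and the identity (slope 1) as the memristor state evolves.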

1.3 Heterogeneous Hopfield Neural Network Model

Based on the classical Hopfield neural network model, a heterogeneous model was constructed using three different activation functions, including tanh, sigmoid, and the memristor-based adaptive PReLU. The mathematical expression of the neural network is defined as follows:

\[
\dot{x}_i = -\frac{x_i}{R_i} + \sum_{j=1}^{n} w_{ij} F_j(x_j) + I_i
\]

where

\[
F_{j1}(x_j) = \tanh(x_j), \quad
F_{j2}(x_j) = \frac{1}{1 + e^{-x_j}}, \quad
F_{j3}(x_j) = \text{mPReLU}(w(\phi), x_j)
\]

Experiments were conducted using the corresponding neuron topology structures by combining the three activation functions.
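One possible three-neuron instance of this model can be sketched in Python as follows; the weight matrix W, bias vector I, the choice R_i = 1, and the assumption that the memristor is driven by the third neuron's state are illustrative, not values from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mprelu(phi, x):
    """Memristor-based adaptive PReLU for a scalar input."""
    return x if x > 0 else np.sin(phi + 1.0) ** 2 * x

def hnn_rhs(state, W, I, a, b):
    """Right-hand side of a 3-neuron heterogeneous HNN with augmented
    state [x1, x2, x3, phi]; each neuron uses a different activation."""
    x1, x2, x3, phi = state
    F = np.array([np.tanh(x1), sigmoid(x2), mprelu(phi, x3)])
    dx = -np.array([x1, x2, x3]) + W @ F + I
    dphi = -a * phi + b * x3  # memristor feed chosen for illustration
    return np.append(dx, dphi)
```

The memristor state φ enters the system as a fourth state variable, which is what allows the third neuron's effective activation to adapt during the dynamics.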

2. Dynamic Analysis

2.1 Equilibrium Points and Stability Analysis

Numerical calculations were used to determine the system's equilibrium points and their stability. Phase diagrams and Lyapunov exponent spectra were then used to analyze the impact of the memristor's internal parameters (a, b) and the synaptic weight w12 on the system's dynamic behavior, revealing that the system exhibits multiple stable states and complex multi-scroll attractors under different parameter settings.
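The stability test used in such analyses can be sketched generically: an equilibrium x* of ẋ = f(x) is locally asymptotically stable when every eigenvalue of the Jacobian of f at x* has a negative real part. A small Python helper (a generic sketch, not the paper's code):

```python
import numpy as np

def jacobian(f, x, eps=1e-6):
    """Central-difference Jacobian of f at x."""
    n = len(x)
    J = np.empty((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = eps
        J[:, j] = (f(x + e) - f(x - e)) / (2 * eps)
    return J

def is_stable(f, x_eq):
    """True when all Jacobian eigenvalues at x_eq have negative real parts."""
    return bool(np.all(np.linalg.eigvals(jacobian(f, x_eq)).real < 0))
```

Applied at each numerically located equilibrium, this classifies it as stable or unstable, which is how regions of multistability are mapped out.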

2.2 Multi-scroll Chaotic Attractors

Numerical simulations using MATLAB's ode45 solver analyzed the multi-scroll attractors of the system. The study examined how changes in parameters a, b, and w12 affected the number and position of attractors, producing phase diagrams of multi-scroll attractors under different parameters.
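As a stand-in for MATLAB's ode45 (an adaptive Runge-Kutta 4(5) solver), a fixed-step classical fourth-order Runge-Kutta integrator in Python can reproduce trajectories for such phase diagrams; this is a simplified sketch, not the paper's code:

```python
import numpy as np

def rk4(f, y0, t0, t1, dt):
    """Fixed-step classical RK4 integration of dy/dt = f(t, y).
    Returns time points and the trajectory (one row per time point)."""
    n = int(round((t1 - t0) / dt))
    ts = t0 + dt * np.arange(n + 1)
    ys = np.empty((n + 1, len(y0)))
    ys[0] = np.asarray(y0, dtype=float)
    for k in range(n):
        t, y = ts[k], ys[k]
        k1 = f(t, y)
        k2 = f(t + dt / 2, y + dt / 2 * k1)
        k3 = f(t + dt / 2, y + dt / 2 * k2)
        k4 = f(t + dt, y + dt * k3)
        ys[k + 1] = y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return ts, ys
```

Plotting one state component of the network against another over a long trajectory then yields the phase portraits used to count and locate attractor scrolls.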

2.3 Transient Chaos and State Hopping

It was observed that the system exhibits transient chaos for certain parameters: chaotic behavior persists for a finite time before the trajectory settles into a periodic orbit or a different chaotic attractor. By setting different parameters (a, b) and initial states, time-domain waveforms and attractor phase diagrams were plotted, revealing the system's complex transient dynamics.

2.4 Coexisting Attractors

By setting different initial values, the study found that the system exhibited phenomena of coexisting attractors. Numerical simulations demonstrated various fixed-point attractors, periodic attractors, and chaotic attractors at different locations under different initial conditions.
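The mechanism behind coexistence can be illustrated with a much simpler bistable toy system, dx/dt = x − x³ (chosen for this summary, not the paper's network), whose two fixed-point attractors at x = ±1 are selected purely by the initial condition:

```python
def settle(x0, steps=20000, dt=1e-3):
    """Explicit-Euler integration of the bistable toy system
    dx/dt = x - x**3; returns the final state."""
    x = x0
    for _ in range(steps):
        x += dt * (x - x ** 3)
    return x
```

Starting from x0 = 0.5 the trajectory settles at +1, while x0 = −0.5 settles at −1. In the heterogeneous HNN the same principle applies, except that the coexisting states can be periodic or chaotic attractors rather than fixed points.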

3. Circuit Design and Experiment

3.1 Circuit Design

To verify the numerical simulation results, a heterogeneous Hopfield neural network circuit system was constructed. Discrete electronic components were used to implement the circuit system for different activation functions, and proportional compression transformation was carried out to ensure operation within the dynamic range of operational amplifiers.

3.2 Experimental Verification

Oscilloscope waveforms under different parameter conditions were experimentally measured and compared with the numerical simulation results. The experimental waveforms matched the simulated ones, confirming the validity of the model.

Conclusion

This paper proposes a new heterogeneous memristor-based Hopfield neural network model and studies its complex dynamic behavior by implementing a memristor-based adaptive activation function. Numerical simulations and circuit experiments both verified the significant enhancement in nonlinear characteristics and dynamic behavior of neural networks provided by the adaptive activation function. Additionally, the paper explored multi-scroll chaotic attractors, transient chaos, and coexisting attractor phenomena, offering new ideas and experimental basis for neural network simulations and practical applications.

This research not only expands the application scope of brain-inspired neural networks but also paves the way for constructing larger-scale, more biologically realistic neural network models. Future research will consider introducing more types of activation functions and a greater number of neurons to build more complex and biologically realistic neural network systems.