Author_Institution :
Coll. of Commun. Eng., Jilin Univ., Changchun, China
Abstract :
In this paper, a nonlinear hysteresis model for a magnetically controlled shape memory alloy (MSMA) actuator is established using a PID neural network. In the network structure, n denotes the output, m the input, ω¹ij the weight from the ith node of the input layer to the jth node of the hidden layer, and ω²i the weight from the ith node of the hidden layer to the output node. The input, hidden, and output layers contain 2, 3, and 1 nodes, respectively. To reduce the modeling error, a nonlinear function is applied to the two input-layer neurons. Through this nonlinear transformation and the proportional, integral, and differential processing of the input signal, the predicted output is made to approximate the actual actuator output by adjusting the weights. The back-propagation (BP) algorithm is adopted to train the weights of the PID neural network model; it adjusts the weights by gradient descent through a reverse (backward) calculation. With BP-trained weights and the nonlinear function added in the input layer, the model approximates both the major and minor hysteresis loops more closely.
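As a rough illustration of the structure described above (not the authors' implementation), the following Python sketch organizes a 2-3-1 PID neural network and trains it by gradient descent. All names, the choice of tanh as the input-layer nonlinearity, the learning rate, and the simplified back-propagation through the integral and derivative units are assumptions made here for illustration.

import numpy as np

class PIDNeuralNetwork:
    """Minimal sketch of a 2-3-1 PID neural network hysteresis model.

    The three hidden neurons act as proportional (P), integral (I) and
    derivative (D) units; a tanh nonlinearity on the input layer stands in
    for the nonlinear function mentioned in the abstract (assumption).
    """

    def __init__(self, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(scale=0.1, size=(2, 3))  # input -> hidden weights (omega^1)
        self.w2 = rng.normal(scale=0.1, size=(3,))    # hidden -> output weights (omega^2)
        self.lr = lr
        self.reset_state()

    def reset_state(self):
        self.i_state = np.zeros(3)   # accumulated net inputs (used by the I unit)
        self.prev_net = np.zeros(3)  # previous net inputs (used by the D unit)

    def forward(self, x):
        x_nl = np.tanh(np.asarray(x, dtype=float))  # nonlinear input transformation
        net = x_nl @ self.w1                        # net inputs of the three hidden nodes
        self.i_state = self.i_state + net
        d = net - self.prev_net
        self.prev_net = net
        # node 0: proportional, node 1: integral, node 2: derivative
        h = np.array([net[0], self.i_state[1], d[2]])
        y = h @ self.w2                             # single output node
        return y, x_nl, h

    def train_step(self, x, target):
        """One gradient-descent (BP) update of both weight layers."""
        y, x_nl, h = self.forward(x)
        err = y - target
        # d(err^2 / 2)/dw2 = err * h; for w1 the I/D dynamics are approximated
        # by treating d(hidden)/d(net) ~= 1, a common PIDNN simplification.
        grad_w2 = err * h
        grad_w1 = err * np.outer(x_nl, self.w2)
        self.w2 -= self.lr * grad_w2
        self.w1 -= self.lr * grad_w1
        return err

In a hypothetical use, x could carry the actuator drive signal and its increment while target is the measured displacement; repeatedly calling train_step over recorded hysteresis data (and calling reset_state at the start of each loop traversal) would adjust ω¹ and ω² so that the predicted output tracks the measured loops.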
Keywords :
approximation theory; backpropagation; gradient methods; magnetic actuators; magnetic hysteresis; neural nets; nonlinear functions; physics computing; shape memory effects; PID neural network structure; actuator output value; approximation; back-propagation training algorithms; gradient descent algorithm; hidden layer node; input layer neurons; input layer node; input signal differential; input signal integral; input signal proportion; magnetically controlled shape memory alloy actuator; minor hysteresis loops; modeling error; nonlinear function; nonlinear hysteresis model; nonlinear transformation; output layer node; reverse calculation; weight training method; Actuators; Data models; Magnetic hysteresis; Neural networks; Prediction algorithms; Shape memory alloys; Training;