Consider the uniaxial stress-strain curve of a hyperelastic material: a classical feedforward neural network can easily be set up to describe this behavior. During training, the network finds a fitting function whose quality depends mainly on the number of weights and biases and on the amount of training data. These parameters are not physically motivated; they merely map strain values to stress values via weighted sums and sigmoid transfer functions within the range of the training set. This is why classical neural networks extrapolate very poorly.
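A minimal sketch of this situation, assuming an illustrative hyperelastic-like stress-strain law (not taken from the source): a small feedforward network with one sigmoid hidden layer is fitted to strains in [0, 1] by plain gradient descent, and its prediction is then compared inside and outside the training range.

```python
import numpy as np

# Illustrative ground-truth law (an assumption for this sketch).
def stress(strain):
    return 2.0 * strain + 0.5 * strain**3

rng = np.random.default_rng(0)
eps_train = np.linspace(0.0, 1.0, 50).reshape(-1, 1)  # training strains
sig_train = stress(eps_train)

# One hidden layer with sigmoid activations.
n_hidden = 16
W1 = rng.normal(0, 1, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1, (n_hidden, 1)); b2 = np.zeros(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.1
for _ in range(20000):
    h = sigmoid(eps_train @ W1 + b1)      # hidden activations
    pred = h @ W2 + b2                    # predicted stress
    err = pred - sig_train
    # Backpropagation of the mean-squared-error loss
    gW2 = h.T @ err / len(err); gb2 = err.mean(0)
    dh = err @ W2.T * h * (1 - h)
    gW1 = eps_train.T @ dh / len(err); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

def predict(eps):
    eps = np.atleast_2d(np.asarray(eps, dtype=float)).reshape(-1, 1)
    return (sigmoid(eps @ W1 + b1) @ W2 + b2).ravel()

# Inside the training range the fit is close ...
print(predict(0.5)[0], stress(0.5))
# ... but at strain 2.0 the saturated sigmoids plateau, so the
# prediction falls far short of the true stress of 12.0.
print(predict(2.0)[0], stress(2.0))
```

The extrapolation failure is structural: once all hidden sigmoids saturate, the output is constant in the strain, regardless of how well the network interpolates.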
In contrast, an autoregressive neural network can be trained on a time series, such as a stress curve recorded at a constant strain rate, using previous stress values to compute the next one. Instead of learning a stress-strain function, such a network attempts to find a recursive relation between successive stress values. Through external inputs, further variables can enter this recursion, such as the strain rate; if the training data contain different strain rates, the network can take them into account. Other variables, for example different temperatures, are possible as well.
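To make the recursive formulation concrete, here is a sketch of the kind of relation sigma[n+1] = f(sigma[n], strain rate) that such a network would have to learn, using a discrete Maxwell viscoelastic element as an assumed reference model (the parameters E, eta, dt are illustrative, not from the source):

```python
import numpy as np

E, eta, dt = 1.0, 2.0, 0.01  # stiffness, viscosity, time step (assumed)

def next_stress(sigma, strain_rate):
    # Backward-Euler step of  dsigma/dt = E*deps/dt - (E/eta)*sigma,
    # i.e. the next stress as a function of the current stress and
    # the strain rate acting as an external input.
    return (sigma + E * strain_rate * dt) / (1.0 + E / eta * dt)

def rollout(strain_rate, n_steps):
    # Recursively apply the update, exactly as an autoregressive
    # network does when it feeds its own output back in.
    sigma, out = 0.0, []
    for _ in range(n_steps):
        sigma = next_stress(sigma, strain_rate)
        out.append(sigma)
    return np.array(out)

# Both rollouts end at the same total strain of 1.0, but faster
# loading yields a higher stress: the rate dependence the external
# input allows the network to capture.
slow = rollout(strain_rate=0.1, n_steps=1000)
fast = rollout(strain_rate=1.0, n_steps=100)
print(slow[-1], fast[-1])
```

Because the recursion takes only the current stress and the strain rate as arguments, it applies step by step to any loading history, which is what later enables predictions beyond the training range.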
Owing to this recursive formulation, the network can compute stress-strain curves even beyond the range of the training data. With a sufficiently large training data set, it thus becomes possible to describe complex material behavior better than with classical material models.
In this project, the properties of viscoelastic materials are estimated with an autoregressive neural network. Computing a stress-strain curve at several strain rates and training the network takes only a few minutes. Prediction for strain rates and stress values outside the range of the training data works very well, with only a small error and far less computation time. Besides optimizing the network architecture, further external inputs such as temperature, as well as training on a real measurement data set, will also be investigated.