Junhua Geng
Accounting School, Henan Institute of Economics and Trade, Zhengzhou, Henan 450046, China

Abstract:

The back-propagation (BP) learning technique, which makes local-optimization-based iterative updates to the weights and bias terms of artificial neural networks (ANNs), is the most commonly used approach for training ANN-based models. Two of the most significant bottlenecks of BP-based neural predictive models in the financial market are their slow convergence and high sensitivity to the initial model parameters, especially in applications where models must be trained on the fly within a short period of time. Using artificial intelligence and deep learning methods, the authors aim to provide the market with a more accurate stock volatility model. To this end, this study proposes hybrid models that combine Transformer and Multi-Transformer layers with other techniques, such as GARCH-based algorithms or LSTM units.
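
As a minimal sketch of the kind of hybrid architecture described above (assuming a Keras implementation; the window length, layer sizes, and synthetic data are illustrative assumptions, not the paper's exact configuration), a window of past returns can be fed through a Transformer encoder branch and an LSTM branch whose outputs are merged to regress next-period volatility:

```python
# Hedged sketch: hybrid Transformer + LSTM volatility regressor.
# All names and hyperparameters (SEQ_LEN, D_MODEL, etc.) are assumptions
# for illustration, not values taken from the paper.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

SEQ_LEN, N_FEATURES = 30, 1          # 30 past daily returns, one feature (assumed)
D_MODEL = 32                         # internal model dimension (assumed)

inputs = layers.Input(shape=(SEQ_LEN, N_FEATURES))

# Transformer branch: project to model dimension, then one encoder block
# (multi-head self-attention + feed-forward sublayer with residual connections).
h = layers.Dense(D_MODEL)(inputs)
attn = layers.MultiHeadAttention(num_heads=4, key_dim=16)(h, h)
h = layers.LayerNormalization()(layers.Add()([h, attn]))
ff = layers.Dense(64, activation="relu")(h)
ff = layers.Dense(D_MODEL)(ff)
h = layers.LayerNormalization()(layers.Add()([h, ff]))
transformer_branch = layers.GlobalAveragePooling1D()(h)

# LSTM branch capturing sequential dependence over the same window.
lstm_branch = layers.LSTM(32)(inputs)

# Merge both branches and regress next-period volatility;
# softplus keeps the predicted volatility non-negative.
z = layers.Concatenate()([transformer_branch, lstm_branch])
z = layers.Dense(16, activation="relu")(z)
outputs = layers.Dense(1, activation="softplus")(z)

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")

# Toy usage with synthetic returns and realized-volatility targets
# (purely illustrative; no real market data is used here).
X = np.random.randn(256, SEQ_LEN, N_FEATURES).astype("float32")
y_true = np.abs(np.random.randn(256, 1)).astype("float32")
model.fit(X, y_true, epochs=2, batch_size=32, verbose=0)
```

In the setting the abstract describes, the Transformer branch could plausibly be stacked into Multi-Transformer layers or combined with GARCH-derived inputs; this sketch only illustrates the general idea of merging attention-based and recurrent components for volatility prediction.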