
Neuron-Adaptive Higher Order Neural Network Models for Automated Financial Data Modeling

Neuron-Adaptive Higher Order Neural Network Models for Automated Financial Data Modeling. Dr. Ming Zhang, Associate Professor Department of Physics, Computer Science & Engineering Christopher Newport University 1 University Place, Newport News, VA 23606, USA. Published.



Presentation Transcript


  1. Neuron-Adaptive Higher Order Neural Network Models for Automated Financial Data Modeling Dr. Ming Zhang, Associate Professor Department of Physics, Computer Science & Engineering Christopher Newport University 1 University Place, Newport News, VA 23606, USA

  2. Published IEEE Transactions on Neural Networks Vol. 13 No. 1 January 2002

  3. Problems • Real-world financial data is often non-linear, contains high-frequency and multi-polynomial components, and is discontinuous (piecewise continuous). • Classical neural network models are unable to automatically determine the optimum model and appropriate order for financial data approximation.

  4. • PHONN Simulator (1994–1996): Polynomial Higher Order Neural Network financial data simulator. A$105,000, supported by Fujitsu, Japan.
  • THONN Simulator (1996–1998): Trigonometric polynomial Higher Order Neural Network financial data simulator. A$10,000, supported by the Australian Research Council.
  • PT-HONN Simulator (1999–2000): Polynomial and Trigonometric polynomial Higher Order Neural Network financial data simulator. US$46,000, supported by the USA National Research Council.

  5. PT-HONN Data Simulator

  6. Simulating by PT-HONN Simulator

  7. Structure of PT-HONN

  8. PT-HONN MODEL • The network architecture of PT-HONN combines the characteristics of both PHONN and THONN. • It is a multi-layer network consisting of an input layer with input units, an output layer with output units, and two hidden layers of intermediate processing units.

  9. Definition of PT-HONN

  10. NAHONN • The network architecture of NAHONN is a multi-layer feed-forward network that consists of an input layer with input units, an output layer with output units, and one hidden layer of intermediate processing units. • There is no activation function in the input layer, and the output neurons are summing units (linear activation). • The activation function for the hidden-layer processing units is a Neuron-Adaptive Activation Function (NAAF).
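The architecture on this slide can be sketched as a minimal forward pass. Everything here is illustrative: the layer sizes, weights, and the stand-in `tanh` activation are assumptions, not the paper's actual NAAF or trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes for illustration only.
n_in, n_hidden, n_out = 2, 8, 1
W1 = rng.normal(size=(n_hidden, n_in))   # input -> hidden weights
b1 = np.zeros(n_hidden)                  # hidden-layer thresholds
W2 = rng.normal(size=(n_out, n_hidden))  # hidden -> output weights

def hidden_activation(z):
    # Placeholder for the NAAF; a fixed tanh stands in here.
    return np.tanh(z)

def forward(x):
    # No activation on the input layer; output neurons are
    # summing units (linear activation), as the slide states.
    h = hidden_activation(W1 @ x + b1)
    return W2 @ h

y = forward(np.array([0.5, -1.0]))
print(y.shape)
```

The only structural points the sketch commits to are the ones the slide states: one hidden layer, no input-layer activation, and linear summing output units.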

  11. NAAF The activation function for the hidden-layer processing units is a Neuron-Adaptive Activation Function (NAAF), defined by an equation (not reproduced in this transcript) in which a1, b1, a2, b2, a3, and b3 are real-valued parameters that are adjusted, along with the weights, during training.
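The NAAF equation itself is missing from the transcript, so the sketch below uses a hypothetical composite of three parameterised terms (sigmoid, sine, Gaussian). The form is an assumption; the point it illustrates is the slide's actual claim, that a1, b1, a2, b2, a3, b3 are free parameters tuned during training rather than fixed constants.

```python
import numpy as np

def naaf(x, a1, b1, a2, b2, a3, b3):
    """Hypothetical neuron-adaptive activation (form assumed, not the paper's)."""
    return (a1 / (1.0 + np.exp(-b1 * x))   # sigmoid-like term
            + a2 * np.sin(b2 * x)          # oscillatory term for high-frequency data
            + a3 * np.exp(-b3 * x**2))     # localized Gaussian term

# These parameters would be updated by gradient descent during training,
# in the same loop that updates the connection weights.
params = dict(a1=1.0, b1=1.0, a2=0.5, b2=3.0, a3=0.2, b3=1.0)
print(naaf(0.0, **params))  # 0.5 + 0.0 + 0.2 = 0.7
```

Because each hidden neuron carries its own (a, b) parameters, different neurons can specialise in smooth, oscillatory, or localised behaviour, which is what lets the model adapt its shape to the data.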

  12. Structure of NAHONN

  13. NAHONN Group • A Neuron-Adaptive Feed-forward Higher Order Neural Network Group (NAFNG) is a neural network group in which each element is a neuron-adaptive feed-forward higher order neural network (Fi). We have: NAFNG = {F1, F2, F3, …, Fi, …, Fn}
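The group idea can be sketched as a set of member networks, each responsible for one region of a piecewise-continuous function. The member "networks" below are stand-in callables and the dispatch rule is an assumption for illustration, not the paper's construction.

```python
# Each member Fi of the group approximates the target on one region;
# the group output dispatches on which region contains the input.

def f1(x):
    # Member covering x < 0 (stand-in for a trained NAHONN).
    return -x

def f2(x):
    # Member covering x >= 0; a jump at x = 0 is allowed
    # because different members handle the two sides.
    return x + 1.0

nafng = [f1, f2]  # NAFNG = {F1, F2, ..., Fn}, here with n = 2

def group_output(x):
    # Hypothetical region-based dispatch.
    return nafng[0](x) if x < 0 else nafng[1](x)

print(group_output(-2.0), group_output(2.0))
```

This is why a group can approximate piecewise-continuous targets even though each individual member is a continuous function: the discontinuities fall on the boundaries between members.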

  14. Hornik, K. (1991) “Whenever the activation function is continuous, bounded and non-constant, then for an arbitrary compact subset X ⊆ Rn, standard multi-layer feed-forward networks can approximate any continuous function on X arbitrarily well with respect to uniform distance, provided that sufficiently many hidden units are available”

  15. Leshno, M. (1993) “A standard multi-layer feed-forward network with a locally bounded activation function can approximate any continuous function to any degree of accuracy if and only if the network’s activation function is not a polynomial”

  16. Zhang, Ming (1995) “Consider a neural network piecewise function group, in which each member is a standard multi-layer feed-forward neural network, and which has a locally bounded, piecewise continuous (rather than polynomial) activation function and threshold. Each such group can approximate any kind of piecewise continuous function, and to any degree of accuracy”

  17. Feature of NAHONN • A neuron-adaptive feed-forward neural network group with adaptive neurons can approximate any kind of piecewise continuous function.

  18. Conclusion • We proved that a single NAHONN can approximate any piecewise continuous function to any desired accuracy. • The experimental results show that NAHONN models can handle high-frequency data, model multi-polynomial data, simulate discontinuous data, and are capable of approximating any kind of piecewise continuous function to any degree of accuracy.
