STOCK PRICE PREDICTION: KOHONEN VERSUS BACKPROPAGATION


Alexey Zorin
Technical University of Riga, Decision Support Systems Group
1 Kalku Street, Riga LV-1658, LATVIA, phone: 371-7089530, e-mail: alex@ru.lv

Abstract. This paper describes the application of two different neural network types to stock price prediction. The prediction is carried out by Kohonen self-organizing maps and by the error backpropagation algorithm. Both experimental networks deal with price change intervals, in contrast to precise value prediction. The results and their comparative analysis are presented in this paper, as well as a short discussion of both neural network architectures and learning algorithms.

1 Introduction

Financial predictions must deal with precise data, therefore in most cases stock price prediction is carried out by forecasting future points of the time series. This is one of the most popular approaches for many problem domains. However, precise knowledge of the exact future value is not always necessary (forecasting cannot avoid errors, and it is impossible to predict a future value exactly). Many trading strategies can easily be based on price change intervals, which can be treated as classes for a neural network. In this case we face a classification problem that can be solved by means of either Kohonen self-organizing maps or backpropagation networks. The number of classes depends on our needs: if we want to know price changes more precisely, we should use more intervals or classes.

Neural networks are very sophisticated modelling techniques, capable of modelling extremely complex functions. In particular, neural networks are non-linear. For many years linear modelling has been the commonly used technique in most modelling domains, since linear models had well-known optimisation strategies. Where the linear approximation was not valid (which was frequently the case), the models suffered accordingly. Neural networks also keep the curse of dimensionality in check, which helps to model non-linear functions with large numbers of variables.

The successful applications of neural networks [1], [4], [5] can also be attributed to their ease of use. Neural networks learn by example. The neural network user gathers representative data and then invokes training algorithms to automatically learn the structure of the data. Although the user does need some knowledge of how to select and prepare data, how to select an appropriate neural network, and how to interpret the results, the level of user knowledge needed to successfully apply neural networks is much lower than would be the case using (for example) more traditional statistical methods.

The paper is organized as follows. Section 2 provides a discussion of the Kohonen and backpropagation network architectures. Section 3 covers the experimental set-up, which includes the data set description and pre-processing, as well as the network architecture design for this particular problem. Section 4 shows the prediction results of both networks and compares the results of applying a trading strategy to each methodology. The paper concludes with comments on possible future work in the area and some conclusions.

2 Theoretical basics of neural networks

A neural network is a set of interconnected simple processing elements, or neurons. Neural networks are potentially useful for studying the complex relationships between the inputs and outputs of a system. There are two neural network models investigated in this research: backpropagation networks and Kohonen self-organizing maps.
There are three major steps in neural-network-based forecasting [3]: pre-processing, architecture and post-processing. In pre-processing, information that could be used as the inputs and outputs of the neural networks is collected. These data are first normalized or scaled in order to reduce fluctuations and noise (however, this does not guarantee good results). In the architecture step, a variety of neural network models that could capture the relationships between the input and output data are built. Different models and configurations, using different training, validation and forecasting data sets, are used for the experiments. The best model is then selected for use in forecasting. Finally, in post-processing, different trading strategies are applied to the forecasting results to show the capability of the neural network prediction.
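As a minimal illustration of the pre-processing step, the following Python sketch min-max scales a daily price series into the [0, 1] range before it is fed to a network; the function name and the choice of min-max scaling are assumptions made for the example, since the paper only states that the data are normalized or scaled.

import numpy as np

def minmax_scale(prices):
    # Scale a 1-D price series into [0, 1] (assumed normalization scheme).
    prices = np.asarray(prices, dtype=float)
    lo, hi = prices.min(), prices.max()
    return (prices - lo) / (hi - lo)

# Example: a short run of daily closing prices.
closes = [0.68, 0.69, 0.67, 0.70, 0.66]
print(minmax_scale(closes))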

2.1 Backpropagation networks

A multilayer feedforward network with an appropriate pattern of weights can be used to model some mapping between sets of input and output variables. Figure 1a shows an example of a feedforward network architecture, with three output units and one hidden layer, which can be trained using backpropagation. The shaded nodes in Figure 1a are processing units. The arrows connecting input and hidden units, and connecting hidden units and output units, represent weights.

Figure 1. The architecture of two types of networks: a) backpropagation network (input, hidden and output layers), b) Kohonen self-organizing map (input and Kohonen layers).

The backpropagation learning algorithm [2], [6], [8] is formulated as a search in the space of the pattern of weights, W, in order to find an optimal configuration, W*, which minimizes an error or cost function, E(W). The pattern of weights then determines how the network will respond to any arbitrary input. The error or cost function is defined by (1):

E(W) = \frac{1}{2} \sum_{i} \sum_{p} \left( t_{ip} - o_{ip} \right)^{2}    (1)

This function compares an output value o_{ip} to a desired value t_{ip} over the set of p training vectors and i output units. The gradient descent method is used to search for the minimum of this error function through iterative updates:

W(k+1) = W(k) - \eta \nabla E    (2)

where \eta is the learning rate and \nabla E is an estimate of the gradient of E with respect to W. The algorithm is recursive and consists of two phases: forward propagation and backward propagation. In the first phase, the input set of values is presented and propagated forward through the network to compute the output value for each unit. In the second phase, the total squared error calculated in the first phase is propagated from the output units back to the input units. During this process, the error signal is calculated recursively for each unit in the network, and weight adjustments are determined at each level. These two phases are executed in each iteration of the backpropagation algorithm until the error function converges.
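The following Python sketch illustrates equations (1) and (2) on a small single-hidden-layer network with sigmoid units; the layer sizes, activation function, learning rate and random toy data are assumptions made for the example and are not the paper's experimental setup.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy dimensions for illustration (the paper's best network is 20-10-7).
n_in, n_hid, n_out, n_patterns = 20, 10, 7, 50
X = rng.normal(size=(n_patterns, n_in))                 # input patterns
T = np.eye(n_out)[rng.integers(0, n_out, n_patterns)]   # one-hot targets t_ip

W1 = rng.normal(scale=0.1, size=(n_in, n_hid))    # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(n_hid, n_out))   # hidden -> output weights
eta = 0.1                                         # learning rate (assumed)

for epoch in range(200):
    # Forward phase: propagate the inputs to the outputs.
    H = sigmoid(X @ W1)
    O = sigmoid(H @ W2)

    # Error function (1): E = 1/2 * sum_i sum_p (t_ip - o_ip)^2
    E = 0.5 * np.sum((T - O) ** 2)

    # Backward phase: propagate the error and form the gradient of E.
    delta_out = (O - T) * O * (1 - O)             # error signal at output units
    delta_hid = (delta_out @ W2.T) * H * (1 - H)  # error signal at hidden units
    grad_W2 = H.T @ delta_out
    grad_W1 = X.T @ delta_hid

    # Update rule (2): W(k+1) = W(k) - eta * grad E
    W2 -= eta * grad_W2
    W1 -= eta * grad_W1

print("final sum-of-squares error:", E)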

2.2 Kohonen self-organizing maps

In this section we consider self-organizing networks. The main difference between them and conventional models is that the correct output cannot be defined a priori, and therefore a numerical measure of the magnitude of the mapping error cannot be used [6]. However, the learning process leads to the determination of well-defined network parameters for a given application. The self-organizing networks assume a topological structure among the cluster units. This property is observed in the brain, but is not found in other artificial neural networks [2]. There are m cluster units, arranged in a one- or two-dimensional array; the input signals are n-dimensional.

Figure 1b shows the architecture of a simple self-organizing network, which consists of an input layer and a Kohonen (clustering) layer. The shaded units in Figure 1b are processing units. This simplified network may cluster the data into three classes, but in real problem domains one clustering unit for each class is not enough, therefore we should understand each Kohonen-layer neuron in the figure as a number of units (a cluster of neurons).

When a self-organizing network is used, an input vector is presented at each step. These vectors constitute the environment of the network. Each new input produces an adaptation of the parameters. If such modifications are correctly controlled, the network can build a kind of internal representation of the environment.

Consider the problem of charting an n-dimensional space using a one-dimensional chain of Kohonen units [6]. The units are all arranged in sequence and are numbered from 1 to m (see Figure 2).

Figure 2. A one-dimensional lattice of computing units: units 1, 2, ..., m with weight vectors w_1, w_2, ..., w_m; the neighbourhood of unit 2 with radius 1 is marked.

The n-dimensional weight vectors w_1, w_2, ..., w_m are used for the computation. The objective of the charting process is that each unit learns to specialize on a different region of input space. When an input from such a region is fed into the network, the corresponding unit should compute the maximum excitation. Kohonen's learning algorithm is used to guarantee that this effect is achieved. A Kohonen unit computes the Euclidean distance (the dot product metric can also be used) between an input x and its weight vector w. In the one-dimensional Kohonen network, the neighbourhood of radius 1 of a unit at the k-th position consists of the units at positions k-1 and k+1. Units at both ends of the chain have asymmetrical neighbourhoods. Kohonen learning uses a neighbourhood function whose value \varphi(i, k) represents the strength of the coupling between unit i and unit k during the training process. A complete description of the Kohonen learning algorithm can be found in [2] and [6].

3 Experimental set-up

The data set consists of the Ventspils Nafta stock prices on daily updates for the period from December 1, 2000 till December 28, 2001. The total number of data points is 272 values. The descriptive statistics of the data set are as follows: mean = 0.678; standard deviation = 0.061; maximum = 0.85; minimum = 0.58. Figure 3 shows the graphical representation of the data set.

Figure 3. The data set: Ventspils Nafta stock prices on daily updates (December 1, 2000 to December 28, 2001); vertical axis: stock price, horizontal axis: session date.

Analysing the data set, it has been found that the most frequent daily stock price changes fall in the range from 1 to 4% (without taking into account the sign of the change), therefore we can use 7 classes or intervals: (-∞; -4.5%], (-4.5%; -2.5%], (-2.5%; -0.5%], (-0.5%; 0.5%], (0.5%; 2.5%], (2.5%; 4.5%], (4.5%; +∞). This number of classes is large enough to implement a trading strategy on the predicted results.
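As an illustration of how the seven intervals can serve as class labels, the Python sketch below maps a daily price change (in percent) onto a class index; the function name and the 0-based class numbering are assumptions made for the example.

import numpy as np

# Upper boundaries of the first six intervals, in percent:
# (-inf; -4.5], (-4.5; -2.5], (-2.5; -0.5], (-0.5; 0.5], (0.5; 2.5], (2.5; 4.5], (4.5; +inf)
BOUNDS = [-4.5, -2.5, -0.5, 0.5, 2.5, 4.5]

def change_to_class(pct_change):
    # Map a daily price change in percent to a class index 0..6 (assumed 0-based).
    return int(np.searchsorted(BOUNDS, pct_change, side="left"))

# Example: a +1.2% change falls into the (0.5; 2.5] interval, class index 4.
print(change_to_class(1.2))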

The input layer of the network is largely determined by the application, both for the backpropagation and for the Kohonen network. In this particular case it is possible to use two similar approaches. The first one is the standard windowing method described in [4], while the second is almost identical. The difference is that in the first case we use the set of last closing prices of the stock as the input vector, while in the second one the window is a set of the opening, closing, highest and lowest prices of three or more trading days [8]. We have tried both of these approaches, and the better results were obtained with the first one.

The architecture of the best performing backpropagation network is quite simple (20-10-7): it has 20 input neurons, 10 hidden neurons and 7 output neurons. The number of input units is crucial for the experiments; in this case the better results were obtained with a time window of 4 weeks, or 20 trading days. The number of classes determines the output layer size. An interesting estimate of the hidden layer size was proposed in [7]:

B = \frac{\rho (P + A) C}{A + C + 1}    (3)

where B is the number of hidden layer neurons, P is the total number of input patterns, A is the input layer size, C is the output layer size, and \rho is the percentage of unique patterns that the net will be required to learn (accuracy).

The main purpose of Kohonen networks is to split data into a number of clusters or classes that is defined beforehand. In our case we already have the classes, and each class is assigned to a pre-determined interval. Therefore we cannot use the standard Kohonen learning algorithm without any corrections. The architecture of the Kohonen self-organizing map is shown in Figure 4.

Figure 4. The Kohonen network architecture for price prediction: input units x_1 ... x_n feed seven Kohonen-layer clusters y_1 ... y_7, one for each price percent change interval from (-∞; -4.5%] to (4.5%; +∞).

The number of input units depends on the window size we are using (in this case there are 20 units). The number of clusters in the Kohonen layer is already determined (seven clusters of neurons). Figure 4 shows only a scheme of the network; each unit in the Kohonen layer stands for a cluster of neurons (25 neurons with a rectangular grid topology and initial radius 2). The training process can be considered supervised learning, because each of the Kohonen clusters is trained individually. This is done by freezing all other neuron clusters while the inputs with the desired response identical to the given cluster are processed. We can say that this network is trained in seven stages, according to the number of classes. After the learning procedure is completed, all clusters are unfrozen (all neurons may be active), and the network works in the usual manner.

4 Prediction results

There are many trading strategies which allow interpreting the prediction results in both the backpropagation and the Kohonen network cases. The backpropagation network prediction results can be used, for example, in the simplest strategy: if the predicted price change belongs to an increasing interval, then buy; if the price is predicted to go down, then sell. However, a better strategy is to use a committee of networks with different time horizons, and the strategy may be to buy when the majority of networks classify the input pattern as a rising case; to sell when the networks predict a price decrease; or to do nothing (an untradable day) when the committee has no unambiguous answer.
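A minimal Python sketch of the committee strategy described above: several networks, represented here simply by their predicted class indices, vote, and the day is marked as buy, sell, or untradable; the class-to-direction mapping and the strict-majority threshold are assumptions made for the example.

from collections import Counter

# Class indices 0..6 correspond to the seven intervals of section 3,
# ordered from the most negative (-inf; -4.5%] to the most positive (4.5%; +inf).
def committee_signal(predicted_classes, neutral_class=3):
    # Majority vote over the committee's predicted classes (assumed scheme).
    directions = Counter()
    for c in predicted_classes:
        if c > neutral_class:
            directions["buy"] += 1
        elif c < neutral_class:
            directions["sell"] += 1
        else:
            directions["hold"] += 1
    signal, votes = directions.most_common(1)[0]
    # Require a strict majority; otherwise treat the day as untradable.
    if votes <= len(predicted_classes) / 2 or signal == "hold":
        return "untradable"
    return signal

# Example: three networks with different time horizons predict classes 5, 4 and 3.
print(committee_signal([5, 4, 3]))   # two "buy" votes out of three -> "buy"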

An interpretation of the Kohonen network results may be as follows. If the activation of one Kohonen-layer cluster is noticeably higher than that of all the rest, then we have a clear signal. If two or more neuron clusters are significantly active, then additional interpretation is required. A good approach to analysing multiple signals is to ignore signals when they do not occur in contiguous neurons (clusters). If, for example, clusters 2 and 6 are active, then it is obvious that the network is not able to classify the pattern. A simple strategy would be to buy when neurons in the positive ranges are active, sell when neurons in the negative ranges are active, and stand aside when neutral or conflicting activations occur.

Table 1 shows the backpropagation and the Kohonen network prediction results on the training and test sets.

Table 1. Backpropagation and Kohonen networks' performance on the training and test sets

                                             Backpropagation   Kohonen
  Training set: correctly classified cases         84%           91%
  Test set: correctly classified cases              65%           77%
  Undetermined cases                                 -            11%

The correctly classified cases are given as percentages of all test patterns. There are no undetermined cases for the backpropagation network, while for the Kohonen network the total percentage of such cases is 11%. It would be interesting to estimate the approximate returns of these strategies, but, unfortunately, the author does not have information about the costs per deal and other possible costs.

5 Conclusions

The results of this study indicate that a neural network is able to capture the relationships in stock price changes over time and make respectable predictions. The backpropagation network is something of a standard in many problem domains, especially in forecasting, but Kohonen self-organizing maps may be used more efficiently in this particular case. The main advantage of the Kohonen network here is that we can easily determine the untradable days, thus reducing the risk of losses. Finally, future research will be connected with further study of the possibilities of applying Kohonen networks in the area of forecasting. Another direction of experiments is counterpropagation networks, which are a hybrid model of backpropagation and Kohonen networks. This kind of network can be used not only for interval prediction, but also for single price value prediction.

References

[1] Baestaens D. E., Van den Bergh W. M. (1995) Tracking the Amsterdam Stock Index Using Neural Networks. Neural Networks in Capital Markets, Vol. 5, pp. 149-161.
[2] Fausett L. (1994) Fundamentals of Neural Networks: Architectures, Algorithms and Applications. Prentice Hall, Inc., pp. 169-187.
[3] Hean-Lee Poh, Jingtao Yao, Teo Jasic (1998) Neural Networks for the Analysis and Forecasting of Advertising and Promotion Impact. International Journal of Intelligent Systems in Accounting, Finance & Management, Vol. 7, pp. 253-268.
[4] Refenes A. N., Azema-Barac M., Chen L., Karoussos S. A. (1993) Currency Exchange Rate Prediction and Neural Network Design Strategies. Springer-Verlag London Limited, pp. 46-58.
[5] Refenes A. N., Zapranis A., Francis G. (1994) Stock Performance Modelling Using Neural Networks. Neural Networks, Vol. 7, No. 2, pp. 357-388.
[6] Rojas R. (1996) Neural Networks: A Systematic Introduction. Springer, Berlin, pp. 389-410.
[7] Zirilli J. S. (1997) Financial Prediction Using Neural Networks. International Thomson Computer Press, London, 135 p.
[8] Zurada J. M. (1992) Introduction to Artificial Neural Systems. West Publishing Company, St. Paul, 684 p.