
Re: DM: Thanks and new questions


From: Omer F Rana
Date: Mon, 29 Sep 1997 08:11:30 -0400 (EDT)
} 
} Hi, 
} 
} 1. On the initialisation of neural networks.
} When gradient-based learning algorithms (like BP and so on) are adopted,
} if the initial error of the neural network is small or at its minimum,
} can I conclude that the initialisation of the neural network is good?
} Or, put simply: is the objective of neural network initialisation to
} pursue smaller initial errors?

No, you cannot make that assumption: you could be at a local minimum.
There are a number of ways of initialising neural networks. Some
heuristics, for instance, are:

1. The range of values from which the initial weights are selected
   should be small; this is said to prevent biasing the network before
   it even starts learning.

2. Use of a Gaussian distribution to initialise weight values rather
   than a uniform distribution. More on this can be found in the neural
   simulator NeuralWorks Professional.

3. Although initialisation is important, it is also crucial how you use
   other parameters within the network, namely the momentum, learning
   rate, and 'hedging factors' (if they are available) during learning.
   A number of neural network simulators allow for dynamic variation of
   these parameters. A short sketch of these heuristics follows this
   list.
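
By way of illustration, here is a minimal Python sketch of the three
heuristics above. The function names, ranges, and schedule constants are
purely illustrative, not taken from any particular simulator:

import numpy as np

rng = np.random.default_rng(0)

# Heuristic 1: draw initial weights from a small range, e.g. [-0.1, 0.1],
# so that no unit starts saturated and the network is not biased before
# it even begins learning.
def init_uniform(n_in, n_out, scale=0.1):
    return rng.uniform(-scale, scale, size=(n_in, n_out))

# Heuristic 2: draw initial weights from a zero-mean Gaussian instead of
# a uniform distribution; the standard deviation plays the role of the
# range above.
def init_gaussian(n_in, n_out, std=0.1):
    return rng.normal(0.0, std, size=(n_in, n_out))

# Heuristic 3: vary the learning rate and momentum dynamically during
# training, here by decaying the rate and raising the momentum per epoch.
def schedule(epoch, lr0=0.5, decay=0.01, mom0=0.5, mom_max=0.9):
    lr = lr0 / (1.0 + decay * epoch)
    momentum = min(mom_max, mom0 + 0.01 * epoch)
    return lr, momentum

W = init_gaussian(10, 5)
for epoch in range(100):
    lr, momentum = schedule(epoch)
    # ... one backpropagation pass with these parameter values goes here ...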

To test your neural network, we have developed the 'Gamma Test', which
can be applied to data prior to training a neural network and will tell
you the best mean-squared error you can expect. The Gamma Test assumes
that the underlying model is continuous, however, and does not perform
well with *some* discrete models. You can download the Gamma Test
software from:

   http://www.cs.cf.ac.uk/Evolutionary_Computing/
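
The distributed software does the real work, but the rough idea is to
compute near-neighbour statistics and extrapolate them to zero input
distance. A minimal Python sketch of that idea (my own illustration, not
the distributed code; the function name and parameters are illustrative):

import numpy as np

def gamma_test(X, y, k=10):
    # For each point, find its k nearest neighbours in input space.
    # For each neighbour rank, average the squared input distances
    # (delta) and half the squared output differences (gamma), then fit
    # the line gamma = A*delta + G. The intercept G, i.e. gamma
    # extrapolated to zero input distance, estimates the noise variance:
    # the best mean-squared error any smooth model can reach.
    n = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    order = np.argsort(d2, axis=1)[:, :k]
    deltas, gammas = [], []
    for j in range(k):
        nbr = order[:, j]
        deltas.append(d2[np.arange(n), nbr].mean())
        gammas.append(0.5 * ((y[nbr] - y) ** 2).mean())
    A, G = np.polyfit(deltas, gammas, 1)
    return G

# Noisy sine data: the result should be close to 0.1**2 = 0.01.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(2000, 1))
y = np.sin(X[:, 0]) + rng.normal(0.0, 0.1, size=2000)
print(gamma_test(X, y))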

You may also be interested in a list of neural network pointers I
maintain; the link on 'Backpropagation Review' there is particularly
useful.

   http://www-asds.doc.ic.ac.uk/~ofr/neural3.html

} 
} 2. The construction algorithms of neuro-fuzzy systems.
} For a neuro-fuzzy system, I think that the system model can be
} constructed before the learning process, based on the information
} provided by the samples. Has anyone done such work before?

There are a number of ways of 'fusing' the neuro-fuzzy idea. One is the
'Fuzzy Integral' in:

@article{kim95,
   author  = {S.-B. Cho and J. H. Kim},
   title   = {Combining Multiple Neural Networks by Fuzzy Integral
              for Robust Classification},
   journal = {IEEE Transactions on Systems, Man and Cybernetics},
   volume  = {25},
   number  = {2},
   pages   = {380--384},
   year    = {1995}
}
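
As a rough illustration of how that combination works, here is a minimal
Python sketch of the Sugeno fuzzy integral with a lambda-fuzzy measure,
which is the scheme used in that paper as I recall. The function names
and the example numbers are illustrative only:

import numpy as np

def lambda_measure(g):
    # Solve prod(1 + lam*g_i) = 1 + lam for the Sugeno parameter lam.
    coeffs = np.array([1.0])
    for gi in g:                                 # build prod(1 + gi*lam),
        coeffs = np.convolve(coeffs, [1.0, gi])  # ascending powers of lam
    coeffs[0] -= 1.0                             # subtract (1 + lam);
    coeffs[1] -= 1.0                             # lam = 0 is a trivial root
    roots = np.roots(coeffs[1:][::-1])           # factor out lam, solve rest
    real = roots[np.abs(roots.imag) < 1e-9].real
    return float(real[real > -1.0].min())        # the unique root in (-1, inf)

def sugeno_integral(h, g):
    # Fuzzy integral of classifier supports h w.r.t. densities g.
    h, g = np.asarray(h, float), np.asarray(g, float)
    lam = lambda_measure(g)
    order = np.argsort(h)[::-1]                  # sort supports descending
    h, g = h[order], g[order]
    G = g[0]                                     # measure of the top-ranked set
    best = min(h[0], G)
    for i in range(1, len(h)):
        G = g[i] + G + lam * g[i] * G            # recursive lambda-measure
        best = max(best, min(h[i], G))
    return best

# Support of three networks for one class, with densities reflecting each
# network's (hypothetical) reliability; the class with the largest
# integral over all classes is chosen.
print(sugeno_integral([0.8, 0.6, 0.9], [0.3, 0.4, 0.2]))   # about 0.6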
 
Other approaches are those of Hinton et al. in their mixture-of-experts
work, and those of Kosko.
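
A toy sketch of the gating idea behind a mixture of experts (assumed,
fixed experts and gate here; in practice both are trained jointly by
gradient descent):

import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Two toy "experts", each suited to a different region of the input.
experts = [lambda x: 2.0 * x, lambda x: -x + 3.0]
gate_w = np.array([1.0, -1.0])          # gating weights (illustrative)

def predict(x):
    g = softmax(gate_w * x)             # input-dependent mixing weights
    return sum(gi * f(x) for gi, f in zip(g, experts))

print(predict(0.5), predict(4.0))       # blends, then favours expert 1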

If Warren Sarle is on this list (which I know he is) ;-),
I am sure he can give you some references to his work at SAS.

regards
Omer

-- 
(http://www-asds.doc.ic.ac.uk/~ofr/)(http://www.cs.cf.ac.uk/User/O.F.Rana/)
                 (work:01222 874000 x 5542)(play:0956-299-981)
      room s/2.03, dept of computer science, university of wales - cardiff,
                         po box 916, cardiff cf2 3xf, uk
        
----------------------------------------------------------------
           "I haven't lost my mind; I know exactly where I left it."


