Unsupervised Hebbian Learning

Discussion in 'General Artificial Intelligence Discussions' started by SystemsLock, Jun 22, 2010.

  1. SystemsLock

    SystemsLock Guest

    I've been doing some reading about neural networks and I've stumbled on the wonders of Hebbian learning. I love it because it seems to be the most biologically faithful representation of neural plasticity. I'm looking to build an unsupervised, feed-forward Hebbian net to recognize commonly occurring patterns in noisy data.

    I've spent a great deal of time scouring the web, yet there are still a few gaping holes in my understanding, and I still can't decide which algorithm to use.

    Nearly everything I read deals with Hebbian learning in a single-layer setting. Obviously, an accurate model will need multiple layers. I've read many different articles describing completely different algorithms (Oja's rule, the generalized Hebbian algorithm, BCM theory, etc.). What are their pros and cons, and which is best suited to my situation (simplicity is key)? I have also found almost nothing describing the proper activation functions to use!
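
    To make this concrete, here's a rough Python/NumPy sketch of what I think the single-unit case looks like, comparing the plain Hebbian update with Oja's rule. This is just my own toy code, so the pattern, learning rate, and variable names are my own assumptions, not taken from any particular paper:

    Code:
        import numpy as np

        # Toy comparison of plain Hebb vs. Oja's rule on a single linear unit.
        # The input stream is one recurring pattern plus Gaussian noise; Oja's
        # extra -y^2 * w decay term keeps the weight norm bounded, while the
        # plain Hebbian update lets it grow without limit.

        rng = np.random.default_rng(0)

        pattern = np.array([1.0, 1.0, 0.0, -1.0])
        pattern /= np.linalg.norm(pattern)          # the "commonly occurring pattern"

        def sample(n):
            # each input = random strength * pattern + Gaussian noise
            coeff = rng.normal(size=(n, 1))
            noise = 0.3 * rng.normal(size=(n, pattern.size))
            return coeff * pattern + noise

        def train(rule, eta=0.01, steps=5000):
            w = rng.normal(scale=0.1, size=pattern.size)
            for x in sample(steps):
                y = w @ x                           # linear activation
                if rule == "hebb":
                    w += eta * y * x                # dw = eta * y * x  (unbounded growth)
                else:                               # Oja's rule
                    w += eta * y * (x - y * w)      # dw = eta * y * (x - y * w)
            return w

        w = train("oja")
        print("weight norm:", np.linalg.norm(w))                      # settles near 1
        print("alignment with pattern:", abs(pattern @ w) / np.linalg.norm(w))

    From what I can tell, the generalized Hebbian algorithm (Sanger's rule) extends this same update to several output units so they extract successive components, but I still haven't found a clear explanation of how any of this stacks into multiple layers or what nonlinearity to use between them.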

    You'd think there would be simple answers to these questions, but despite how much I've read I'm still having a tough time getting past the technical jargon. The deeper I read, the farther I get from understanding. I've read enough to know roughly what to do, yet I lack crucial details to actually do it. What I really need is someone experienced enough to answer some pointed questions and maybe bounce ideas off of. Any help is much appreciated.

    You could link me to a tutorial/article but chances are I've already read it.
     
  2. allie

    allie Guest

    OK, so can you link us to a good tutorial/article explaining more of the principles involved in Hebbian learning? Thanks!
     
