Page 5 of 8
Re: Neural Network
Posted:
Fri Sep 20, 2013 11:24 am
by censix
Well, I would say that whatever Nick, yourself and others have in mind for implementing 1-layer MLP training on the Epiphany architecture, if it works, it would already be a HUGE success! And it would be very useful indeed.
Re: Neural Network
Posted:
Fri Sep 20, 2013 2:44 pm
by roberto
Thinking about it more deeply, there is an alternative to the back-propagation algorithm for training a neural network: use a genetic algorithm (GA) to evolve a population of weight sets until one of them satisfies the network's error criterion. This is an interesting path to explore, because GAs have been shown to give a great speed-up in training, and in that case Parallella would be overkill, since a GA only needs a simple classic CPU to run on. I played with GAs in the past and they can give extraordinary results (a few minutes of computation) where a classic "brute force" algorithm would need centuries to produce a result. In that sense, back-propagation is conceptually a brute-force system, whether the network is multi-layer with few neurons per layer or single-layer with many neurons.
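To make the idea concrete, here is a minimal sketch of evolving a network's weights with a GA, with no gradients involved. The 2-2-1 network shape, the XOR fitness function, and all GA parameters (population size, one-point crossover, Gaussian mutation) are my own illustrative assumptions, not anything from this thread:

```python
import random

random.seed(42)

# Tiny 2-2-1 MLP; weights flattened into one chromosome of 9 floats
# (4 hidden weights + 2 hidden biases + 2 output weights + 1 output bias).
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(w, x):
    h = []
    for j in range(2):
        s = w[2 * j] * x[0] + w[2 * j + 1] * x[1] + w[4 + j]
        h.append(1.0 if s > 0 else 0.0)  # step activation: a GA needs no derivatives
    out = w[6] * h[0] + w[7] * h[1] + w[8]
    return 1.0 if out > 0 else 0.0

def fitness(w):
    # Negative count of wrong outputs: 0 means every XOR case is correct.
    return -sum((forward(w, x) - y) ** 2 for x, y in XOR)

def evolve(pop_size=50, generations=200):
    pop = [[random.uniform(-1, 1) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == 0:
            break  # a chromosome "satisfies" the network
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, 8)
            child = a[:cut] + b[cut:]          # one-point crossover
            i = random.randrange(9)
            child[i] += random.gauss(0, 0.3)   # mutate one gene
            children.append(child)
        pop = survivors + children
    pop.sort(key=fitness, reverse=True)
    return pop[0]

best = evolve()
print(fitness(best))
```

Note how the fitness function only needs forward passes, which is why a step activation works here even though it has no usable gradient for back-propagation.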
Re: Neural Network
Posted:
Sun Sep 22, 2013 2:48 pm
by roberto
Re: Neural Network
Posted:
Mon Sep 23, 2013 1:31 am
by over9000
Re: Neural Network
Posted:
Wed Sep 25, 2013 10:50 am
by roberto
Re: Neural Network
Posted:
Wed Sep 25, 2013 2:51 pm
by over9000
Re: Neural Network
Posted:
Wed Sep 25, 2013 6:06 pm
by censix
Re: Neural Network
Posted:
Thu Sep 26, 2013 5:42 pm
by CIB
Re: Neural Network
Posted:
Fri Sep 27, 2013 12:56 am
by over9000
Re: Neural Network
Posted:
Tue Oct 01, 2013 12:11 am
by nickoppen
Hi Guys,
Here is the second part of the design - Training. It is a little more complicated than the feed-forward pass, so please give it a read and let me know if I've missed anything.
Thanks,
nick
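For readers following along, the training pass under discussion is back-propagation. Below is a generic plain-Python sketch for a single-hidden-layer MLP; it is only an illustration under my own assumptions (a 2-2-1 network, sigmoid activations, XOR data, learning rate 0.5), not nick's actual Epiphany design:

```python
import math
import random

random.seed(1)
ETA = 0.5  # learning rate (arbitrary choice for the sketch)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 2-2-1 network: W1/B1 map inputs to the hidden layer, W2/B2 to the output.
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
B1 = [random.uniform(-1, 1) for _ in range(2)]
W2 = [random.uniform(-1, 1) for _ in range(2)]
B2 = random.uniform(-1, 1)

DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(x):
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + B1[j]) for j in range(2)]
    o = sigmoid(W2[0] * h[0] + W2[1] * h[1] + B2)
    return h, o

for _ in range(20000):
    for x, t in DATA:
        h, o = forward(x)
        # Output delta: derivative of squared error times sigmoid'(o).
        d_o = (o - t) * o * (1 - o)
        # Hidden deltas: the output delta propagated back through W2.
        d_h = [d_o * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient-descent weight updates.
        for j in range(2):
            W2[j] -= ETA * d_o * h[j]
            for i in range(2):
                W1[j][i] -= ETA * d_h[j] * x[i]
            B1[j] -= ETA * d_h[j]
        B2 -= ETA * d_o

print([forward(x)[1] for x, _ in DATA])
```

The key difference from the GA approach discussed earlier in the thread is that every update here needs the error gradient flowing backwards through the layers, which is exactly the part that takes care to parallelise.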