
Multi Hidden Layer Support for nn.js #61

Open
AR-234 wants to merge 5 commits into master

Conversation

@AR-234 commented Feb 11, 2018

Tested on MNIST and XOR with default settings and multiple hidden layers.
The examples don't need to be rewritten.

These two calls do the same thing:
NeuralNetwork(2, 4, 1)
NeuralNetwork(2, [4], 1)

For more layers, add nodes to the hidden_nodes array.
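
For illustration, here's a quick usage sketch (the 2-8-4-1 sizes and variable names below are made-up examples, not code from this PR):

```js
// Both of these build the same 2-4-1 network (per the description above):
const nn1 = new NeuralNetwork(2, 4, 1);
const nn2 = new NeuralNetwork(2, [4], 1);

// For more hidden layers, add entries to the hidden_nodes array,
// e.g. a 2-8-4-1 network:
const deep = new NeuralNetwork(2, [8, 4], 1);

// train() and predict() are unchanged (XOR-style example):
deep.train([0, 1], [1]);
const guess = deep.predict([0, 1]);
```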

Thanks @xxMrPHDxx for helping to fix the serialization <3

@shiffman (Member)

This is really impressive, thank you! I think I'd like to implement this as a tutorial... it's similar to how I was thinking about encapsulating the "Layer" object. I'm trying to decide whether I should merge now or wait until I cover it in a video.

@AR-234 (Author) commented Feb 11, 2018

Thanks! This is my first GitHub journey; normally I don't release anything because I end up writing hacky command-line applications for myself.

I've updated it to the current branch. You've still got the 10.18 version somewhere here; I'd really like to see your approach on that.

@Versatilus (Collaborator)

This is similar to how I have done it in the past.

I think each layer should keep its working values both before and after activation. This would allow two things: a wider variety of activation functions per #70, and code reuse from the forward pass when training. Duplicating the same code in both the predict and train methods makes them very easy to accidentally break.

Also, if you calculate all of the gradients first the weights can potentially be updated in parallel.

I think the advantages outweigh the increased memory use.
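
For illustration, a minimal sketch of that idea (the Layer class, method names, and plain-array math below are hypothetical, not code from this PR or nn.js):

```js
class Layer {
  constructor(inputCount, nodeCount) {
    // Random weights and biases in [-1, 1]
    this.weights = Array.from({ length: nodeCount }, () =>
      Array.from({ length: inputCount }, () => Math.random() * 2 - 1)
    );
    this.biases = Array.from({ length: nodeCount }, () => Math.random() * 2 - 1);
    this.z = []; // weighted sums, before activation
    this.a = []; // outputs, after activation
  }

  forward(inputs, activation) {
    // Cache z and a so train() can reuse the same forward pass as predict()
    this.z = this.weights.map((row, i) =>
      row.reduce((sum, w, j) => sum + w * inputs[j], this.biases[i])
    );
    this.a = this.z.map(activation);
    return this.a;
  }
}

// predict() becomes a chain of forward() calls over the layers; train()
// runs the same chain, then reads each layer's cached z and a during
// backpropagation. Computing all gradients before touching any weights
// is what makes the per-layer updates parallelizable.
const sigmoid = x => 1 / (1 + Math.exp(-x));
const hidden = new Layer(2, 4);
const output = new Layer(4, 1);
const result = output.forward(hidden.forward([0, 1], sigmoid), sigmoid);
```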

@AR-234 (Author) commented Feb 14, 2018

@Versatilus Yeah, I know, but since those changes aren't implemented yet, I'll wait for the master branch to update before I push any changes of that kind. Or at least, I don't know how to create a branch with merges from multiple other branches; I haven't used git that much.

But I like the idea. Another problem is that the NeuralNetwork is too tied to supervised learning; I'd like to refactor the training and predict parts as well.

@LordCaox commented Jul 14, 2018

This is really nice. I've been playing with it a bit and have to say it works well.

Now that we have multiple hidden layers, any advice on how to implement dropout and LSTM?
