Add batch normalization and Adam optimizer #114

Open · 2 of 5 tasks
rweed opened this issue Jan 13, 2023 · 0 comments
Comments

@rweed
Contributor

rweed commented Jan 13, 2023

Here are five things I would like to see added to neural-fortran:

  1. Batch normalization layers
  2. Dropout layers (although for what I'm working on now batch normalization works much better)
  3. Adam optimizer (see the update-rule sketch after this list)
  4. K-fold cross-validation
  5. Linear layer for output (need this for regression)
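
For reference, the Adam optimizer in item 3 is the standard update from Kingma & Ba (2015). A rough Fortran sketch of one update step follows; all names here are placeholders for illustration, not a proposal for neural-fortran's actual API:

```fortran
! Hypothetical sketch of one Adam step for a single weight array.
! m and v are the running moment estimates, t is the step counter.
subroutine adam_update(w, grad, m, v, t, lr, beta1, beta2, eps)
  real, intent(inout) :: w(:), m(:), v(:)
  real, intent(in) :: grad(:), lr, beta1, beta2, eps
  integer, intent(in) :: t
  real :: m_hat(size(w)), v_hat(size(w))
  m = beta1 * m + (1 - beta1) * grad         ! first-moment (mean) estimate
  v = beta2 * v + (1 - beta2) * grad**2      ! second-moment estimate
  m_hat = m / (1 - beta1**t)                 ! bias correction for step t
  v_hat = v / (1 - beta2**t)
  w = w - lr * m_hat / (sqrt(v_hat) + eps)   ! parameter update
end subroutine adam_update
```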

Also, FYI: I was able to get neural-fortran to compile and run on a Cray XC40 using Cray's compilers, but I had to do a little code surgery to make it work. Namely,

I had to eliminate the dependency on both functional (you're only using the reverse function, so pulling in an entire library for just one function that's simple to replicate is overkill IMHO) and h5fortran (the CCE compilers did not like some of the unlimited polymorphic logic). I just figured out what was needed and replicated it in a standalone utility.
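
For illustration, something along these lines is all that's needed for the reverse replacement; this is a sketch for a single type/kind only (the library's version is generic), and the module/function names are placeholders rather than my actual utility:

```fortran
! Hypothetical standalone replacement for reverse, the only routine used
! from the functional library in this context.
module reverse_util
  implicit none
  private
  public :: reverse
contains
  pure function reverse(x) result(res)
    integer, intent(in) :: x(:)
    integer :: res(size(x))
    res = x(size(x):1:-1)  ! array section with negative stride reverses the order
  end function reverse
end module reverse_util
```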

Since I have access to a Cray with tens of thousands of processors, I would like to use it both to train on very large datasets and to use a Keras-generated model to predict thousands of potential values.
