
Conversation


@alecmocatta alecmocatta commented Aug 8, 2020

I've vendored what looks to me like a nice, well-written, pure-Rust library (which can optionally use OpenBLAS, Intel MKL, Netlib, or Apple's Accelerate framework under the hood). It contains basic but relatively high-quality implementations of Stochastic Gradient Descent and an LSTM RNN, with Hogwild-style parallelisation; a minimal sketch of the Hogwild idea follows the task list below.

  • Tests passing reliably
  • Integrate into amadeus
    • Parallel
    • Distributed
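
For anyone unfamiliar with Hogwild-style parallelisation: the idea is that several worker threads run SGD against a single shared parameter vector without any locking, accepting occasional lost or stale updates because sparse gradients rarely collide. The sketch below is not the vendored library's API; it is a minimal, std-only illustration of that technique on a hypothetical toy regression problem (all names in it are made up for the example).

```rust
// Minimal Hogwild-style parallel SGD sketch (illustrative only, not the
// vendored library's API). Workers update a shared weight vector with
// relaxed atomic stores, i.e. racy read-modify-write, as Hogwild allows.

use std::sync::atomic::{AtomicU64, Ordering};
use std::sync::Arc;
use std::thread;

// Shared parameters stored as bit-cast f64s so threads can write lock-free.
struct Params(Vec<AtomicU64>);

impl Params {
    fn new(n: usize) -> Self {
        Params((0..n).map(|_| AtomicU64::new(0f64.to_bits())).collect())
    }
    fn get(&self, i: usize) -> f64 {
        f64::from_bits(self.0[i].load(Ordering::Relaxed))
    }
    fn add(&self, i: usize, delta: f64) {
        // Hogwild: no compare-and-swap; lost updates are tolerated.
        let new = self.get(i) + delta;
        self.0[i].store(new.to_bits(), Ordering::Relaxed);
    }
}

fn main() {
    // Toy problem: fit y = 2*x with squared loss using plain SGD.
    let params = Arc::new(Params::new(1));
    let data: Vec<(f64, f64)> = (0..1000)
        .map(|i| (i as f64 / 1000.0, 2.0 * i as f64 / 1000.0))
        .collect();
    let lr = 0.1;

    let handles: Vec<_> = (0..4usize)
        .map(|t| {
            let params = Arc::clone(&params);
            let data = data.clone();
            thread::spawn(move || {
                // Each worker sweeps its own strided slice of the data,
                // updating the shared weight without any locks.
                for &(x, y) in data.iter().skip(t).step_by(4) {
                    let w = params.get(0);
                    let grad = 2.0 * (w * x - y) * x;
                    params.add(0, -lr * grad);
                }
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }
    println!("learned w ≈ {:.3} (target 2.0)", params.get(0));
}
```

The actual library does the same thing at a higher level (per-layer parameter tensors rather than a single weight), but the lock-free, racy-update pattern is the relevant part for the parallel and distributed integration items above.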
