Few improvements #32
Comments
Done in 04787b1
Done in 2a541da
Done in e4e9023
added that as an option for configurations, removed in
current streamlit cannot do that yet. (streamlit/streamlit#2058)
@ydcjeff this should be done for all templates.
I think we can close this issue as unrelated.
Currently tested on master: 78b0def
`pip install -r requirements.txt` to install streamlit with the appropriate version.

Change text: "Those in the parenthesis are used in the generated code." -> "Names in the parenthesis are variable names in the generated code." or something similar.
Let's explicitly create the trainer in the CIFAR10 example to show how to write `training_step`; a sketch follows.
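A minimal sketch of what that could look like, assuming `model`, `optimizer`, `criterion`, `device`, and `train_loader` are defined elsewhere (these names are illustrative, not necessarily the template's):

```python
from ignite.engine import Engine

# model, optimizer, criterion, device, train_loader are assumed
# to be defined elsewhere in the generated script.
def training_step(engine, batch):
    # One optimization step over a single batch.
    model.train()
    x, y = batch
    x, y = x.to(device), y.to(device)
    optimizer.zero_grad()
    y_pred = model(x)
    loss = criterion(y_pred, y)
    loss.backward()
    optimizer.step()
    return loss.item()

trainer = Engine(training_step)
trainer.run(train_loader, max_epochs=10)
```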
Let's add an AMP option; a sketch follows.
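A sketch of the AMP variant with `torch.cuda.amp`, under the same assumptions as above:

```python
from torch.cuda.amp import GradScaler, autocast

scaler = GradScaler()

def training_step_amp(engine, batch):
    model.train()
    x, y = batch
    x, y = x.to(device), y.to(device)
    optimizer.zero_grad()
    with autocast():
        # Forward pass and loss run in mixed precision.
        y_pred = model(x)
        loss = criterion(y_pred, y)
    # Scale the loss so small fp16 gradients do not underflow.
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
    return loss.item()
```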
Let's add an Error metric (to show how we can do metrics arithmetic), initialize it, and also set up a LR scheduler; see the sketch below.
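A sketch of both points; the `evaluator` engine and `optimizer` are assumed to exist, and the scheduler choice (`StepLR`) is only an example:

```python
from ignite.engine import Events
from ignite.metrics import Accuracy
from torch.optim.lr_scheduler import StepLR

# ignite metrics support arithmetic, so Error can be derived from Accuracy.
accuracy = Accuracy()
error = 1.0 - accuracy
accuracy.attach(evaluator, "accuracy")
error.attach(evaluator, "error")

# An example LR scheduler, stepped once per training epoch.
lr_scheduler = StepLR(optimizer, step_size=5, gamma=0.5)

@trainer.on(Events.EPOCH_COMPLETED)
def step_scheduler(engine):
    lr_scheduler.step()
```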
Distributed option, if used as a multiprocessing schema (`python main.py` spawning workers): multiple child processes have/had a known issue with dataloaders, where the first iteration of each epoch is very slow. To avoid that, let's prefer to tell the user to launch things with `torch.distributed.launch`.
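For reference, such a launch looks roughly like `python -m torch.distributed.launch --nproc_per_node=2 --use_env main.py` (exact flags depend on the torch version), and `main.py` then initializes the process group itself; a minimal sketch:

```python
import os

import torch
import torch.distributed as dist

# torch.distributed.launch sets MASTER_ADDR/MASTER_PORT and, with
# --use_env, the LOCAL_RANK environment variable for each child.
dist.init_process_group(backend="nccl")
local_rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(local_rank)
```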
I think this code is useless to add to `main.py` if `exp_logger` is `None`.
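A hypothetical guard illustrating the point, using ignite's `attach_output_handler` logger API (`exp_logger` and `trainer` as in the template):

```python
from ignite.engine import Events

# Only wire up experiment logging when a logger is actually configured.
if exp_logger is not None:
    exp_logger.attach_output_handler(
        trainer,
        event_name=Events.ITERATION_COMPLETED,
        tag="training",
        output_transform=lambda loss: {"loss": loss},
    )
```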
I'm a bit confused about this option: `eval_max_epochs` with its value of 2. It is something I've never seen before. I think we have to follow standard practice and by default run once over the validation dataloader (see the sketch below). Thoughts?
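That is, something like the following, assuming `evaluator` and `val_loader`:

```python
from ignite.engine import Events

@trainer.on(Events.EPOCH_COMPLETED)
def run_validation(engine):
    # Engine.run defaults to max_epochs=1: a single pass over the data.
    evaluator.run(val_loader)
```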
If possible, make the sidebar resizable from a minimum possible value up to a maximum value.