Network dropout and learning rate warmup requested as additional training parameters

Network dropout of about 20 percent made all the difference for my SD 1.5 models and is a powerful generalization tool. Learning rate warmup is, I think, essential for a finely tuned model when using the cosine scheduler, and it would be great to be able to adjust it.
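For reference, a minimal sketch of what these two settings typically look like in a plain PyTorch training loop; the parameter names and values (`network_dropout`, `warmup_steps`, `total_steps`) are illustrative assumptions, not taken from this trainer's actual options.

```python
import math
import torch

# Hypothetical values for illustration only.
network_dropout = 0.2   # ~20% dropout on the trained network modules
warmup_steps = 200      # linear warmup before the cosine decay begins
total_steps = 2000      # total optimizer steps for the run

# Dropout to be applied inside the network being trained
# (e.g. on LoRA down/up projection activations).
dropout = torch.nn.Dropout(p=network_dropout)

# A stand-in parameter group representing the trainable weights.
params = [torch.nn.Parameter(torch.randn(4, 4))]
optimizer = torch.optim.AdamW(params, lr=1e-4)

def warmup_cosine(step: int) -> float:
    """Scale factor for the base LR: linear warmup, then cosine decay to 0."""
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * (1.0 + math.cos(math.pi * progress))

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=warmup_cosine)

for step in range(total_steps):
    # ... forward/backward pass would apply `dropout` to intermediate activations ...
    optimizer.step()
    scheduler.step()
```

Exposing just the dropout probability and the warmup step count as user-facing parameters would cover both requests.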

Status: Awaiting Dev Review

Board: 💡 Feature Request

Date: Over 1 year ago

Author: CruzFlesh
