From https://kraken.re/main/training/rectrain.html : »If the new dataset is fairly dissimilar or your base model has been pretrained with ketos pretrain, use --warmup in conjunction with --freeze-backbone for 1 or 2 epochs.«
But later on that page, the following command line is given in the section »Unsupervised recognition pretraining«:
ketos train -i pretrain_best.mlmodel --warmup 5000 --freeze-backbone 1000 ...
Is »Unsupervised recognition pretraining« a completely different use case, which is why it uses »1000« instead of »1 or 2«?
Kind regards, Jochen