Documentation: --freeze-backbone #739

@jbarth-ubhd

Description

From https://kraken.re/main/training/rectrain.html: »If the new dataset is fairly dissimilar or your base model has been pretrained with ketos pretrain, use --warmup in conjunction with --freeze-backbone for 1 or 2 epochs.«

But later on that page, the following command line is given in the section »Unsupervised recognition pretraining«:
ketos train -i pretrain_best.mlmodel --warmup 5000 --freeze-backbone 1000 ...

Is »Unsupervised recognition pretraining« a completely different use case, so that it uses »1000« instead of »1 or 2«?
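
For what it's worth, a minimal sketch of how the two passages might line up, assuming --freeze-backbone is counted in optimizer steps rather than epochs (my assumption, not stated on that page); the dataset size below is invented for illustration:

```shell
# Assumption: --freeze-backbone takes a number of training steps, not epochs.
# One epoch is roughly dataset_size / batch_size steps, so with a
# hypothetical dataset of ~1000 lines at batch size 1, »1000« would be
# about one epoch, matching the »1 or 2 epochs« advice quoted above.
# The trailing ... stands for the remaining arguments elided in the docs.
ketos train -i pretrain_best.mlmodel --warmup 5000 --freeze-backbone 1000 ...
```

Under that reading, »1000« would simply be »1 or 2 epochs« expressed in steps for a particular dataset size rather than a genuinely different recommendation, but the docs don't say so explicitly.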

Kind regards, Jochen
