
FutureWarning: torch.cuda.amp.GradScaler(args...) is deprecated. #635

Description

@si-open

train.py:196: FutureWarning: torch.cuda.amp.GradScaler(args...) is deprecated. Please use torch.amp.GradScaler('cuda', args...) instead.
scaler = torch.cuda.amp.GradScaler(enabled=(dtype == 'float16'))
num decayed parameter tensors: 26, with 10,755,456 parameters
num non-decayed parameter tensors: 13, with 4,992 parameters
using fused AdamW: True
compiling the model... (takes a ~minute)
W0903 20:13:19.043000 55763 torch/_inductor/utils.py:1436] [0/0] Not enough SMs to use max_autotune_gemm mode
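
For reference, the warning itself points at the one-line change in train.py. A minimal sketch of that replacement, assuming a PyTorch version (2.4+) where torch.amp.GradScaler accepts the device type as its first argument:

```python
import torch

dtype = 'float16'  # or 'bfloat16' / 'float32', as configured in train.py

# Old form that triggers the FutureWarning:
# scaler = torch.cuda.amp.GradScaler(enabled=(dtype == 'float16'))

# Replacement suggested by the warning: pass the device type explicitly.
scaler = torch.amp.GradScaler('cuda', enabled=(dtype == 'float16'))
```

Behavior should be unchanged; the deprecated torch.cuda.amp.GradScaler is just the CUDA-specific alias of the device-generic torch.amp.GradScaler.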
