Scaled cosine similarity can cause NaNs #111

@bobcao3

Description

sqrt_scale = torch.sqrt(scale.to(dtype))

I know it's mentioned in the paper that this version of scaling parametrizes the scale directly instead of the exponent. However, an unintended side effect is that when the scale gets close to zero, a large random gradient update can push it negative, which causes a NaN.

The fix is simple: in our adaptation for our in-house models we changed it to torch.sqrt(torch.abs(scale) + eps). The eps is added to preserve gradients (the argument never reaches zero). A biased ReLU would probably also work, along with other nonlinear functions.
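A minimal numeric sketch of the failure mode and the proposed fix, using plain-Python math in place of torch (the function name safe_sqrt_scale and the eps value are illustrative, not from the original code):

```python
import math

def safe_sqrt_scale(scale: float, eps: float = 1e-6) -> float:
    # Proposed fix: take abs() and add a small eps so the argument to
    # sqrt is strictly positive. The eps keeps the input away from zero,
    # where the derivative of sqrt would otherwise blow up.
    return math.sqrt(abs(scale) + eps)

# A slightly negative scale, e.g. after a large gradient step:
scale = -0.01

# The original torch.sqrt(scale) returns NaN for negative inputs
# (math.sqrt would raise ValueError). The fixed version stays finite:
print(safe_sqrt_scale(scale))
```

With torch tensors the same expression, torch.sqrt(torch.abs(scale) + eps), is differentiable everywhere except exactly at scale == 0, and the eps offset keeps the gradient magnitude bounded near that point.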
