[Bug] Fix fp8 bugs: 1. support only fp8 quant gemm or grouped gemm; 2. precompute float8 dynamic scale before compute_actor_logprobs during RL training (#1356)
Merged
CyCle1024 merged 2 commits into InternLM:main on Jan 4, 2026
Commits
Commits on Jan 4, 2026
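The second fix in the title precomputes the float8 dynamic scale before `compute_actor_logprobs` runs, so the quantized GEMM reuses a ready-made scale instead of deriving it inside the hot path. As an illustration only (this is not the PR's code; the function names and the pure-Python fake-quantization are hypothetical), per-tensor dynamic fp8 scaling typically maps the tensor's absolute maximum onto the representable fp8 range:

```python
# Illustrative sketch of per-tensor dynamic float8 (e4m3) scaling.
# Names and structure are hypothetical, not taken from the PR.

FP8_E4M3_MAX = 448.0  # largest finite value representable in float8 e4m3


def compute_fp8_dynamic_scale(values, eps=1e-12):
    """Return the scale that maps max(|x|) onto the fp8 e4m3 range.

    Precomputing this once per tensor lets later quantized GEMMs
    reuse it instead of rescanning the tensor for its amax.
    """
    amax = max(abs(v) for v in values)
    return FP8_E4M3_MAX / max(amax, eps)


def fake_quantize_fp8(values, scale):
    """Scale, clamp to the fp8 range, then unscale (simulated round-trip)."""
    return [
        max(-FP8_E4M3_MAX, min(FP8_E4M3_MAX, v * scale)) / scale
        for v in values
    ]


# Precompute the scale once, before the log-prob / GEMM step.
weights = [0.5, -2.0, 1.25]
scale = compute_fp8_dynamic_scale(weights)
quantized = fake_quantize_fp8(weights, scale)
```

Since all sample values lie within the scaled fp8 range, the simulated round-trip here is lossless; the benefit of precomputation shows up when the same scale is consumed repeatedly across GEMM calls.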