
Conversation

@HIT-cwh (Collaborator) commented on Dec 11, 2025

No description provided.

@HIT-cwh changed the title from "[Bug] Fix fp8 bug" to "[Bug] Fix fp8 bugs: 1. support only fp8 quant gemm or grouped gemm 2. precompute float8 dynamic scale before compute_actor_logprobs during rl training" on Dec 29, 2025
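
For context on the second fix in the title: "precompute float8 dynamic scale before compute_actor_logprobs" generally means computing the per-tensor amax-based fp8 scales once up front, rather than recomputing them inside every fp8 (grouped) gemm call during the RL log-prob pass. The sketch below is a minimal, hypothetical illustration in PyTorch of that idea; `precompute_fp8_scale`, `quantize_fp8`, and the `fp8_scales` argument are assumptions for illustration, not APIs from this PR or from the InternLM codebase.

```python
import torch

FP8_MAX = torch.finfo(torch.float8_e4m3fn).max  # 448.0 for e4m3fn


@torch.no_grad()
def precompute_fp8_scale(weight: torch.Tensor, eps: float = 1e-12) -> torch.Tensor:
    """Per-tensor dynamic scale such that weight / scale fits the fp8 range."""
    amax = weight.abs().amax().float().clamp_min(eps)
    return amax / FP8_MAX


def quantize_fp8(weight: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    """Quantize to float8_e4m3fn using a precomputed dynamic scale."""
    return (weight.float() / scale).clamp(-FP8_MAX, FP8_MAX).to(torch.float8_e4m3fn)


def compute_actor_logprobs_with_fp8(actor: torch.nn.Module, batch):
    # Hypothetical RL step: compute all dynamic scales once, before the
    # actor log-prob forward pass, instead of inside each fp8 gemm call.
    scales = {name: precompute_fp8_scale(p) for name, p in actor.named_parameters()}
    # `fp8_scales` is an illustrative keyword, standing in for however the
    # fp8 (grouped) gemm kernels actually receive their precomputed scales.
    return actor(batch, fp8_scales=scales)
```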
@CyCle1024 CyCle1024 merged commit 143752b into InternLM:main Jan 4, 2026
3 of 4 checks passed
