
[FEATURE] Support latest version of transformers #393

@ParagEkbote

Description

Is your feature request related to a problem? Please describe.

Currently, only transformers v4.51.0 is supported because of the version constraints imposed by llmcompressor and gliner, as shown in this pipdeptree output:

transformers==4.51.0
├── llmcompressor==0.6.0.1 [requires: transformers>4.0,<=4.52.4]
│   └── pruna==0.2.10 [requires: llmcompressor]
├── whisper-s2t==1.3.1 [requires: transformers]
│   └── pruna==0.2.10 [requires: whisper-s2t==1.3.1]
├── gliner==0.2.22 [requires: transformers>=4.38.2,<=4.51.0]
│   └── pruna==0.2.10 [requires: gliner]
├── DeepCache==0.1.1 [requires: transformers]
│   └── pruna==0.2.10 [requires: DeepCache]
├── optimum==1.27.0 [requires: transformers>=4.29]
│   └── whisper-s2t==1.3.1 [requires: optimum]
│       └── pruna==0.2.10 [requires: whisper-s2t==1.3.1]
├── pruna==0.2.10 [requires: transformers]
├── compressed-tensors==0.10.2 [requires: transformers]
│   └── llmcompressor==0.6.0.1 [requires: compressed-tensors==0.10.2]
│       └── pruna==0.2.10 [requires: llmcompressor]
└── hqq==0.2.7.post1 [requires: transformers>=4.36.1]
    └── pruna==0.2.10 [requires: hqq==0.2.7.post1]

This leads to KeyErrors for newer models such as SmolLM3-3B and GLM-4.5V. While this can be worked around with pip install --upgrade transformers, it would be better to support newer releases directly and avoid these silent failures. Also, since transformers v5 is right around the corner, the problem is likely to become more prominent. WDYT?
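For context, here is a minimal repro sketch of how the failure typically surfaces. The Hub id HuggingFaceTB/SmolLM3-3B and the exact exception types are assumptions for illustration; inside pruna the error comes up through its own loading path:

```python
# Minimal repro sketch (assumption: transformers==4.51.0 pinned via pruna's deps,
# and HuggingFaceTB/SmolLM3-3B as the Hub id for SmolLM3-3B).
from transformers import AutoConfig

model_id = "HuggingFaceTB/SmolLM3-3B"

try:
    # On a transformers release that predates the smollm3 architecture, the
    # auto-class lookup cannot resolve the model type and fails here; depending
    # on the code path this surfaces as a KeyError or ValueError rather than a
    # clear "please upgrade transformers" hint.
    config = AutoConfig.from_pretrained(model_id)
    print(f"Loaded config for {model_id}: {config.model_type}")
except (KeyError, ValueError) as err:
    print(f"transformers too old for {model_id}: {err}")
```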

cc: @davidberenstein1957 , @johannaSommer

Describe the solution you'd like

Please support the latest version of transformers for better model architecture coverage.
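Until the upper bounds are relaxed, one possible interim mitigation is an explicit version check so the failure is loud and actionable instead of an opaque KeyError. A minimal sketch, assuming packaging is available and using 4.53.0 as a purely illustrative floor (not a verified minimum for SmolLM3-3B or GLM-4.5V):

```python
# Sketch of an explicit runtime guard; the 4.53.0 floor is an assumed example,
# not a confirmed minimum version for SmolLM3-3B or GLM-4.5V support.
from packaging import version

import transformers

MIN_TRANSFORMERS = "4.53.0"

if version.parse(transformers.__version__) < version.parse(MIN_TRANSFORMERS):
    raise RuntimeError(
        f"Installed transformers {transformers.__version__} is older than "
        f"{MIN_TRANSFORMERS}; run `pip install --upgrade transformers` to load "
        "newer architectures such as SmolLM3-3B."
    )
```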

Labels: enhancement (New feature or request)
