Releases: blueokanna/RustGLM
RustGLM v0.1.5
- Integrated CogView-4, CogView-3-Flash, GLM-4V-Plus, GLM-4V, and GLM-4V-Flash
- Optimized the code to reduce performance overhead
- Updated the old GLM-3-era request method to cover GLM-4-Air, GLM-4-Plus, and GLM-4-Flash
- No longer limited to terminal use: a set_user_input interface lets callers supply input programmatically (see the sketch after this list)
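As a rough illustration of the programmatic-input idea, the sketch below feeds text to a client through set_user_input instead of reading from stdin. Only set_user_input is named in these notes; the RustGlmClient type, its new constructor, and the chat method are hypothetical placeholders, not the crate's actual API.

```rust
// Minimal sketch, not the crate's real API: only `set_user_input` comes from the
// release notes; `RustGlmClient`, `new`, and `chat` are hypothetical names.
struct RustGlmClient {
    user_input: String,
}

impl RustGlmClient {
    fn new() -> Self {
        Self { user_input: String::new() }
    }

    // The interface highlighted in the notes: accept input from any caller,
    // not just an interactive terminal prompt.
    fn set_user_input(&mut self, input: impl Into<String>) {
        self.user_input = input.into();
    }

    // Placeholder for the request that would be sent to a GLM-4 endpoint.
    fn chat(&self) -> String {
        format!("(response to: {})", self.user_input)
    }
}

fn main() {
    let mut client = RustGlmClient::new();
    // Input can now come from a GUI, a web handler, a test harness, etc.
    client.set_user_input("Explain what RustGLM does.");
    println!("{}", client.chat());
}
```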
RustGLM v0.1.4
Update
- Integrated CogView-3 and GLM-4V
- Optimized the code to reduce performance overhead
- Uses TOML configuration files to cut duplicated code and simplify configuration (see the config sketch after this list)
- No longer limited to terminal use: a set_user_input interface lets callers supply input programmatically
- Removed saving the API key to a local file (keeps the crate easy to use)
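As a rough sketch of the TOML-driven setup, the snippet below deserializes a local Constants.toml-style file with the serde and toml crates (both added as dependencies). The section and field names (ai_config_glm4, language_model, system_role) are assumptions for illustration; the real schema is defined by RustGLM itself.

```rust
// Illustrative only: the section/field names below are assumptions, not the
// crate's documented Constants.toml schema.
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct GlmConfig {
    language_model: String,
    system_role: String,
}

#[derive(Debug, Deserialize)]
struct Constants {
    ai_config_glm4: GlmConfig,
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Keeping model and role settings in a TOML file avoids hard-coding them.
    let raw = std::fs::read_to_string("Constants.toml")?;
    let cfg: Constants = toml::from_str(&raw)?;
    println!(
        "model = {}, system role = {}",
        cfg.ai_config_glm4.language_model, cfg.ai_config_glm4.system_role
    );
    Ok(())
}
```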
RustGLM v0.1.2
Update
- Updated support for CogView-3 and GLM-4V
- Optimized the code to reduce performance overhead
- Uses TOML configuration files to cut duplicated code and simplify configuration
RustGLM v0.1.1
ChatGLM SDK for RustLang
A high-performance, reliable ChatGLM SDK for natural language processing in Rust
- The crate is published to the official crates.io repository, and you can build your own binaries. It supports both streaming and synchronous requests.
- Supports saving API keys locally and loading local API key files (see the sketch after this list).
- Runtime efficiency comparable to C and C++: the client can be called repeatedly at low performance cost.
- Ships with a default system_role setting for role-playing with ChatGLM-4, and supports context-aware, continuous conversations with ChatGLM-4.
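As a rough illustration of the local key-file idea (a feature later removed in v0.1.4), the sketch below saves and reloads an API key using only the standard library. The chatglm_api_key.txt filename and the helper functions are assumptions, not RustGLM's actual storage implementation.

```rust
// Minimal sketch of saving/loading an API key locally with std::fs only.
// File name and helpers are hypothetical; RustGLM's own format may differ.
use std::fs;
use std::io;
use std::path::Path;

fn save_api_key(path: &Path, key: &str) -> io::Result<()> {
    // Persist the key so it does not have to be retyped on every run.
    fs::write(path, key.trim())
}

fn load_api_key(path: &Path) -> io::Result<String> {
    // Read the key back and strip surrounding whitespace/newlines.
    Ok(fs::read_to_string(path)?.trim().to_string())
}

fn main() -> io::Result<()> {
    let path = Path::new("chatglm_api_key.txt");
    save_api_key(path, "your-api-key-here")?;
    let key = load_api_key(path)?;
    println!("loaded key of length {}", key.len());
    Ok(())
}
```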