Description
At the moment the creation of memory blocks is rather static. In this issue I want to evaluate the possibility of running a local AI container with a rather small model that can help create a comprehensive story for the memory.
Requirements
- Low footprint: we do not want to require a GPU. Generation time is rather flexible.
- Multi-language support: it should be able to generate text in multiple languages.
- No setup effort: the user should just spin up the compose file.
- Reitti should fall back to the current way of generating blocks if no LLM is available (a rough sketch of this follows below).
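To make the fallback requirement concrete, here is a minimal sketch (not existing Reitti code) of how story generation could first try a local LLM container and fall back to the current static block text on any failure. The class name, the `llm` service host, and the Ollama-style `/api/generate` endpoint are assumptions for illustration only; the actual container, model, and endpoint would be decided as part of this issue.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

/**
 * Hypothetical sketch: try a local LLM container for the memory story,
 * otherwise fall back to the current static block generation.
 */
public class MemoryStoryGenerator {

    // Assumed compose service name and Ollama-style endpoint; configurable in practice.
    private static final String LLM_URL = "http://llm:11434/api/generate";

    private final HttpClient http = HttpClient.newBuilder()
            .connectTimeout(Duration.ofSeconds(2))
            .build();

    public String generateStory(String visitSummary, String language) {
        try {
            // Include the target language in the prompt to cover the multi-language requirement.
            String prompt = "Write a short story in " + language
                    + " about this visit: " + visitSummary;
            String body = """
                    {"model":"small-model","prompt":"%s","stream":false}"""
                    .formatted(prompt.replace("\"", "\\\""));

            HttpRequest request = HttpRequest.newBuilder(URI.create(LLM_URL))
                    .timeout(Duration.ofSeconds(60)) // generation time is flexible
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();

            HttpResponse<String> response =
                    http.send(request, HttpResponse.BodyHandlers.ofString());

            if (response.statusCode() == 200) {
                return extractResponseField(response.body());
            }
        } catch (Exception e) {
            // Container not running, unreachable, or timed out: drop through to the fallback.
        }
        return staticStory(visitSummary); // current, non-LLM block generation
    }

    private String extractResponseField(String json) {
        // Placeholder: a real implementation would parse the JSON with a proper library.
        return json;
    }

    private String staticStory(String visitSummary) {
        // Placeholder for the existing static memory-block text.
        return "You visited: " + visitSummary;
    }
}
```

The short connect timeout is the design point here: if no LLM container is reachable, the check fails fast and the user still gets the current static block, so enabling the container stays purely optional.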