From 388d4cf750dd4f8ed435b177594d35a56a268270 Mon Sep 17 00:00:00 2001
From: Mini256
Date: Mon, 12 May 2025 00:28:29 +0800
Subject: [PATCH 1/3] docs: add release 0.5.0 docs

---
 docs/src/content/releases/v0.5.0.md | 22 ++++++++++++++++++++++
 1 file changed, 22 insertions(+)
 create mode 100644 docs/src/content/releases/v0.5.0.md

diff --git a/docs/src/content/releases/v0.5.0.md b/docs/src/content/releases/v0.5.0.md
new file mode 100644
index 000000000..0c5877261
--- /dev/null
+++ b/docs/src/content/releases/v0.5.0.md
@@ -0,0 +1,22 @@
+# Release Notes for v0.5.0
+
+## Highlights
+
+- Knowledge Base supports configuring text chunking settings
+- Add Bedrock Converse support
+- Improve Chat Engine settings
+  - Support selecting multiple Knowledge Bases
+  - Support switches to control whether the Further Questions and Rewrite Question features are used
+  - Support more options to control vector search retrieval
+- Support updating LLM / Embedding Model / Reranker Model configuration
+- Support reindexing a single document
+
+## Improvements
+
+- Change the default `context_window` of the Ollama LLM provider to 4096 tokens. (A context window that is too large may cost more GPU memory for a local LLM, while one that is too small may cause an insufficient context tokens error.)
+
+## Breaking Changes
+
+- The new version uses [RichPromptTemplate](https://docs.llamaindex.ai/en/stable/examples/prompts/rich_prompt_template_features/) to manage prompt templates. After upgrading, you need to update the old placeholders in the **Answer Question** prompt template, for example:
+  - `<<context_str>>` -> `{{context_str}}`
+  - `<<query_str>>` -> `{{query_str}}`

From eb0d07509b4c2c74351f5a93541ddc439b91fa97 Mon Sep 17 00:00:00 2001
From: Mini256
Date: Mon, 12 May 2025 09:23:31 +0800
Subject: [PATCH 2/3] fix

---
 docs/src/content/releases/_meta.ts | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/src/content/releases/_meta.ts b/docs/src/content/releases/_meta.ts
index 86654727e..131bc01fb 100644
--- a/docs/src/content/releases/_meta.ts
+++ b/docs/src/content/releases/_meta.ts
@@ -1,6 +1,7 @@
 import type { Meta } from 'nextra';
 
 export default {
+  "v0.5.0": "v0.5.0",
   "v0.4.0": "v0.4.0",
   "v0.3.0": "v0.3.0",
   "v0.2.0": "v0.2.0",

From 8d964883b1bd805e9ab130db3e483eee675e2ae4 Mon Sep 17 00:00:00 2001
From: Mini256
Date: Mon, 12 May 2025 09:38:10 +0800
Subject: [PATCH 3/3] docs: update docs

---
 docs/src/content/chat-engine.mdx    |  6 +++---
 docs/src/content/knowledge-base.mdx |  4 ++--
 docs/src/content/llm.mdx            | 23 ++++++++++++-----------
 docs/src/content/quick-start.mdx    |  2 +-
 docs/src/content/releases/v0.5.0.md |  2 +-
 5 files changed, 19 insertions(+), 18 deletions(-)

diff --git a/docs/src/content/chat-engine.mdx b/docs/src/content/chat-engine.mdx
index 151dec429..603aea292 100644
--- a/docs/src/content/chat-engine.mdx
+++ b/docs/src/content/chat-engine.mdx
@@ -9,16 +9,16 @@ After logging in with an admin account, you can configure the Chat Engine in the
 1. Click on the `Chat Engines` tab;
 2. Click on the `New Chat Engine` button to create a new chat engine;
 
-   !["Chat Engine Creation Page - Basic Information Section"](https://github.com/user-attachments/assets/981a0adc-eac2-484d-8141-7d62c394fd0f )
+   !["Chat Engine Creation Page - Basic Information Section"](https://github.com/user-attachments/assets/2b7695b8-ce4f-43dd-b19c-fbccb84cebe2)
 
 3. In the `Retrieval` section, you can configure [knowledge base](./knowledge-base.mdx) as the knowledge source and related retrieval parameters. &#13;
 
- !["Chat Engine Configuration Page - Retrieval Section"](https://github.com/user-attachments/assets/ed3f3320-a623-4ebb-a10e-d3bee264f20f) + !["Chat Engine Configuration Page - Retrieval Section"](https://github.com/user-attachments/assets/5592deeb-f4f1-4ad9-bc7e-a9eb82a88744) 4. You can also change the prompt to customize the chat experience for your users. The prompt is the message that the chatbot sends to the user to start the conversation. - !["Chat Engine Configuration Page - Prompt Section"](https://github.com/user-attachments/assets/21efccf0-093b-4243-87c8-159ef5975e3c) + !["Chat Engine Configuration Page - Prompt Section"](https://github.com/user-attachments/assets/a5cd25a5-5ec0-4383-b579-dc57b2d6e9f9) 5. Click the `Create Chat Engine` button to finish the configuration. diff --git a/docs/src/content/knowledge-base.mdx b/docs/src/content/knowledge-base.mdx index 35d8ccdec..54051ba76 100644 --- a/docs/src/content/knowledge-base.mdx +++ b/docs/src/content/knowledge-base.mdx @@ -30,7 +30,7 @@ After logging in with an admin account, you can configure the Knowledge Base in 5. Go to `Chat Engine` configuration page, select the knowledge base you created and click `Save` to enable it. - ![Chat Engine Configuration](https://github.com/user-attachments/assets/2572dc02-ce77-4d2f-a4ba-68bc6858d44c) + ![Chat Engine Configuration](https://github.com/user-attachments/assets/5592deeb-f4f1-4ad9-bc7e-a9eb82a88744) ## Data Source Management @@ -76,7 +76,7 @@ You can delete data source by click the **Delete** button of data source. You can manage documents in the **Documents** tab. -![Documents Page](https://github.com/user-attachments/assets/878d2809-97a6-4a87-8d3a-3481f8bb863b) +![Documents Page](https://github.com/user-attachments/assets/832959a4-f670-455d-b765-cae4a569e653) #### Delete Documents diff --git a/docs/src/content/llm.mdx b/docs/src/content/llm.mdx index 544d46297..30119dc44 100644 --- a/docs/src/content/llm.mdx +++ b/docs/src/content/llm.mdx @@ -52,6 +52,18 @@ To learn more about Vertex AI, please visit [Vertex AI](https://cloud.google.com Follow the UI to configure the Gitee AI provider. To learn more about Gitee AI, please visit [Gitee AI](https://ai.gitee.com/serverless-api). +#### Ollama + +Default config: + +```json +{ + "api_base": "http://localhost:11434" +} +``` + +To learn more about Ollama, please visit [Ollama](https://ollama.com/). + ### OpenAI To learn more about OpenAI, please visit [OpenAI](https://platform.openai.com/). @@ -95,17 +107,6 @@ Default config: To learn more about BigModel, please visit [BigModel](https://open.bigmodel.cn/). */} -#### Ollama - -Default config: - -```json -{ - "api_base": "http://localhost:11434" -} -``` - -To learn more about Ollama, please visit [Ollama](https://ollama.com/). #### vLLM diff --git a/docs/src/content/quick-start.mdx b/docs/src/content/quick-start.mdx index d6a500343..28f2e430b 100644 --- a/docs/src/content/quick-start.mdx +++ b/docs/src/content/quick-start.mdx @@ -59,7 +59,7 @@ Go to the **Chat Engines** page to [set up the chat engine](./chat-engine). > The chat engine is used to chat with users. 
-![Set up Chat Engine](https://github.com/user-attachments/assets/2572dc02-ce77-4d2f-a4ba-68bc6858d44c) +![Set up Chat Engine](https://github.com/user-attachments/assets/5592deeb-f4f1-4ad9-bc7e-a9eb82a88744) ## Step 5: Usage diff --git a/docs/src/content/releases/v0.5.0.md b/docs/src/content/releases/v0.5.0.md index 0c5877261..1d0fbbf79 100644 --- a/docs/src/content/releases/v0.5.0.md +++ b/docs/src/content/releases/v0.5.0.md @@ -1,4 +1,4 @@ -# Release Notes for v0.5.0 +# Release Notes for v0.5.0 (Candidate Release) ## Highlights