@lobehub/chat 1.1.15 → 1.1.16
This diff shows the changes between publicly released package versions as they appear in their respective public registries, and is provided for informational purposes only.
package/CHANGELOG.md
CHANGED
@@ -2,6 +2,31 @@
 
 # Changelog
 
+### [Version 1.1.16](https://github.com/lobehub/lobe-chat/compare/v1.1.15...v1.1.16)
+
+<sup>Released on **2024-06-29**</sup>
+
+#### 🐛 Bug Fixes
+
+- **misc**: Fix clerk `UNAUTHORIZED` error after long-time hang-up.
+
+<br/>
+
+<details>
+<summary><kbd>Improvements and Fixes</kbd></summary>
+
+#### What's fixed
+
+- **misc**: Fix clerk `UNAUTHORIZED` error after long-time hang-up, closes [#3084](https://github.com/lobehub/lobe-chat/issues/3084) ([a148c3b](https://github.com/lobehub/lobe-chat/commit/a148c3b))
+
+</details>
+
+<div align="right">
+
+[](#readme-top)
+
+</div>
+
 ### [Version 1.1.15](https://github.com/lobehub/lobe-chat/compare/v1.1.14...v1.1.15)
 
 <sup>Released on **2024-06-28**</sup>
@@ -27,7 +27,7 @@ First, you need to install Ollama. For detailed steps on installing and configur
 Assuming you have already started the Ollama service locally on port `11434`, run the following Docker command to start LobeChat locally:
 
 ```bash
-docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://host.docker.internal:11434
+docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 lobehub/lobe-chat
 ```
 
 Now, you can use LobeChat to converse with the local LLM.
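The corrected command appends the image name (`lobehub/lobe-chat`) that the original snippet omitted; without it, `docker run` has no image to start. For readers who prefer Compose, the same setup can be sketched as follows — only the image name, port, and `OLLAMA_PROXY_URL` come from the diff; the service name and file layout are illustrative assumptions:

```yaml
# Minimal docker-compose sketch of the corrected `docker run` command above.
services:
  lobe-chat:
    image: lobehub/lobe-chat
    ports:
      - '3210:3210'
    environment:
      # Points LobeChat at an Ollama service already running on the host.
      - OLLAMA_PROXY_URL=http://host.docker.internal:11434
```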
@@ -25,7 +25,7 @@ Ollama is a powerful framework for running large language models (LLMs) locally, supp
 Assuming you have already started the Ollama service locally on port `11434`, run the following Docker command to start LobeChat locally:
 
 ```bash
-docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://host.docker.internal:11434
+docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 lobehub/lobe-chat
 ```
 
 Next, you can use LobeChat to converse with the local LLM.
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@lobehub/chat",
-  "version": "1.1.15",
+  "version": "1.1.16",
   "description": "Lobe Chat - an open-source, high-performance chatbot framework that supports speech synthesis, multimodal, and extensible Function Call plugin system. Supports one-click free deployment of your private ChatGPT/LLM web application.",
   "keywords": [
     "framework",
package/src/middleware.ts
CHANGED
@@ -53,7 +53,12 @@ export default authEnv.NEXT_PUBLIC_ENABLE_CLERK_AUTH
       (auth, req) => {
         if (isProtectedRoute(req)) auth().protect();
       },
-      {
+      {
+        // https://github.com/lobehub/lobe-chat/pull/3084
+        clockSkewInMs: 60 * 60 * 1000,
+        signInUrl: '/login',
+        signUpUrl: '/signup',
+      },
     )
   : authEnv.NEXT_PUBLIC_ENABLE_NEXT_AUTH
     ? nextAuthMiddleware