@lobehub/chat 1.91.0 → 1.91.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +25 -0
- package/changelog/v1.json +9 -0
- package/docs/changelog/2023-09-09-plugin-system.mdx +2 -3
- package/docs/changelog/2023-11-14-gpt4-vision.mdx +4 -6
- package/docs/changelog/2023-11-19-tts-stt.mdx +2 -3
- package/docs/changelog/2023-12-22-dalle-3.mdx +2 -5
- package/docs/changelog/2023-12-22-dalle-3.zh-CN.mdx +2 -2
- package/docs/changelog/2024-02-08-sso-oauth.mdx +2 -2
- package/docs/changelog/2024-06-19-lobe-chat-v1.mdx +2 -3
- package/docs/changelog/2024-06-19-lobe-chat-v1.zh-CN.mdx +2 -2
- package/docs/changelog/2024-07-19-gpt-4o-mini.mdx +2 -3
- package/docs/changelog/2024-07-19-gpt-4o-mini.zh-CN.mdx +2 -2
- package/docs/changelog/2024-08-02-lobe-chat-database-docker.mdx +2 -3
- package/docs/changelog/2024-08-21-file-upload-and-knowledge-base.mdx +4 -5
- package/docs/changelog/2024-09-13-openai-o1-models.mdx +2 -2
- package/docs/changelog/2024-09-20-artifacts.mdx +2 -3
- package/docs/changelog/2024-09-20-artifacts.zh-CN.mdx +2 -2
- package/docs/changelog/2024-10-27-pin-assistant.mdx +2 -3
- package/docs/changelog/2024-11-06-share-text-json.mdx +2 -4
- package/docs/changelog/2024-11-06-share-text-json.zh-CN.mdx +2 -2
- package/docs/changelog/2024-11-25-november-providers.mdx +2 -2
- package/docs/changelog/2024-11-27-forkable-chat.mdx +2 -2
- package/docs/changelog/2025-01-03-user-profile.mdx +2 -2
- package/docs/changelog/2025-01-22-new-ai-provider.mdx +2 -2
- package/docs/changelog/2025-02-02-deepseek-r1.mdx +4 -4
- package/docs/development/basic/add-new-authentication-providers.zh-CN.mdx +1 -2
- package/docs/development/basic/chat-api.mdx +2 -4
- package/docs/development/basic/chat-api.zh-CN.mdx +2 -4
- package/docs/development/internationalization/internationalization-implementation.mdx +10 -10
- package/docs/development/internationalization/internationalization-implementation.zh-CN.mdx +10 -10
- package/docs/self-hosting/advanced/analytics.mdx +2 -2
- package/docs/self-hosting/advanced/auth/clerk.mdx +2 -2
- package/docs/self-hosting/advanced/auth/next-auth/auth0.mdx +2 -3
- package/docs/self-hosting/advanced/auth/next-auth/authelia.mdx +2 -3
- package/docs/self-hosting/advanced/auth/next-auth/authentik.mdx +2 -3
- package/docs/self-hosting/advanced/auth/next-auth/casdoor.mdx +4 -7
- package/docs/self-hosting/advanced/auth/next-auth/casdoor.zh-CN.mdx +0 -3
- package/docs/self-hosting/advanced/auth/next-auth/cloudflare-zero-trust.mdx +2 -3
- package/docs/self-hosting/advanced/auth/next-auth/cloudflare-zero-trust.zh-CN.mdx +2 -2
- package/docs/self-hosting/advanced/auth/next-auth/github.mdx +2 -3
- package/docs/self-hosting/advanced/auth/next-auth/keycloak.mdx +12 -6
- package/docs/self-hosting/advanced/auth/next-auth/keycloak.zh-CN.mdx +5 -1
- package/docs/self-hosting/advanced/auth/next-auth/logto.mdx +8 -14
- package/docs/self-hosting/advanced/auth/next-auth/logto.zh-CN.mdx +6 -12
- package/docs/self-hosting/advanced/auth/next-auth/microsoft-entra-id.mdx +2 -3
- package/docs/self-hosting/advanced/auth/next-auth/wechat.mdx +2 -2
- package/docs/self-hosting/advanced/auth/next-auth/zitadel.mdx +2 -3
- package/docs/self-hosting/advanced/auth.mdx +2 -3
- package/docs/self-hosting/advanced/desktop.mdx +2 -1
- package/docs/self-hosting/advanced/desktop.zh-CN.mdx +1 -3
- package/docs/self-hosting/advanced/feature-flags.mdx +2 -3
- package/docs/self-hosting/advanced/knowledge-base.mdx +4 -3
- package/docs/self-hosting/advanced/model-list.mdx +11 -10
- package/docs/self-hosting/advanced/model-list.zh-CN.mdx +10 -9
- package/docs/self-hosting/advanced/observability/langfuse.mdx +2 -3
- package/docs/self-hosting/advanced/online-search.mdx +11 -10
- package/docs/self-hosting/advanced/online-search.zh-CN.mdx +7 -7
- package/docs/self-hosting/advanced/s3/tencent-cloud.mdx +2 -2
- package/docs/self-hosting/advanced/settings-url-share.mdx +2 -3
- package/docs/self-hosting/advanced/upstream-sync.mdx +2 -3
- package/docs/self-hosting/advanced/webrtc.mdx +2 -2
- package/docs/self-hosting/environment-variables/analytics.mdx +2 -3
- package/docs/self-hosting/environment-variables/auth.mdx +2 -3
- package/docs/self-hosting/environment-variables/basic.mdx +4 -5
- package/docs/self-hosting/environment-variables/basic.zh-CN.mdx +2 -2
- package/docs/self-hosting/environment-variables/model-provider.mdx +2 -3
- package/docs/self-hosting/environment-variables/s3.mdx +3 -5
- package/docs/self-hosting/environment-variables.mdx +2 -2
- package/docs/self-hosting/examples/azure-openai.mdx +2 -2
- package/docs/self-hosting/examples/ollama.mdx +2 -3
- package/docs/self-hosting/faq/no-v1-suffix.mdx +4 -4
- package/docs/self-hosting/faq/proxy-with-unable-to-verify-leaf-signature.mdx +2 -3
- package/docs/self-hosting/platform/alibaba-cloud.mdx +2 -3
- package/docs/self-hosting/platform/btpanel.mdx +5 -5
- package/docs/self-hosting/platform/btpanel.zh-CN.mdx +4 -3
- package/docs/self-hosting/platform/docker-compose.mdx +2 -3
- package/docs/self-hosting/platform/docker-compose.zh-CN.mdx +0 -2
- package/docs/self-hosting/platform/docker.mdx +2 -2
- package/docs/self-hosting/platform/netlify.mdx +2 -4
- package/docs/self-hosting/platform/netlify.zh-CN.mdx +2 -2
- package/docs/self-hosting/platform/railway.mdx +2 -3
- package/docs/self-hosting/platform/repocloud.mdx +2 -3
- package/docs/self-hosting/platform/sealos.mdx +2 -2
- package/docs/self-hosting/platform/tencentcloud-lighthouse.mdx +2 -3
- package/docs/self-hosting/platform/vercel.mdx +2 -3
- package/docs/self-hosting/platform/zeabur.mdx +2 -2
- package/docs/self-hosting/server-database/docker-compose.mdx +65 -44
- package/docs/self-hosting/server-database/docker-compose.zh-CN.mdx +48 -55
- package/docs/self-hosting/server-database/docker.mdx +2 -2
- package/docs/self-hosting/server-database/docker.zh-CN.mdx +2 -2
- package/docs/self-hosting/server-database/dokploy.mdx +4 -5
- package/docs/self-hosting/server-database/dokploy.zh-CN.mdx +137 -138
- package/docs/self-hosting/server-database/netlify.mdx +2 -2
- package/docs/self-hosting/server-database/netlify.zh-CN.mdx +2 -2
- package/docs/self-hosting/server-database/railway.mdx +2 -2
- package/docs/self-hosting/server-database/repocloud.mdx +2 -2
- package/docs/self-hosting/server-database/sealos.mdx +4 -5
- package/docs/self-hosting/server-database/sealos.zh-CN.mdx +18 -20
- package/docs/self-hosting/server-database/vercel.mdx +5 -3
- package/docs/self-hosting/server-database/vercel.zh-CN.mdx +2 -2
- package/docs/self-hosting/server-database/zeabur.mdx +2 -2
- package/docs/self-hosting/server-database.mdx +1 -1
- package/docs/self-hosting/server-database.zh-CN.mdx +2 -1
- package/docs/self-hosting/start.mdx +2 -2
- package/docs/self-hosting/start.zh-CN.mdx +2 -2
- package/docs/usage/agents/agent-organization.mdx +2 -2
- package/docs/usage/agents/concepts.mdx +4 -5
- package/docs/usage/agents/concepts.zh-CN.mdx +2 -2
- package/docs/usage/agents/custom-agent.mdx +3 -4
- package/docs/usage/agents/custom-agent.zh-CN.mdx +1 -1
- package/docs/usage/agents/model.mdx +5 -5
- package/docs/usage/agents/model.zh-CN.mdx +3 -5
- package/docs/usage/agents/prompt.mdx +4 -5
- package/docs/usage/agents/topics.mdx +3 -4
- package/docs/usage/agents/topics.zh-CN.mdx +1 -1
- package/docs/usage/features/agent-market.mdx +3 -11
- package/docs/usage/features/agent-market.zh-CN.mdx +2 -7
- package/docs/usage/features/artifacts.mdx +2 -2
- package/docs/usage/features/auth.mdx +2 -3
- package/docs/usage/features/cot.mdx +2 -2
- package/docs/usage/features/database.mdx +2 -2
- package/docs/usage/features/knowledge-base.mdx +4 -3
- package/docs/usage/features/knowledge-base.zh-CN.mdx +2 -1
- package/docs/usage/features/local-llm.mdx +2 -3
- package/docs/usage/features/mobile.mdx +2 -2
- package/docs/usage/features/more.mdx +2 -3
- package/docs/usage/features/multi-ai-providers.mdx +2 -3
- package/docs/usage/features/plugin-system.mdx +3 -11
- package/docs/usage/features/plugin-system.zh-CN.mdx +1 -8
- package/docs/usage/features/pwa.mdx +4 -4
- package/docs/usage/features/pwa.zh-CN.mdx +2 -1
- package/docs/usage/features/text-to-image.mdx +3 -11
- package/docs/usage/features/text-to-image.zh-CN.mdx +3 -10
- package/docs/usage/features/theme.mdx +2 -3
- package/docs/usage/features/tts.mdx +3 -11
- package/docs/usage/features/tts.zh-CN.mdx +1 -8
- package/docs/usage/features/vision.mdx +3 -11
- package/docs/usage/features/vision.zh-CN.mdx +1 -8
- package/docs/usage/foundation/basic.mdx +2 -3
- package/docs/usage/foundation/share.mdx +2 -3
- package/docs/usage/foundation/text2image.mdx +2 -2
- package/docs/usage/foundation/translate.mdx +2 -2
- package/docs/usage/foundation/tts-stt.mdx +2 -2
- package/docs/usage/foundation/vision.mdx +2 -3
- package/docs/usage/plugins/basic-usage.mdx +2 -3
- package/docs/usage/plugins/custom-plugin.mdx +2 -2
- package/docs/usage/plugins/development.mdx +2 -4
- package/docs/usage/plugins/store.mdx +2 -2
- package/docs/usage/providers/ai21.mdx +2 -2
- package/docs/usage/providers/anthropic.mdx +2 -3
- package/docs/usage/providers/anthropic.zh-CN.mdx +2 -2
- package/docs/usage/providers/azure.mdx +2 -3
- package/docs/usage/providers/azureai.mdx +4 -2
- package/docs/usage/providers/azureai.zh-CN.mdx +2 -1
- package/docs/usage/providers/baichuan.mdx +2 -3
- package/docs/usage/providers/bedrock.mdx +2 -3
- package/docs/usage/providers/cloudflare.mdx +3 -2
- package/docs/usage/providers/deepseek.mdx +2 -2
- package/docs/usage/providers/fireworksai.mdx +2 -2
- package/docs/usage/providers/giteeai.mdx +2 -2
- package/docs/usage/providers/github.mdx +1 -1
- package/docs/usage/providers/github.zh-CN.mdx +1 -1
- package/docs/usage/providers/google.mdx +2 -3
- package/docs/usage/providers/groq.mdx +2 -2
- package/docs/usage/providers/hunyuan.mdx +2 -2
- package/docs/usage/providers/infiniai.zh-CN.mdx +3 -1
- package/docs/usage/providers/internlm.mdx +2 -2
- package/docs/usage/providers/jina.mdx +4 -3
- package/docs/usage/providers/jina.zh-CN.mdx +2 -2
- package/docs/usage/providers/lmstudio.mdx +2 -2
- package/docs/usage/providers/lmstudio.zh-CN.mdx +2 -4
- package/docs/usage/providers/minimax.mdx +2 -3
- package/docs/usage/providers/minimax.zh-CN.mdx +2 -2
- package/docs/usage/providers/mistral.mdx +2 -3
- package/docs/usage/providers/modelscope.mdx +4 -0
- package/docs/usage/providers/modelscope.zh-CN.mdx +4 -0
- package/docs/usage/providers/moonshot.mdx +2 -3
- package/docs/usage/providers/novita.mdx +2 -3
- package/docs/usage/providers/novita.zh-CN.mdx +2 -2
- package/docs/usage/providers/nvidia.mdx +3 -2
- package/docs/usage/providers/ollama/gemma.mdx +2 -3
- package/docs/usage/providers/ollama/gemma.zh-CN.mdx +2 -2
- package/docs/usage/providers/ollama.mdx +2 -2
- package/docs/usage/providers/openai.mdx +5 -5
- package/docs/usage/providers/openai.zh-CN.mdx +3 -3
- package/docs/usage/providers/openrouter.mdx +2 -3
- package/docs/usage/providers/perplexity.mdx +2 -2
- package/docs/usage/providers/ppio.mdx +5 -6
- package/docs/usage/providers/ppio.zh-CN.mdx +6 -6
- package/docs/usage/providers/qiniu.mdx +6 -6
- package/docs/usage/providers/qiniu.zh-CN.mdx +2 -1
- package/docs/usage/providers/qwen.mdx +2 -4
- package/docs/usage/providers/sambanova.mdx +2 -1
- package/docs/usage/providers/sensenova.mdx +2 -2
- package/docs/usage/providers/siliconcloud.mdx +2 -2
- package/docs/usage/providers/stepfun.mdx +2 -3
- package/docs/usage/providers/taichu.mdx +2 -3
- package/docs/usage/providers/togetherai.mdx +2 -2
- package/docs/usage/providers/vllm.mdx +15 -12
- package/docs/usage/providers/vllm.zh-CN.mdx +9 -7
- package/docs/usage/providers/volcengine.mdx +16 -14
- package/docs/usage/providers/wenxin.mdx +2 -2
- package/docs/usage/providers/xai.mdx +2 -2
- package/docs/usage/providers/zeroone.mdx +2 -3
- package/docs/usage/providers/zeroone.zh-CN.mdx +2 -2
- package/docs/usage/providers/zhipu.mdx +2 -3
- package/docs/usage/providers/zhipu.zh-CN.mdx +1 -1
- package/docs/usage/providers.mdx +2 -3
- package/docs/usage/start.mdx +2 -3
- package/docs/usage/tools-calling/anthropic.mdx +2 -2
- package/docs/usage/tools-calling/anthropic.zh-CN.mdx +2 -2
- package/docs/usage/tools-calling/google.mdx +2 -2
- package/docs/usage/tools-calling/google.zh-CN.mdx +4 -4
- package/docs/usage/tools-calling/groq.zh-CN.mdx +2 -2
- package/docs/usage/tools-calling/openai.mdx +2 -2
- package/docs/usage/tools-calling/openai.zh-CN.mdx +2 -2
- package/package.json +1 -1
- package/src/app/(backend)/webapi/user/avatar/[id]/[image]/route.ts +1 -1
- package/src/app/[variants]/(main)/repos/[id]/_layout/Mobile.tsx +7 -7
- package/src/config/aiModels/deepseek.ts +1 -0
- package/src/config/aiModels/hunyuan.ts +1 -0
- package/src/config/aiModels/mistral.ts +1 -2
- package/src/config/aiModels/novita.ts +23 -22
- package/src/config/aiModels/openrouter.ts +1 -0
- package/src/config/aiModels/qwen.ts +11 -11
- package/src/config/aiModels/siliconcloud.ts +7 -6
- package/src/config/aiModels/vertexai.ts +2 -2
- package/src/config/aiModels/wenxin.ts +1 -2
- package/src/config/modelProviders/baichuan.ts +3 -0
- package/src/config/modelProviders/fireworksai.ts +3 -0
- package/src/config/modelProviders/giteeai.ts +3 -0
- package/src/config/modelProviders/github.ts +1 -2
- package/src/config/modelProviders/groq.ts +0 -3
- package/src/config/modelProviders/hunyuan.ts +3 -0
- package/src/config/modelProviders/infiniai.ts +0 -3
- package/src/config/modelProviders/internlm.ts +3 -0
- package/src/config/modelProviders/minimax.ts +3 -4
- package/src/config/modelProviders/moonshot.ts +0 -7
- package/src/config/modelProviders/novita.ts +3 -0
- package/src/config/modelProviders/openrouter.ts +0 -4
- package/src/config/modelProviders/perplexity.ts +0 -3
- package/src/config/modelProviders/qiniu.ts +0 -3
- package/src/config/modelProviders/qwen.ts +0 -3
- package/src/config/modelProviders/sensenova.ts +3 -0
- package/src/config/modelProviders/siliconcloud.ts +0 -3
- package/src/config/modelProviders/spark.ts +0 -5
- package/src/config/modelProviders/stepfun.ts +3 -6
- package/src/config/modelProviders/taichu.ts +3 -0
- package/src/config/modelProviders/tencentcloud.ts +3 -0
- package/src/config/modelProviders/togetherai.ts +3 -0
- package/src/config/modelProviders/upstage.ts +3 -0
- package/src/config/modelProviders/wenxin.ts +3 -4
- package/src/config/modelProviders/xai.ts +0 -3
- package/src/config/modelProviders/zhipu.ts +3 -0
- package/src/database/migrations/meta/0014_snapshot.json +182 -539
- package/src/database/migrations/meta/0016_snapshot.json +182 -539
- package/src/database/repositories/dataImporter/__tests__/fixtures/with-client-id.json +13 -58
- package/src/features/ChatInput/ActionBar/Model/ControlsForm.tsx +1 -8
- package/src/features/ChatInput/ActionBar/Model/index.tsx +8 -16
- package/src/features/ChatInput/ActionBar/Search/Controls.tsx +4 -12
- package/src/features/ChatInput/ActionBar/Search/FCSearchModel.tsx +1 -7
- package/src/features/ChatInput/ActionBar/Search/index.tsx +2 -4
- package/src/features/ModelSwitchPanel/index.tsx +1 -4
- package/src/libs/model-runtime/anthropic/index.test.ts +4 -2
- package/src/libs/model-runtime/utils/streams/anthropic.ts +12 -11
- package/src/libs/model-runtime/utils/streams/openai.ts +6 -4
- package/src/libs/model-runtime/utils/streams/protocol.ts +1 -1
- package/src/libs/model-runtime/utils/streams/spark.test.ts +1 -1
- package/src/libs/model-runtime/utils/streams/spark.ts +1 -2
- package/src/middleware.ts +1 -1
- package/src/server/services/user/index.ts +3 -4
- package/src/services/__tests__/assistant.test.ts +4 -6
- package/src/services/__tests__/tool.test.ts +3 -1
- package/src/store/user/slices/auth/selectors.ts +1 -1
- package/src/store/user/slices/common/action.test.ts +1 -1
- package/src/tools/web-browsing/index.ts +1 -7

package/docs/usage/providers/modelscope.mdx
@@ -14,10 +14,12 @@ ModelScope (魔塔社区) is Alibaba's open-source model community that provides
 Before using ModelScope API, you need to:

 1. **Create a ModelScope Account**
+
 - Visit [ModelScope](https://www.modelscope.cn/)
 - Register for an account

 2. **Bind Alibaba Cloud Account**
+
 - **Important**: ModelScope API requires binding with an Alibaba Cloud account
 - Visit your [ModelScope Access Token page](https://www.modelscope.cn/my/myaccesstoken)
 - Follow the instructions to bind your Alibaba Cloud account
@@ -72,11 +74,13 @@ ModelScope provides access to various models including:
 ### Common Issues

 1. **"Please bind your Alibaba Cloud account before use" Error**
+
 - This means you haven't bound your Alibaba Cloud account to ModelScope
 - Visit [ModelScope Access Token page](https://www.modelscope.cn/my/myaccesstoken)
 - Complete the Alibaba Cloud account binding process

 2. **401 Authentication Error**
+
 - Check if your API token is correct
 - Ensure the token hasn't expired
 - Verify that your Alibaba Cloud account is properly bound

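The troubleshooting items above (the account-binding error and the 401) can be checked outside LobeChat with a single OpenAI-compatible request. The sketch below is illustrative only; the base URL and model ID are assumptions about ModelScope's inference endpoint and are not part of this diff.

```python
# Minimal sketch: verify a ModelScope access token before configuring it in LobeChat.
# Assumptions: ModelScope's OpenAI-compatible endpoint and the example model ID below.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_MODELSCOPE_TOKEN",                     # token from modelscope.cn/my/myaccesstoken
    base_url="https://api-inference.modelscope.cn/v1/",  # assumed inference endpoint
)

try:
    reply = client.chat.completions.create(
        model="Qwen/Qwen2.5-7B-Instruct",                # replace with a model your account can access
        messages=[{"role": "user", "content": "ping"}],
    )
    print(reply.choices[0].message.content)
except Exception as err:
    # A 401 here typically means a wrong or expired token, or an unbound Alibaba Cloud account.
    print("Request failed:", err)
```
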
package/docs/usage/providers/modelscope.zh-CN.mdx
@@ -14,10 +14,12 @@ ModelScope(魔塔社区)是阿里巴巴的开源模型社区,提供各种
 在使用 ModelScope API 之前,您需要:

 1. **创建 ModelScope 账户**
+
 - 访问 [ModelScope](https://www.modelscope.cn/)
 - 注册账户

 2. **绑定阿里云账户**
+
 - **重要**:ModelScope API 需要绑定阿里云账户
 - 访问您的 [ModelScope 访问令牌页面](https://www.modelscope.cn/my/myaccesstoken)
 - 按照说明绑定您的阿里云账户
@@ -72,11 +74,13 @@ ModelScope 提供各种模型的访问,包括:
 ### 常见问题

 1. **"请先绑定阿里云账户后使用" 错误**
+
 - 这意味着您还没有将阿里云账户绑定到 ModelScope
 - 访问 [ModelScope 访问令牌页面](https://www.modelscope.cn/my/myaccesstoken)
 - 完成阿里云账户绑定流程

 2. **401 认证错误**
+
 - 检查您的 API 令牌是否正确
 - 确保令牌没有过期
 - 验证您的阿里云账户是否正确绑定

package/docs/usage/providers/moonshot.mdx
@@ -1,9 +1,8 @@
 ---
 title: Using Moonshot AI API Key in LobeChat
 description: >-
-Learn how to integrate Moonshot AI into LobeChat for AI-powered conversations.
-Follow the steps to get the API key, configure Moonshot AI, and start engaging
-with AI models.
+Learn how to integrate Moonshot AI into LobeChat for AI-powered conversations. Follow the steps to get the API key, configure Moonshot AI, and start engaging with AI models.
+
 tags:
 - Moonshot AI
 - Web UI

package/docs/usage/providers/novita.mdx
@@ -1,9 +1,8 @@
 ---
 title: Using Novita AI API Key in LobeChat
 description: >-
-Learn how to integrate Novita AI's language model APIs into LobeChat. Follow
-the steps to register, create an Novita AI API key, configure settings, and
-chat with our various AI models.
+Learn how to integrate Novita AI's language model APIs into LobeChat. Follow the steps to register, create an Novita AI API key, configure settings, and chat with our various AI models.
+
 tags:
 - Novita AI
 - Llama3

package/docs/usage/providers/novita.zh-CN.mdx
@@ -1,8 +1,8 @@
 ---
 title: 在 LobeChat 中使用 Novita AI API Key
 description: >-
-学习如何将 Novita AI 的大语言模型 API 集成到 LobeChat 中。跟随以下步骤注册 Novita AI 账号、创建 API
-Key、充值信用额度并在 LobeChat 中进行设置。并与我们的多种 AI 模型交谈。
+学习如何将 Novita AI 的大语言模型 API 集成到 LobeChat 中。跟随以下步骤注册 Novita AI 账号、创建 API Key、充值信用额度并在 LobeChat 中进行设置。并与我们的多种 AI 模型交谈。
+
 tags:
 - Novita AI
 - Llama3

package/docs/usage/providers/nvidia.mdx
@@ -32,7 +32,8 @@ This document will guide you on how to access and use AI models provided by Nvid
 - Copy and save the created API Key.

 <Callout type={'warning'}>
-Please store the key securely as it will only appear once. If you accidentally lose it, you will
+Please store the key securely as it will only appear once. If you accidentally lose it, you will
+need to create a new key.
 </Callout>

 ### Step 2: Configure Nvidia NIM in LobeChat
@@ -52,4 +53,4 @@ This document will guide you on how to access and use AI models provided by Nvid
 </Callout>
 </Steps>

-Now you can use the models provided by Nvidia NIM to have conversations in LobeChat.
+Now you can use the models provided by Nvidia NIM to have conversations in LobeChat.

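Since the NIM key is shown only once, it can help to confirm it works before pasting it into LobeChat. The following is a minimal sketch assuming NVIDIA's hosted OpenAI-compatible endpoint and an example model ID; neither value comes from this diff.

```python
# Minimal sketch: sanity-check an Nvidia NIM API key.
# Assumptions: the hosted endpoint and model ID below; adjust to your account.
from openai import OpenAI

client = OpenAI(
    api_key="nvapi-...",                             # key created in the NIM console (shown only once)
    base_url="https://integrate.api.nvidia.com/v1",  # assumed NIM OpenAI-compatible endpoint
)

resp = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",              # example model; pick one available to you
    messages=[{"role": "user", "content": "Reply with OK."}],
)
print(resp.choices[0].message.content)
```
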
package/docs/usage/providers/ollama/gemma.mdx
@@ -1,9 +1,8 @@
 ---
 title: Using Google Gemma Model in LobeChat
 description: >-
-Learn how to integrate and utilize Google Gemma in LobeChat, an open-source
-large language model, in LobeChat with the help of Ollama. Follow these steps
-to pull and select the Gemma model for natural language processing tasks.
+Learn how to integrate and utilize Google Gemma in LobeChat, an open-source large language model, in LobeChat with the help of Ollama. Follow these steps to pull and select the Gemma model for natural language processing tasks.
+
 tags:
 - Google Gemma
 - LobeChat

package/docs/usage/providers/ollama/gemma.zh-CN.mdx
@@ -1,8 +1,8 @@
 ---
 title: 在 LobeChat 中使用 Google Gemma 模型
 description: >-
-通过 LobeChat 与 Ollama 的集成,轻松使用 Google Gemma 模型进行自然语言处理任务。安装 Ollama,拉取 Gemma
-模型,选择模型面板中的 Gemma 模型,开始对话。
+通过 LobeChat 与 Ollama 的集成,轻松使用 Google Gemma 模型进行自然语言处理任务。安装 Ollama,拉取 Gemma 模型,选择模型面板中的 Gemma 模型,开始对话。
+
 tags:
 - Google Gemma
 - LobeChat

package/docs/usage/providers/ollama.mdx
@@ -1,8 +1,8 @@
 ---
 title: Using Ollama in LobeChat
 description: >-
-Learn how to use Ollama in LobeChat, run LLM locally, and experience
-cutting-edge AI usage.
+Learn how to use Ollama in LobeChat, run LLM locally, and experience cutting-edge AI usage.
+
 tags:
 - Ollama
 - Local LLM

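For the Ollama-backed docs above, a quick local test is often useful before wiring Ollama into LobeChat. This is a minimal sketch assuming Ollama's default port and OpenAI-compatible endpoint, with a Gemma model already pulled (`ollama pull gemma`).

```python
# Minimal sketch: chat with a locally pulled model through Ollama's OpenAI-compatible API.
# Assumptions: Ollama listening on localhost:11434 and a model named "gemma" pulled beforehand.
from openai import OpenAI

client = OpenAI(
    api_key="ollama",                      # placeholder; Ollama does not validate the key
    base_url="http://localhost:11434/v1",
)

resp = client.chat.completions.create(
    model="gemma",
    messages=[{"role": "user", "content": "Summarize what Ollama does in one sentence."}],
)
print(resp.choices[0].message.content)
```
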
package/docs/usage/providers/openai.mdx
@@ -1,8 +1,8 @@
 ---
 title: Using OpenAI API Key in LobeChat
 description: >-
-Learn how to integrate OpenAI API Key in LobeChat. Support GPT-4o /
-GPT-4-turbo / GPT-4-vision
+Learn how to integrate OpenAI API Key in LobeChat. Support GPT-4o / GPT-4-turbo / GPT-4-vision
+
 tags:
 - OpenAI
 - ChatGPT
@@ -31,7 +31,7 @@ This document will guide you on how to use [OpenAI](https://openai.com/) in Lobe
 alt={'Open the creation window'}
 inStep
 src={
-'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296253192-ff2193dd-f125-4e58-82e8-91bc376c0d68.png'
+'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296253192-ff2193dd-f125-4e58-82e8-91bc376c0d68.png'
 }
 />

@@ -41,7 +41,7 @@ This document will guide you on how to use [OpenAI](https://openai.com/) in Lobe
 alt={'Create API Key'}
 inStep
 src={
-'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296254170-803bacf0-4471-4171-ae79-0eab08d621d1.png'
+'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296254170-803bacf0-4471-4171-ae79-0eab08d621d1.png'
 }
 />

@@ -51,7 +51,7 @@ This document will guide you on how to use [OpenAI](https://openai.com/) in Lobe
 alt={'Retrieve API Key'}
 inStep
 src={
-'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296255167-f2745f2b-f083-4ba8-bc78-9b558e0002de.png'
+'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296255167-f2745f2b-f083-4ba8-bc78-9b558e0002de.png'
 }
 />

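The hunks above only re-wrap the key-creation walkthrough; the key itself can be verified with one call before entering it in LobeChat. A minimal sketch, assuming the `openai` Python SDK (v1+) and an example model name:

```python
# Minimal sketch: confirm an OpenAI API key works before adding it to LobeChat.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # key created at platform.openai.com

resp = client.chat.completions.create(
    model="gpt-4o-mini",           # example model name
    messages=[{"role": "user", "content": "Reply with OK if you can read this."}],
)
print(resp.choices[0].message.content)
```
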
package/docs/usage/providers/openai.zh-CN.mdx
@@ -28,7 +28,7 @@ tags:
 alt={'打开创建窗口'}
 inStep
 src={
-'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296253192-ff2193dd-f125-4e58-82e8-91bc376c0d68.png'
+'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296253192-ff2193dd-f125-4e58-82e8-91bc376c0d68.png'
 }
 />

@@ -38,7 +38,7 @@ tags:
 alt={'创建 API Key'}
 inStep
 src={
-'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296254170-803bacf0-4471-4171-ae79-0eab08d621d1.png'
+'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296254170-803bacf0-4471-4171-ae79-0eab08d621d1.png'
 }
 />

@@ -48,7 +48,7 @@ tags:
 alt={'获取 API Key'}
 inStep
 src={
-'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296255167-f2745f2b-f083-4ba8-bc78-9b558e0002de.png'
+'https://github-production-user-asset-6210df.s3.amazonaws.com/28616219/296255167-f2745f2b-f083-4ba8-bc78-9b558e0002de.png'
 }
 />

package/docs/usage/providers/openrouter.mdx
@@ -1,9 +1,8 @@
 ---
 title: Using OpenRouter API Key in LobeChat
 description: >-
-Learn how to integrate and utilize OpenRouter's language model APIs in
-LobeChat. Follow these steps to register, create an API key, recharge credit,
-and configure OpenRouter for seamless conversations.
+Learn how to integrate and utilize OpenRouter's language model APIs in LobeChat. Follow these steps to register, create an API key, recharge credit, and configure OpenRouter for seamless conversations.
+
 tags:
 - OpenRouter
 - LobeChat

package/docs/usage/providers/perplexity.mdx
@@ -1,8 +1,8 @@
 ---
 title: Using Perplexity AI API Key in LobeChat
 description: >-
-Learn how to integrate and use Perplexity AI in LobeChat to enhance your AI
-assistant's capabilities.
+Learn how to integrate and use Perplexity AI in LobeChat to enhance your AI assistant's capabilities.
+
 tags:
 - Perplexity AI
 - API key

package/docs/usage/providers/ppio.mdx
@@ -1,9 +1,8 @@
 ---
 title: Using PPIO API Key in LobeChat
 description: >-
-Learn how to integrate PPIO's language model APIs into LobeChat. Follow the
-steps to register, create an PPIO API key, configure settings, and chat with
-our various AI models.
+Learn how to integrate PPIO's language model APIs into LobeChat. Follow the steps to register, create an PPIO API key, configure settings, and chat with our various AI models.
+
 tags:
 - PPIO
 - DeepSeek
@@ -18,14 +17,14 @@ tags:

 <Image alt={'Using PPIO in LobeChat'} cover src={'https://github.com/user-attachments/assets/d0a5e152-160a-4862-8393-546f4e2e5387'} />

-[PPIO](https://ppinfra.com/user/register?invited_by=RQIMOC
+[PPIO](https://ppinfra.com/user/register?invited_by=RQIMOC\&utm_source=github_lobechat) supports stable and cost-efficient open-source LLM APIs, such as DeepSeek, Llama, Qwen etc.

 This document will guide you on how to integrate PPIO in LobeChat:

 <Steps>
 ### Step 1: Register and Log in to PPIO

-- Visit [PPIO](https://ppinfra.com/user/register?invited_by=RQIMOC
+- Visit [PPIO](https://ppinfra.com/user/register?invited_by=RQIMOC\&utm_source=github_lobechat) and create an account
 - Upon registration, PPIO will provide a ¥5 credit (about 5M tokens).

 <Image alt={'Register PPIO'} height={457} inStep src={'https://github.com/user-attachments/assets/7cb3019b-78c1-48e0-a64c-a6a4836affd9'} />
@@ -50,7 +49,7 @@ This document will guide you on how to integrate PPIO in LobeChat:

 <Callout type={'warning'}>
 During usage, you may need to pay the API service provider, please refer to PPIO's [pricing
-policy](https://ppinfra.com/llm-api?utm_source=github_lobe-chat
+policy](https://ppinfra.com/llm-api?utm_source=github_lobe-chat\&utm_medium=github_readme\&utm_campaign=link).
 </Callout>
 </Steps>

package/docs/usage/providers/ppio.zh-CN.mdx
@@ -1,8 +1,8 @@
 ---
 title: 在 LobeChat 中使用 PPIO 派欧云 API Key
 description: >-
-学习如何将 PPIO 派欧云的 LLM API 集成到 LobeChat 中。跟随以下步骤注册 PPIO 账号、创建 API Key、并在 LobeChat
-中进行设置。
+学习如何将 PPIO 派欧云的 LLM API 集成到 LobeChat 中。跟随以下步骤注册 PPIO 账号、创建 API Key、并在 LobeChat 中进行设置。
+
 tags:
 - PPIO
 - PPInfra
@@ -17,14 +17,14 @@ tags:

 <Image alt={'在 LobeChat 中使用 PPIO'} cover src={'https://github.com/user-attachments/assets/d0a5e152-160a-4862-8393-546f4e2e5387'} />

-[PPIO 派欧云](https://ppinfra.com/user/register?invited_by=RQIMOC
+[PPIO 派欧云](https://ppinfra.com/user/register?invited_by=RQIMOC\&utm_source=github_lobechat)提供稳定、高性价比的开源模型 API 服务,支持 DeepSeek 全系列、Llama、Qwen 等行业领先大模型。

 本文档将指导你如何在 LobeChat 中使用 PPIO:

 <Steps>
 ### 步骤一:注册 PPIO 派欧云账号并登录

-- 访问 [PPIO 派欧云](https://ppinfra.com/user/register?invited_by=RQIMOC
+- 访问 [PPIO 派欧云](https://ppinfra.com/user/register?invited_by=RQIMOC\&utm_source=github_lobechat) 并注册账号
 - 注册后,PPIO 会赠送 5 元(约 500 万 tokens)的使用额度

 <Image alt={'注册 PPIO'} height={457} inStep src={'https://github.com/user-attachments/assets/7cb3019b-78c1-48e0-a64c-a6a4836affd9'} />
@@ -48,8 +48,8 @@ tags:
 <Image alt={'选择并使用 PPIO 模型'} inStep src={'https://github.com/user-attachments/assets/b824b741-f2d8-42c8-8cb9-1266862affa7'} />

 <Callout type={'warning'}>
-在使用过程中你可能需要向 API 服务提供商付费,PPIO 的 API 费用参考[这里](https://ppinfra.com/llm-api?utm_source=github_lobe-chat
+在使用过程中你可能需要向 API 服务提供商付费,PPIO 的 API 费用参考[这里](https://ppinfra.com/llm-api?utm_source=github_lobe-chat\&utm_medium=github_readme\&utm_campaign=link)。
 </Callout>
 </Steps>

-至此你已经可以在 LobeChat 中使用 PPIO 派欧云提供的模型进行对话了。
+至此你已经可以在 LobeChat 中使用 PPIO 派欧云提供的模型进行对话了。

package/docs/usage/providers/qiniu.mdx
@@ -1,9 +1,8 @@
 ---
 title: Using Qiniu API Key in LobeChat
 description: >-
-Learn how to integrate and utilize powerful language models developed by Qiniu
-into LobeChat for various tasks. Follow the steps to obtain an API key
-and configure it for seamless interaction.
+Learn how to integrate and utilize powerful language models developed by Qiniu into LobeChat for various tasks. Follow the steps to obtain an API key and configure it for seamless interaction.
+
 tags:
 - API key
 - Web UI
@@ -16,7 +15,7 @@ tags:

 # Using Qiniu's AI Models in LobeChat

-<Image alt={
+<Image alt={"Using Qiniu's AI Models in LobeChat"} cover src={'https://github.com/user-attachments/assets/3ad2655e-dd20-4534-bf6d-080b3677df86'} />

 [Qiniu](https://www.qiniu.com), as a long-established cloud service provider, delivers cost-effective and reliable AI inference services for both real-time and batch processing, with a simple and user-friendly experience.

@@ -26,9 +25,10 @@ This document will guide you on how to use Qiniu's AI Models in LobeChat:
 ### Step 1: [Obtain AI Model API Key](https://developer.qiniu.com/aitokenapi/12884/how-to-get-api-key)

 - Method 1: Using Console
+
 1. [Register a Qiniu account](https://s.qiniu.com/umqq6n?ref=developer.qiniu.com\&s_path=%2Faitokenapi%2F12884%2Fhow-to-get-api-key)
 2. [Go to the console to obtain your API Key](https://portal.qiniu.com/ai-inference/api-key)
-
+<Image alt={'Obtain your API Key'} inStep src={'https://static.sufy.com/lobehub/439040511-a014769f-262c-4ee4-a727-2c3c45111574.png'} />

 - Method 2: Using Mini Program
 1. Open the Qiniu mini program
@@ -48,7 +48,7 @@ This document will guide you on how to use Qiniu's AI Models in LobeChat:
 - Open Qiniu and enter the obtained API key.
 - Choose a Qiniu's model for your AI assistant to start the conversation.

-<Image alt={
+<Image alt={"Select a Qiniu's model and start conversation"} inStep src={'https://static.sufy.com/lobehub/439048211-eadae11f-86e8-4a8d-944d-2f984e257356.png'} />

 <Callout type={'warning'}>
 During usage, you may need to pay the API service provider. Please refer to [Qiniu's relevant pricing policies](https://developer.qiniu.com/aitokenapi/12898/ai-token-api-pricing).

package/docs/usage/providers/qiniu.zh-CN.mdx
@@ -23,9 +23,10 @@ tags:
 ### 步骤一:[获取 AI 大模型 API 密钥](https://developer.qiniu.com/aitokenapi/12884/how-to-get-api-key)

 - 方法一:使用控制台获取
+
 1. [注册七牛账号](https://s.qiniu.com/umqq6n?ref=developer.qiniu.com\&s_path=%2Faitokenapi%2F12884%2Fhow-to-get-api-key)
 2. [前往控制台获取 API Key](https://portal.qiniu.com/ai-inference/api-key)
-
+<Image alt={'获取 API Key'} inStep src={'https://static.sufy.com/lobehub/438758098-119239c1-8552-420a-9906-de2eab739fc6.png'} />

 - 方法二:使用小程序获取
 1. 打开七牛小程序

package/docs/usage/providers/qwen.mdx
@@ -1,10 +1,8 @@
 ---
 title: Using Qwen2 API Key in LobeChat
 description: >-
-Learn how to integrate and utilize Tongyi Qianwen, a powerful language model
-by Alibaba Cloud, in LobeChat for various tasks. Follow the steps to activate
-the service, obtain the API key, and configure Tongyi Qianwen for seamless
-interaction.
+Learn how to integrate and utilize Tongyi Qianwen, a powerful language model by Alibaba Cloud, in LobeChat for various tasks. Follow the steps to activate the service, obtain the API key, and configure Tongyi Qianwen for seamless interaction.
+
 tags:
 - Tongyi Qianwen
 - Alibaba Cloud

package/docs/usage/providers/sambanova.mdx
@@ -27,7 +27,8 @@ This document will guide you on how to use SambaNova in LobeChat:
 - Copy the obtained API key and save it securely

 <Callout type={'warning'}>
-Please save the generated API Key securely, as it will only appear once. If you accidentally lose
+Please save the generated API Key securely, as it will only appear once. If you accidentally lose
+it, you will need to create a new API key.
 </Callout>

 ### Step 2: Configure SambaNova in LobeChat

package/docs/usage/providers/sensenova.mdx
@@ -1,8 +1,8 @@
 ---
 title: Using SenseNova in LobeChat
 description: >-
-Learn how to configure and use SenseNova's API Key in LobeChat to start
-conversations and interactions.
+Learn how to configure and use SenseNova's API Key in LobeChat to start conversations and interactions.
+
 tags:
 - LobeChat
 - SenseNova

package/docs/usage/providers/siliconcloud.mdx
@@ -1,8 +1,8 @@
 ---
 title: Using SiliconCloud in LobeChat
 description: >-
-Learn how to integrate and utilize SiliconCloud's language model APIs in
-LobeChat.
+Learn how to integrate and utilize SiliconCloud's language model APIs in LobeChat.
+
 tags:
 - LobeChat
 - SiliconCloud

package/docs/usage/providers/stepfun.mdx
@@ -1,9 +1,8 @@
 ---
 title: Using Stepfun API Key in LobeChat
 description: >-
-Learn how to integrate Stepfun AI models into LobeChat for engaging
-conversations. Obtain Stepfun API key, configure Stepfun in LobeChat settings,
-and select a model to start chatting.
+Learn how to integrate Stepfun AI models into LobeChat for engaging conversations. Obtain Stepfun API key, configure Stepfun in LobeChat settings, and select a model to start chatting.
+
 tags:
 - Stepfun
 - API key

package/docs/usage/providers/taichu.mdx
@@ -1,9 +1,8 @@
 ---
 title: Using Taichu API Key in LobeChat
 description: >-
-Learn how to integrate Taichu AI into LobeChat for enhanced conversational
-experiences. Follow the steps to configure Taichu AI and start using its
-models.
+Learn how to integrate Taichu AI into LobeChat for enhanced conversational experiences. Follow the steps to configure Taichu AI and start using its models.
+
 tags:
 - LobeChat
 - Taichu

package/docs/usage/providers/togetherai.mdx
@@ -1,8 +1,8 @@
 ---
 title: Using Together AI in LobeChat API Key
 description: >-
-Learn how to integrate Together AI into LobeChat, obtain the API key,
-configure settings, and start conversations with AI models.
+Learn how to integrate Together AI into LobeChat, obtain the API key, configure settings, and start conversations with AI models.
+
 tags:
 - Together AI
 - API key

package/docs/usage/providers/vllm.mdx
@@ -21,13 +21,13 @@ This document will guide you on how to use vLLM in LobeChat:

 vLLM has certain requirements for hardware and software environments. Be sure to configure according to the following requirements:

-| Hardware Requirements
-|
-| GPU
-| CPU
+| Hardware Requirements | |
+| --------------------- | ----------------------------------------------------------------------- |
+| GPU | - NVIDIA CUDA <br /> - AMD ROCm <br /> - Intel XPU |
+| CPU | - Intel/AMD x86 <br /> - ARM AArch64 <br /> - Apple silicon |
 | Other AI Accelerators | - Google TPU <br /> - Intel Gaudi <br /> - AWS Neuron <br /> - OpenVINO |

-| Software Requirements
+| Software Requirements |
 | --------------------------------------- |
 | - OS: Linux <br /> - Python: 3.9 – 3.12 |

@@ -55,8 +55,10 @@ This document will guide you on how to use vLLM in LobeChat:
 pip install vllm
 ```

-<Callout type={
-For non-CUDA platforms, please refer to the [official
+<Callout type={'note'}>
+For non-CUDA platforms, please refer to the [official
+documentation](https://docs.vllm.ai/en/latest/getting_started/installation/index.html#installation-index)
+to learn how to install vLLM.
 </Callout>

 ### Step 3: Start Local Service
@@ -72,7 +74,8 @@ This document will guide you on how to use vLLM in LobeChat:
 You can enable the server to check the API key in the header by passing the parameter `--api-key` or the environment variable `VLLM_API_KEY`. If not set, no API Key is required to access.

 <Callout type={'note'}>
-For more detailed vLLM server configuration, please refer to the [official
+For more detailed vLLM server configuration, please refer to the [official
+documentation](https://docs.vllm.ai/en/latest/).
 </Callout>

 ### Step 4: Configure vLLM in LobeChat
@@ -84,15 +87,15 @@ This document will guide you on how to use vLLM in LobeChat:

 - Open the vLLM service provider and fill in the API service address and API Key.

-<Callout type={
-*
-
+<Callout type={'warning'}>
+* If your vLLM is not configured with an API Key, please leave the API Key blank. \* If your vLLM
+is running locally, please make sure to turn on `Client Request Mode`.
 </Callout>

 - Add the model you are running to the model list below.
 - Select a vLLM model to run for your assistant and start the conversation.

-
+<Image alt={'Select vLLM Model'} inStep src={'https://github.com/user-attachments/assets/fcdfb9c5-819a-488f-b28d-0857fe861219'} />
 </Steps>

 Now you can use the models provided by vLLM in LobeChat to have conversations.

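The hunks above mention starting the local service and protecting it with `--api-key` or `VLLM_API_KEY`. As a minimal sketch (assuming the default host/port and an example model and key, none of which are fixed by this diff), a server started with `vllm serve Qwen/Qwen2.5-7B-Instruct --api-key token-abc123` can be exercised like this:

```python
# Minimal sketch: query a local vLLM OpenAI-compatible server.
# Assumptions: default address http://localhost:8000/v1 and the example model/key above.
from openai import OpenAI

client = OpenAI(
    api_key="token-abc123",               # must match --api-key / VLLM_API_KEY; any string if none was set
    base_url="http://localhost:8000/v1",
)

resp = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",     # the model name passed to `vllm serve`
    messages=[{"role": "user", "content": "Hello from a vLLM smoke test."}],
)
print(resp.choices[0].message.content)
```
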
package/docs/usage/providers/vllm.zh-CN.mdx
@@ -55,8 +55,10 @@ tags:
 pip install vllm
 ```

-<Callout type={
-对于非 CUDA
+<Callout type={'note'}>
+对于非 CUDA
+平台,请参考[官方文档](https://docs.vllm.ai/en/latest/getting_started/installation/index.html#installation-index)了解如何安装
+vLLM
 </Callout>

 ### 步骤三:启动本地服务
@@ -84,15 +86,15 @@ tags:

 - 打开 vLLM 服务商并填入 API 服务地址以及 API Key

-<Callout type={
-* 如果你的 vLLM 没有配置 API Key,请将 API Key 留空
-
+<Callout type={'warning'}>
+* 如果你的 vLLM 没有配置 API Key,请将 API Key 留空 \* 如果你的 vLLM
+运行在本地,请确保打开`客户端请求模式`
 </Callout>

 - 在下方的模型列表中添加你运行的模型
 - 为你的助手选择一个 vLLM 运行的模型即可开始对话

-
+<Image alt={'选择 vLLM 模型'} inStep src={'https://github.com/user-attachments/assets/fcdfb9c5-819a-488f-b28d-0857fe861219'} />
 </Steps>

-至此你已经可以在 LobeChat 中使用 vLLM 提供的模型进行对话了。
+至此你已经可以在 LobeChat 中使用 vLLM 提供的模型进行对话了。

package/docs/usage/providers/volcengine.mdx
@@ -8,6 +8,7 @@ tags:
 - API Key
 - Web UI
 ---
+
 # Using Volcengine in LobeChat

 <Image alt={'Using Volcengine in LobeChat'} cover src={'https://github.com/user-attachments/assets/b9da065e-f964-44f2-8260-59e182be2729'} />
@@ -18,30 +19,31 @@ This document will guide you on how to use Volcengine in LobeChat:

 <Steps>
 ### Step 1: Obtain the Volcengine API Key
+
 - First, visit the [Volcengine official website](https://www.volcengine.com/) and complete the registration and login process.
 - Access the Volcengine console and navigate to [Volcano Ark](https://console.volcengine.com/ark/).

-<Image alt={'Entering Volcano Ark API Management Page'} inStep src={'https://github.com/user-attachments/assets/d6ace96f-0398-4847-83e1-75c3004a0e8b'} />
-
-- Go to the `API Key Management` menu and click `Create API Key`.
-- Copy and save the created API Key.
+<Image alt={'Entering Volcano Ark API Management Page'} inStep src={'https://github.com/user-attachments/assets/d6ace96f-0398-4847-83e1-75c3004a0e8b'} />

-
+- Go to the `API Key Management` menu and click `Create API Key`.
+- Copy and save the created API Key.

-
-- Find the `Volcengine` option in the provider list.
+### Step 2: Configure Volcengine in LobeChat

-
+- Navigate to the `Application Settings` page in LobeChat and select `AI Service Providers`.
+- Find the `Volcengine` option in the provider list.

-
-- Choose a Volcengine model for your assistant to start the conversation.
+<Image alt={'Entering Volcengine API Key'} inStep src={'https://github.com/user-attachments/assets/237864d6-cc5d-4fe4-8a2b-c278016855c5'} />

-
+- Open the Volcengine service provider and enter the obtained API Key.
+- Choose a Volcengine model for your assistant to start the conversation.

-<
-During usage, you may need to pay the API service provider, so please refer to Volcengine's pricing policy.
-</Callout>
+<Image alt={'Selecting a Volcengine Model'} inStep src={'https://github.com/user-attachments/assets/702c191f-8250-4462-aed7-accb18b18dea'} />

+<Callout type={'warning'}>
+During usage, you may need to pay the API service provider, so please refer to Volcengine's
+pricing policy.
+</Callout>
 </Steps>

 You can now use the models provided by Volcengine for conversations in LobeChat.

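After creating the key in the Ark console as described above, it can be smoke-tested outside LobeChat. The base URL and the endpoint/model ID below are assumptions about Volcano Ark's OpenAI-compatible API, not values taken from this diff.

```python
# Minimal sketch: test a Volcengine (Volcano Ark) API key.
# Assumptions: the Ark v3 endpoint below and an endpoint/model ID from your own Ark console.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ARK_API_KEY",                           # key from Ark "API Key Management"
    base_url="https://ark.cn-beijing.volces.com/api/v3",  # assumed Ark OpenAI-compatible endpoint
)

resp = client.chat.completions.create(
    model="YOUR_ENDPOINT_OR_MODEL_ID",                    # e.g. an inference endpoint ID created in Ark
    messages=[{"role": "user", "content": "hello"}],
)
print(resp.choices[0].message.content)
```
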
package/docs/usage/providers/wenxin.mdx
@@ -1,8 +1,8 @@
 ---
 title: Using Wenxin Qianfan in LobeChat
 description: >-
-Learn how to integrate and utilize Wenxin Qianfan's language model APIs in
-LobeChat.
+Learn how to integrate and utilize Wenxin Qianfan's language model APIs in LobeChat.
+
 tags:
 - LobeChat
 - 百度

package/docs/usage/providers/xai.mdx
@@ -1,8 +1,8 @@
 ---
 title: Using xAI in LobeChat
 description: >-
-Learn how to configure and use xAI's API Key in LobeChat to start
-conversations and interactions.
+Learn how to configure and use xAI's API Key in LobeChat to start conversations and interactions.
+
 tags:
 - LobeChat
 - xAI