Compare commits


28 Commits

Author SHA1 Message Date
Archer
019bf67e2d Snip test (#3204)
* fix: index

* fix: snapshot error; perf: snapshot diff compare

* perf: init simple edit history
2024-11-21 16:26:43 +08:00
heheer
9b2c3b242a refactor: snapshot store to diff (#3155)
* refactor: snapshot store to diff

* change initial state position

* fix old snapshot format

* encapsulate json diff
2024-11-21 13:12:42 +08:00
Archer
4f55025906 Fix text splitter (#3200)
* fix: text splitter

* perf: splitter
2024-11-21 12:01:55 +08:00
Archer
489bb076a3 Update configuration.md (#3188) 2024-11-19 10:19:21 +08:00
Archer
a9db5b57c5 feat: add training retry time (#3187)
* feat: add training retry time

* remove log
2024-11-18 20:57:03 +08:00
Archer
fdb3720b41 Update 4813.md (#3185) 2024-11-18 18:34:08 +08:00
Archer
00641a8652 Update official_account.md (#3181) 2024-11-18 14:54:35 +08:00
Archer
5c56b375c7 Update official_account.md (#3180) 2024-11-18 14:28:01 +08:00
Archer
d8d9b936c4 fix: custom uid (#3177)
* chat api doc

* fix: custom uid
2024-11-18 10:27:52 +08:00
Archer
b237a3ec55 Update 4813.md (#3174) 2024-11-16 10:32:27 +08:00
Archer
af894680cb Update official_account.md (#3169) 2024-11-15 15:26:28 +08:00
Archer
58745f8c35 4.8.14 test (#3164)
* perf: match base 64 image

* perf: register plugins
2024-11-15 10:35:04 +08:00
Archer
f699061dea Update plugin.ts (#3163) 2024-11-14 23:11:33 +08:00
Archer
3f72f88591 perf: doc2x plugins (#3162) 2024-11-14 21:56:13 +08:00
Menghuan1918
be59c2f6a7 Update Doc2X plugin: adapt to the new API (#3159)
* fix: doc2x no longer has the picture API

* fix: adapt to the doc2x V2 API

* Update to axios to request doc2x

* Add time out
2024-11-14 20:55:37 +08:00
Archer
795904a357 remove log (#3161) 2024-11-14 20:31:01 +08:00
Archer
e8824987fa 4.8.14 test (#3160)
* perf: remove base64 check

* perf: update doc
2024-11-14 18:33:43 +08:00
Archer
b6d650adfb Update 4813.md (#3158) 2024-11-14 16:46:02 +08:00
Archer
710fa37847 fix: chat variable update (#3156)
* perf: file encoding

* fix: chat variable update
2024-11-14 15:50:47 +08:00
Archer
e22031ca6c update doc (#3151) 2024-11-14 10:45:05 +08:00
heheer
0c9e10dd2b feat: add microsoft oauth (#3152) 2024-11-14 10:33:49 +08:00
Archer
0472dc2967 perf: checkbox dom (#3149)
* perf: checkbox dom

* perf: null check

* perf: simple mode variables

* perf: get csv encoding code

* fix: dataset filter

* perf: code editor ui
2024-11-13 18:29:24 +08:00
居里栈栈
98d4a6ee75 Fix garbled code (#3126)
* fix: garbled Chinese characters when batch importing chat guide inputs

* Support non-UTF-8 CSV imports

---------

Co-authored-by: 勤劳上班的卑微小张 <jiazhan.zhang@ggimage.com>
2024-11-13 17:02:32 +08:00
Jiangween
c12159bfb4 Remove some global variables (#3143)
* Restore docSite content based on upstream/4.8.13-dev (#3138)

* Restore docSite content based on upstream/4.8.13-dev

* Add missing corrections to 4813.md

* Remove redundant global variables

* Revert "Restore docSite content based on upstream/4.8.13-dev (#3138)"

This reverts commit af4380a332.

* Hide redundant global variables
2024-11-13 16:59:03 +08:00
Jiangween
abce1e9cf6 Update Chinese documentation content and related images (#3146) 2024-11-13 16:14:03 +08:00
Archer
c3cc51c9a0 fix: share page data;Adapt findLastIndex api (#3147)
* perf: share page data

* perf: adapt findLastIndex
2024-11-13 13:08:34 +08:00
heheer
519b519458 fix extra point number input register (#3145) 2024-11-13 12:01:09 +08:00
Archer
e9d52ada73 4.8.13 feature (#3118)
* chore(ui): login page & workflow page (#3046)

* login page & number input & multirow select & llm select

* workflow

* adjust nodes

* New file upload (#3058)

* feat: toolNode aiNode readFileNode adapt new version

* update docker-compose

* update tip

* feat: adapt new file version

* perf: file input

* fix: ts

* feat: add chat history time label (#3024)

* feat:add chat and logs time

* feat: add chat history time label

* code perf

* code perf

---------

Co-authored-by: 勤劳上班的卑微小张 <jiazhan.zhang@ggimage.com>

* add chatType (#3060)

* pref: slow query of full text search (#3044)

* Adapt findLast api;perf: markdown zh format. (#3066)

* perf: context code

* fix: adapt findLast api

* perf: commercial plugin run error

* perf: markdown zh format

* perf: dockerfile proxy (#3067)

* fix ui (#3065)

* fix ui

* fix

* feat: support array reference multi-select (#3041)

* feat: support array reference multi-select

* fix build

* fix

* fix loop multi-select

* adjust condition

* fix get value

* array and non-array conversion

* fix plugin input

* merge func

* feat: iframe code block;perf: workflow selector type (#3076)

* feat: iframe code block

* perf: workflow selector type

* node pluginoutput check (#3074)

* feat: View will move when workflow check error;fix: ui refresh error when continuous file upload (#3077)

* fix: plugin output check

* fix: ui refresh error when continuous file upload

* feat: View will move when workflow check error

* add dispatch try catch (#3075)

* perf: workflow context split (#3083)

* perf: workflow context split

* perf: context

* 4.8.13 test (#3085)

* perf: workflow node ui

* chat iframe url

* feat: support sub route config (#3071)

* feat: support sub route config

* dockerfile

* fix upload

* delete unused code

* 4.8.13 test (#3087)

* fix: image expired

* fix: datacard navbar ui

* perf: build action

* fix: workflow file upload refresh (#3088)

* fix: http tool response (#3097)

* loop node dynamic height (#3092)

* loop node dynamic height

* fix

* fix

* feat: support push chat log (#3093)

* feat: custom uid/metadata

* to: custom info

* fix: chat push latest

* feat: add chat log envs

* refactor: move timer to pushChatLog

* fix: using precise log

---------

Co-authored-by: Finley Ge <m13203533462@163.com>

* 4.8.13 test (#3098)

* perf: loop node refresh

* rename context

* comment

* fix: ts

* perf: push chat log

* array reference check & node ui (#3100)

* feat: loop start add index (#3101)

* feat: loop start add index

* update doc

* 4.8.13 test (#3102)

* fix: loop index;edge parent check

* perf: reference invalid check

* fix: ts

* fix: plugin select files and ai response check (#3104)

* fix: plugin select files and ai response check

* perf: text editor selector;tool call tip;remove invalid image url;

* perf: select file

* perf: drop files

* feat: source id prefix env (#3103)

* 4.8.13 test (#3106)

* perf: select file

* perf: drop files

* perf: env template

* 4.8.13 test (#3107)

* perf: select file

* perf: drop files

* fix: simple mode adapt files

* perf: push chat log (#3109)

* fix: share page load title error (#3111)

* 4.8.13 perf (#3112)

* fix: share page load title error

* update file input doc

* perf: auto add file urls

* perf: auto set loop node offset height

* 4.8.13 test (#3117)

* perf: plugin

* update action

* feat: add more share config (#3120)

* feat: add more share config

* add i18n en

* fix: missing subroute (#3121)

* perf: outlink config (#3128)

* update action

* perf: outlink config

* fix: ts (#3129)

* Update docSite documentation content (#3131)

* fix: null pointer (#3130)

* fix: null pointer

* perf: not input text

* update doc url

* perf: outlink default value (#3134)

* update doc (#3136)

* 4.8.13 test (#3137)

* update doc

* perf: completions chat api

* Restore docSite content based on upstream/4.8.13-dev (#3138)

* Restore docSite content based on upstream/4.8.13-dev

* Add missing corrections to 4813.md

* update doc (#3141)

---------

Co-authored-by: heheer <heheer@sealos.io>
Co-authored-by: papapatrick <109422393+Patrickill@users.noreply.github.com>
Co-authored-by: 勤劳上班的卑微小张 <jiazhan.zhang@ggimage.com>
Co-authored-by: Finley Ge <32237950+FinleyGe@users.noreply.github.com>
Co-authored-by: a.e. <49438478+I-Info@users.noreply.github.com>
Co-authored-by: Finley Ge <m13203533462@163.com>
Co-authored-by: Jiangween <145003935+Jiangween@users.noreply.github.com>
2024-11-13 11:29:53 +08:00
120 changed files with 1606 additions and 2995 deletions

10 binary image files changed (4 modified, 6 added; content not shown).

@@ -23,6 +23,7 @@ weight: 708
"systemEnv": {
"vectorMaxProcess": 15,
"qaMaxProcess": 15,
"tokenWorkers": 50, // Token 计算线程保持数,会持续占用内存,不能设置太大。
"pgHNSWEfSearch": 100 // 向量搜索参数。越大搜索越精确但是速度越慢。设置为100有99%+精度。
},
"llmModels": [


@@ -35,9 +35,10 @@ curl --location --request POST 'http://localhost:3000/api/v1/chat/completions' \
--header 'Authorization: Bearer fastgpt-xxxxxx' \
--header 'Content-Type: application/json' \
--data-raw '{
"chatId": "abcd",
"chatId": "my_chatId",
"stream": false,
"detail": false,
"responseChatItemId": "my_responseChatItemId",
"variables": {
"uid": "asdfadsfasfd2323",
"name": "张三"
@@ -104,6 +105,7 @@ curl --location --request POST 'http://localhost:3000/api/v1/chat/completions' \
- `undefined` 时(不传入),不使用 FastGPT 提供的上下文功能,完全通过传入的 messages 构建上下文。 不会将你的记录存储到数据库中,你也无法在记录汇总中查阅到。
- `非空字符串`时,意味着使用 chatId 进行对话,自动从 FastGPT 数据库取历史记录,并使用 messages 数组最后一个内容作为用户问题。请自行确保 chatId 唯一长度小于250通常可以是自己系统的对话框ID。
- messages: 结构与 [GPT接口](https://platform.openai.com/docs/api-reference/chat/object) chat模式一致。
- responseChatItemId: string | undefined 。如果传入,则会将该值作为本次对话的响应消息的 IDFastGPT 会自动将该 ID 存入数据库。请确保,在当前`chatId`下,`responseChatItemId`是唯一的。
- detail: 是否返回中间值(模块状态,响应的完整结果等),`stream模式`下会通过`event`进行区分,`非stream模式`结果保存在`responseData`中。
- variables: 模块变量,一个对象,会替换模块中,输入框内容里的`{{key}}`
{{% /alert %}}
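As a quick illustration of the request shape described above, here is a minimal sketch that builds the body for `/api/v1/chat/completions` with the new `responseChatItemId` field. The field names come from the doc; the helper itself is hypothetical, not part of an official SDK.

```typescript
// Hypothetical helper: assembles a FastGPT chat-completions request body.
type ChatBody = {
  chatId?: string;
  responseChatItemId?: string;
  stream: boolean;
  detail: boolean;
  variables: Record<string, string>;
  messages: { role: 'user' | 'assistant'; content: string }[];
};

function buildChatBody(question: string, opts: Partial<ChatBody> = {}): ChatBody {
  return {
    stream: false,
    detail: false,
    variables: {},
    // When a chatId is supplied, FastGPT treats the last message as the
    // user question and pulls history from its own database.
    messages: [{ role: 'user', content: question }],
    ...opts
  };
}
```

Omitting `chatId` means context is built purely from `messages` and nothing is persisted, matching the `undefined` case described above.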


@@ -32,7 +32,7 @@ curl --location --request POST 'https://{{host}}/api/admin/initv464' \
4. 优化 - 历史记录模块。弃用旧的历史记录模块,直接在对应地方填写数值即可。
5. 调整 - 知识库搜索模块 topk 逻辑,采用 MaxToken 计算,兼容不同长度的文本块
6. 调整鉴权顺序,提高 apikey 的优先级避免cookie抢占 apikey 的鉴权。
7. 链接读取支持多选择器。参考[Web 站点同步用法](/docs/course/websync)
7. 链接读取支持多选择器。参考[Web 站点同步用法](/docs/guide/knowledge_base/websync/)
8. 修复 - 分享链接图片上传鉴权问题
9. 修复 - Mongo 连接池未释放问题。
10. 修复 - Dataset Intro 无法更新


@@ -21,10 +21,10 @@ weight: 831
## V4.6.5 功能介绍
1. 新增 - [问题优化模块](/docs/workflow/modules/coreferenceresolution/)
2. 新增 - [文本编辑模块](/docs/workflow/modules/text_editor/)
3. 新增 - [判断器模块](/docs/workflow/modules/tfswitch/)
4. 新增 - [自定义反馈模块](/docs/workflow/modules/custom_feedback/)
1. 新增 - [问题优化模块](/docs/guide/workbench/workflow/coreferenceresolution/)
2. 新增 - [文本编辑模块](/docs/guide/workbench/workflow/text_editor/)
3. 新增 - [判断器模块](/docs/guide/workbench/workflow/tfswitch/)
4. 新增 - [自定义反馈模块](/docs/guide/workbench/workflow/custom_feedback/)
5. 新增 - 【内容提取】模块支持选择模型,以及字段枚举
6. 优化 - docx读取兼容表格表格转markdown
7. 优化 - 高级编排连接线交互


@@ -25,7 +25,7 @@ weight: 830
1. 查看 [FastGPT 2024 RoadMap](https://github.com/labring/FastGPT?tab=readme-ov-file#-%E5%9C%A8%E7%BA%BF%E4%BD%BF%E7%94%A8)
2. 新增 - Http 模块请求头支持 Json 编辑器。
3. 新增 - [ReRank模型部署](/docs/development/custom-models/reranker/)
3. 新增 - [ReRank模型部署](/docs/development/custom-models/bge-rerank/)
4. 新增 - 搜索方式:分离向量语义检索,全文检索和重排,通过 RRF 进行排序合并。
5. 优化 - 问题分类提示词id引导。测试国产商用 api 模型(百度阿里智谱讯飞)使用 Prompt 模式均可分类。
6. UI 优化未来将逐步替换新的UI设计。


@@ -91,7 +91,7 @@ curl --location --request POST 'https://{{host}}/api/init/v468' \
1. 新增 - 知识库搜索合并模块。
2. 新增 - 新的 Http 模块,支持更加灵活的参数传入。同时支持了输入输出自动数据类型转化,例如:接口输出的 JSON 类型会自动转成字符串类型,直接给其他模块使用。此外,还补充了一些例子,可在文档中查看。
3. 优化 - 内容补全。将内容补全内置到【知识库搜索】中并实现了一次内容补全即可完成“指代消除”和“问题扩展”。FastGPT知识库搜索详细流程可查看[知识库搜索介绍](/docs/course/data_search/)
3. 优化 - 内容补全。将内容补全内置到【知识库搜索】中并实现了一次内容补全即可完成“指代消除”和“问题扩展”。FastGPT知识库搜索详细流程可查看[知识库搜索介绍](/docs/guide/workbench/workflow/dataset_search/)
4. 优化 - LLM 模型配置,不再区分对话、分类、提取模型。同时支持模型的默认参数,避免不同模型参数冲突,可通过`defaultConfig`传入默认的配置。
5. 优化 - 流响应,参考了`ChatNextWeb`的流,更加丝滑。此外,之前提到的乱码、中断,刷新后又正常了,可能会修复)
6. 修复 - 语音输入文件无法上传。


@@ -1,5 +1,5 @@
---
title: 'V4.8.13(进行中)'
title: 'V4.8.13'
description: 'FastGPT V4.8.13 更新说明'
icon: 'upgrade'
draft: false
@@ -13,13 +13,17 @@ weight: 811
### 2. 修改镜像
- 更新 FastGPT 镜像 tag: v4.8.13-alpha
- 更新 FastGPT 管理端镜像 tag: v4.8.13-alpha fastgpt-pro镜像
- 更新 FastGPT 镜像 tag: v4.8.13-fix
- 更新 FastGPT 商业版镜像 tag: v4.8.13-fix fastgpt-pro镜像
- Sandbox 镜像,可以不更新
### 3. 调整文件上传编排
### 3. 添加环境变量
虽然依然兼容旧版的文件上传编排,但是未来两个版本内将会去除兼容代码,请尽快调整编排,以适应最新的文件上传逻辑。尤其是嵌套应用的文件传递,未来将不会自动传递,必须手动指定传递的文件
- 给 fastgpt 和 fastgpt-pro 镜像添加环境变量:`FE_DOMAIN=http://xx.com`,值为 fastgpt 前端访问地址,注意后面不要加`/`。可以自动补齐相对文件地址的前缀
### 4. 调整文件上传编排
虽然依然兼容旧版的文件上传编排,但是未来两个版本内将会去除兼容代码,请尽快调整编排,以适应最新的文件上传逻辑。尤其是嵌套应用的文件传递,未来将不会自动传递,必须手动指定传递的文件。具体内容可参考: [文件上传变更](/docs/guide/course/fileinput/#4813%E7%89%88%E6%9C%AC%E8%B5%B7%E5%85%B3%E4%BA%8E%E6%96%87%E4%BB%B6%E4%B8%8A%E4%BC%A0%E7%9A%84%E6%9B%B4%E6%96%B0)
## 更新说明
@@ -39,6 +43,9 @@ weight: 811
14. 优化 - Markdown 组件自动空格,避免分割 url 中的中文。
15. 优化 - 工作流上下文拆分,性能优化。
16. 优化 - 语音播报,不支持 mediaSource 的浏览器可等待完全生成语音后输出。
17. 修复 - Dockerfile pnpm install 支持代理。
18. 修复 - BI 图表生成无法写入文件。同时优化其解析,支持数字类型数组。
19. 修复 - 分享链接首次加载时,标题显示不正确。
17. 优化 - 对话引导 csv 读取,自动识别编码
18. 优化 - csv 导入问题引导可能乱码
19. 修复 - Dockerfile pnpm install 支持代理。
20. 修复 - BI 图表生成无法写入文件。同时优化其解析,支持数字类型数组。
21. 修复 - 分享链接首次加载时,标题显示不正确。


@@ -0,0 +1,19 @@
---
title: 'V4.8.14(进行中)'
description: 'FastGPT V4.8.14 更新说明'
icon: 'upgrade'
draft: false
toc: true
weight: 810
---
## 更新预告
1.
2. 新增 - 工作流支持进入聊天框/点击开始对话后,自动触发一轮对话。
3. 新增 - 重写 chatContext对话测试也会有日志并且刷新后不会丢失对话。
4. 新增 - 分享链接支持配置是否允许查看原文。
5. 优化 - 工作流 ui 细节。
6. 优化 - 应用编辑记录采用 diff 存储,避免浏览器溢出。
7. 修复 - 分块策略,四级标题会被丢失。 同时新增了五级标题的支持。
8. 修复 - MongoDB 知识库集合唯一索引。


@@ -66,7 +66,7 @@ Tips: 可以通过点击上下文按键查看完整的上下文组成,便于
FastGPT 知识库采用 QA 对(不一定都是问答格式,仅代表两个变量)的格式存储,在转义成字符串时候会根据**引用模板**来进行格式化。知识库包含多个可用变量: q, a, sourceId数据的ID, index(第n个数据), source(数据的集合名、文件名)score(距离得分0-1) 可以通过 {{q}} {{a}} {{sourceId}} {{index}} {{source}} {{score}} 按需引入。下面一个模板例子:
可以通过 [知识库结构讲解](/docs/course/dataset_engine/) 了解详细的知识库的结构。
可以通过 [知识库结构讲解](/docs/guide/knowledge_base/dataset_engine/) 了解详细的知识库的结构。
#### 引用模板


@@ -30,5 +30,5 @@ weight: 232
{{% alert icon="🍅" context="success" %}}
具体配置参数介绍可以参考: [AI参数配置说明](/docs/course/ai_settings)
具体配置参数介绍可以参考: [AI参数配置说明](/docs/guide/course/ai_settings/)
{{% /alert %}}


@@ -36,4 +36,4 @@ weight: 264
## 示例
- [接入谷歌搜索](/docs/workflow/examples/google_search/)
- [接入谷歌搜索](/docs/use-cases/app-cases/google_search/)


@@ -5,4 +5,28 @@ icon: "form_input"
draft: false
toc: true
weight: 244
---
---
## 特点
- 用户交互
- 可重复添加
- 触发执行
![](/imgs/form_input1.png)
## 功能
「表单输入」节点属于用户交互节点,当触发这个节点时,对话会进入“交互”状态,会记录工作流的状态,等用户完成交互后,继续向下执行工作流
![](/imgs/form_input2.png)
比如上图中的例子,当触发表单输入节点时,对话框隐藏,对话进入“交互状态”
![](/imgs/form_input3.png)
当用户填完必填的信息并点击提交后,节点能够收集用户填写的表单信息,传递到后续的节点中使用
## 作用
能够精准收集需要的用户信息,再根据用户信息进行后续操作


@@ -250,6 +250,6 @@ export default async function (ctx: FunctionContext) {
## 相关示例
- [谷歌搜索](/docs/workflow/examples/google_search/)
- [发送飞书webhook](/docs/workflow/examples/feishu_webhook/)
- [实验室预约(操作数据库)](/docs/workflow/examples/lab_appointment/)
- [谷歌搜索](/docs/use-cases/app-cases/google_search/)
- [发送飞书webhook](/docs/use-cases/app-cases/feishu_webhook/)
- [实验室预约(操作数据库)](/docs/use-cases/app-cases/lab_appointment/)


@@ -29,4 +29,4 @@ weight: 246
## 示例
- [接入谷歌搜索](/docs/workflow/examples/google_search/)
- [接入谷歌搜索](/docs/use-cases/app-cases/google_search/)


@@ -7,20 +7,21 @@ toc: true
weight: 236
---
![](/imgs/flow-tool1.png)
## 什么是工具
### **什么是工具**
工具可以是一个系统模块例如AI对话、知识库搜索、HTTP模块等。也可以是一个插件。
工具调用可以让 LLM 更动态的决策流程而不都是固定的流程。当然缺点就是费tokens
## 工具的组成
### **工具的组成**
1. 工具介绍。通常是模块的介绍或插件的介绍这个介绍会告诉LLM这个工具的作用是什么。
2. 工具参数。对于系统模块来说,工具参数已经是固定的,无需额外配置。对于插件来说,工具参数是一个可配置项。
## 工具是如何运行的
### **工具是如何运行的**
要了解工具如何运行的,首先需要知道它的运行条件。
@@ -29,43 +30,57 @@ weight: 236
结合工具的介绍、参数介绍和参数是否必须LLM会决定是否调用这个工具。有以下几种情况
1. 无参数的工具:直接根据工具介绍,决定是否需要执行。例如:获取当前时间。
2. 有参数的工具:
1. 无必须的参数尽管上下文中没有适合的参数也可以调用该工具。但有时候LLM会自己伪造一个参数。
2. 有必须的参数如果没有适合的参数LLM可能不会调用该工具。可以通过提示词引导用户提供参数。
#### **工具调用逻辑**
在支持`函数调用`的模型中,可以一次性调用多个工具,调用逻辑如下:
![](/imgs/flow-tool2.png)
## 怎么用
### **怎么用**
| 有工具调用模块 | 无工具调用模块 |
| --- | --- |
| ![](/imgs/flow-tool3.png) | ![](/imgs/flow-tool4.png) |
<div style="display: flex; gap: 10px;">
<img src="/imgs/flow-tool3.png" alt="工具调用模块示例 3" width="40%" />
<img src="/imgs/flow-tool4.png" alt="工具调用模块示例 4" width="60%" />
</div>
高级编排中,托动工具调用的连接点,可用的工具头部会出现一个菱形,可以将它与工具调用模块底部的菱形相连接。
<!-- ![](/imgs/flow-tool3.png)!![](/imgs/flow-tool4.png) -->
被连接的工具,会自动分离工具输入与普通的输入,并且可以编辑`描述`,可以通过调整介绍,使得该工具调用时机更加精确。对于一些内置的节点,务必修改`描述`才能让模型正常调用
高级编排中,拖动工具调用的连接点,可用的工具头部会出现一个菱形,可以将它与工具调用模块底部的菱形相连接
被连接的工具,会自动分离工具输入与普通的输入,并且可以编辑`介绍`,可以通过调整介绍,使得该工具调用时机更加精确。
关于工具调用,如何调试仍然是一个玄学,所以建议,不要一次性增加太多工具,选择少量工具调优后再进一步尝试。
## 组合节点
#### 用途
### 工具调用终止
默认清空下工具调用节点在决定调用工具后会将工具运行的结果返回给AI让 AI 对工具运行的结果进行总结输出。有时候,如果你不需要 AI 进行进一步的总结输出,可以使用该节点,将其接入对于工具流程的末尾。
工具调用默认会把子流程运行的结果作为`工具结果`,返回给模型进行回答。有时候,你可能不希望模型做回答,你可以给对应子流程的末尾增加上一个`工具调用终止`节点,这样,子流程的结果就不会被返回给模型
如下图,在执行知识库搜索后,发送给了 HTTP 请求,搜索将不会返回搜索的结果给工具调用进行 AI 总结
![alt text](/imgs/image-3.png)
![](/imgs/flow-tool5.png)
### 附加节点
当您使用了工具调用节点,同时就会出现工具调用终止节点和自定义变量节点,能够进一步提升工具调用的使用体验。
#### 工具调用终止
工具调用终止可用于结束本次调用,即可以接在某个工具后面,当工作流执行到这个节点时,便会强制结束本次工具调用,不再调用其他工具,也不会再调用 AI 针对工具调用结果回答问题。
![](/imgs/flow-tool6.png)
### 自定义工具变量
工具调用的子流程运行,有时候会依赖`AI`生成的一些变量,为了简化交互流程,我们给系统内置的节点都指定了`工具变量`。然而,有些时候,你需要的变量不仅是目标流程的`首个节点`的变量,而是需要更复杂的变量,此时你可以使用`自定义工具变量`。它允许你完全自定义该`工具流程`的变量
自定义变量可以扩展工具的变量输入,即对于一些未被视作工具参数或无法工具调用的节点,可以自定义工具变量,填上对应的参数描述,那么工具调用便会相对应的调用这个节点,进而调用其之后的工作流
![alt text](/imgs/image-4.png)
![](/imgs/flow-tool7.png)
## 相关示例
### **相关示例**
- [谷歌搜索](/docs/workflow/examples/google_search/)
- [发送飞书webhook](/docs/workflow/examples/feishu_webhook/)
- [谷歌搜索](https://doc.fastgpt.in/docs/use-cases/app-cases/google_search/)
- [发送飞书webhook](https://doc.fastgpt.in/docs/use-cases/app-cases/feishu_webhook/)


@@ -39,44 +39,45 @@ weight: 506
海外版用户cloud.tryfastgpt.ai)可以填写下面的 IP 白名单:
```
34.87.20.17
35.247.161.35
34.87.51.146
34.87.110.152
35.247.163.68
34.126.163.205
34.87.20.189
34.87.102.86
35.240.227.100
35.198.192.104
34.143.149.171
34.87.152.33
34.124.237.188
35.197.149.75
34.87.44.74
34.124.189.116
34.87.79.202
34.87.173.252
34.143.240.160
34.87.180.104
34.87.51.146
34.87.79.202
35.247.163.68
34.87.102.86
35.198.192.104
34.126.163.205
34.124.189.116
34.143.149.171
34.87.173.252
34.142.157.52
34.87.180.104
34.87.20.189
34.87.110.152
34.87.44.74
34.87.152.33
35.197.149.75
35.247.161.35
```
国内版用户fastgpt.cn)可以填写下面的 IP 白名单:
```
47.97.59.172
121.43.108.48
121.41.75.88
47.97.1.240
121.43.105.217
121.41.178.7
121.40.65.187
121.196.235.183
120.55.195.90
120.55.193.112
120.26.229.115
112.124.41.79
47.97.59.172
101.37.205.32
120.55.195.90
120.26.229.115
120.55.193.112
47.98.190.173
112.124.41.79
121.196.235.183
121.41.75.88
121.43.108.48
```
## 4. 获取AES Key选择加密方式


@@ -11,7 +11,7 @@ weight: 509
[chatgpt-on-wechat GitHub 地址](https://github.com/zhayujie/chatgpt-on-wechat)
由于 FastGPT 的 API 接口和 OpenAI 的规范一致,可以无需变更原来的应用即可使用 FastGPT 上编排好的应用。API 使用可参考 [这篇文章](/docs/course/openapi/)。编排示例,可参考 [高级编排介绍](/docs/workflow/intro)
由于 FastGPT 的 API 接口和 OpenAI 的规范一致,可以无需变更原来的应用即可使用 FastGPT 上编排好的应用。API 使用可参考 [这篇文章](/docs/use-cases/external-integration/openapi/)。编排示例,可参考 [高级编排介绍](/docs/workflow/intro)
## 1. 获取 OpenAPI 密钥


@@ -139,8 +139,6 @@ services:
- OPENAI_BASE_URL=http://oneapi:3000/v1
# AI模型的API Key。这里默认填写了OneAPI的快速默认key测试通后务必及时修改
- CHAT_API_KEY=sk-fastgpt
# 是否将图片转成 base64 传递给模型,本地开发和内网环境使用共有模型时候需要设置为 true
- MULTIPLE_DATA_TO_BASE64=false
# 数据库最大连接数
- DB_MAX_LINK=30
# 登录凭证密钥


@@ -97,8 +97,6 @@ services:
- OPENAI_BASE_URL=http://oneapi:3000/v1
# AI模型的API Key。这里默认填写了OneAPI的快速默认key测试通后务必及时修改
- CHAT_API_KEY=sk-fastgpt
# 是否将图片转成 base64 传递给模型,本地开发和内网环境使用共有模型时候需要设置为 true
- MULTIPLE_DATA_TO_BASE64=false
# 数据库最大连接数
- DB_MAX_LINK=30
# 登录凭证密钥


@@ -77,8 +77,6 @@ services:
- OPENAI_BASE_URL=http://oneapi:3000/v1
# AI模型的API Key。这里默认填写了OneAPI的快速默认key测试通后务必及时修改
- CHAT_API_KEY=sk-fastgpt
# 是否将图片转成 base64 传递给模型,本地开发和内网环境使用共有模型时候需要设置为 true
- MULTIPLE_DATA_TO_BASE64=false
# 数据库最大连接数
- DB_MAX_LINK=30
# 登录凭证密钥


@@ -16,7 +16,7 @@ export const bucketNameMap = {
}
};
export const ReadFileBaseUrl = `${process.env.FE_DOMAIN || ''}${process.env.NEXT_PUBLIC_BASE_URL}/api/common/file/read`;
export const ReadFileBaseUrl = `${process.env.FE_DOMAIN || ''}${process.env.NEXT_PUBLIC_BASE_URL || ''}/api/common/file/read`;
export const documentFileType = '.txt, .docx, .csv, .xlsx, .pdf, .md, .html, .pptx';
export const imageFileType =
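The one-character fix above (`|| ''` on `NEXT_PUBLIC_BASE_URL`) guards against a template literal interpolating an unset env var as the string "undefined". A standalone sketch of the pattern, with hypothetical names:

```typescript
// Hypothetical sketch of the ReadFileBaseUrl pattern: every env segment
// falls back to '' so an unset variable cannot leak "undefined" into the URL.
function readFileBaseUrl(feDomain?: string, basePath?: string): string {
  return `${feDomain || ''}${basePath || ''}/api/common/file/read`;
}
```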


@@ -1,5 +1,6 @@
import { batchRun } from '../fn/utils';
import { simpleText } from './tools';
import { getNanoid, simpleText } from './tools';
import type { ImageType } from '../../../service/worker/readFile/type';
/* Delete redundant text in markdown */
export const simpleMarkdownText = (rawText: string) => {
@@ -92,3 +93,25 @@ export const markdownProcess = async ({
return simpleMarkdownText(imageProcess);
};
export const matchMdImgTextAndUpload = (text: string) => {
const base64Regex = /"(data:image\/[^;]+;base64[^"]+)"/g;
const imageList: ImageType[] = [];
const images = Array.from(text.match(base64Regex) || []);
for (const image of images) {
const uuid = `IMAGE_${getNanoid(12)}_IMAGE`;
const mime = image.split(';')[0].split(':')[1];
const base64 = image.split(',')[1];
text = text.replace(image, uuid);
imageList.push({
uuid,
base64,
mime
});
}
return {
text,
imageList
};
};
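To see what `matchMdImgTextAndUpload` accomplishes, here is a simplified, self-contained sketch of the same idea: pull base64 data URIs out of markdown and swap each for a placeholder token. The sequential id generator is a stub standing in for `getNanoid`; the regex is the one from the diff.

```typescript
type ImageType = { uuid: string; base64: string; mime: string };

// Simplified sketch (stubbed ids): extract quoted base64 data URIs and
// replace each with a placeholder so the text can be stored without blobs.
function extractBase64Images(text: string): { text: string; imageList: ImageType[] } {
  const base64Regex = /"(data:image\/[^;]+;base64[^"]+)"/g;
  const imageList: ImageType[] = [];
  let n = 0;
  for (const image of Array.from(text.match(base64Regex) || [])) {
    const inner = image.slice(1, -1); // drop the surrounding quotes
    const uuid = `IMAGE_${n++}_IMAGE`; // stub id, not getNanoid(12)
    imageList.push({
      uuid,
      mime: inner.split(';')[0].split(':')[1], // "data:image/png..." -> "image/png"
      base64: inner.split(',')[1]
    });
    text = text.replace(image, `"${uuid}"`);
  }
  return { text, imageList };
}
```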


@@ -99,7 +99,7 @@ ${mdSplitString}
5. 标点分割:重叠
*/
const commonSplit = (props: SplitProps): SplitResponse => {
let { text = '', chunkLen, overlapRatio = 0.2, customReg = [] } = props;
let { text = '', chunkLen, overlapRatio = 0.15, customReg = [] } = props;
const splitMarker = 'SPLIT_HERE_SPLIT_HERE';
const codeBlockMarker = 'CODE_BLOCK_LINE_MARKER';
@@ -113,6 +113,8 @@ const commonSplit = (props: SplitProps): SplitResponse => {
text = text.replace(/(\r?\n|\r){3,}/g, '\n\n\n');
// The larger maxLen is, the next sentence is less likely to trigger splitting
const markdownIndex = 4;
const forbidOverlapIndex = 8;
const stepReges: { reg: RegExp; maxLen: number }[] = [
...customReg.map((text) => ({
reg: new RegExp(`(${replaceRegChars(text)})`, 'g'),
@@ -122,9 +124,11 @@ const commonSplit = (props: SplitProps): SplitResponse => {
{ reg: /^(##\s[^\n]+\n)/gm, maxLen: chunkLen * 1.4 },
{ reg: /^(###\s[^\n]+\n)/gm, maxLen: chunkLen * 1.6 },
{ reg: /^(####\s[^\n]+\n)/gm, maxLen: chunkLen * 1.8 },
{ reg: /^(#####\s[^\n]+\n)/gm, maxLen: chunkLen * 1.8 },
{ reg: /([\n]([`~]))/g, maxLen: chunkLen * 4 }, // code block
{ reg: /([\n](?!\s*[\*\-|>0-9]))/g, maxLen: chunkLen * 2 }, // 增大块,尽可能保证它是一个完整的段落。 (?![\*\-|>`0-9]): markdown special char
{ reg: /([\n](?=\s*[0-9]+\.))/g, maxLen: chunkLen * 2 }, // 增大块,尽可能保证它是一个完整的段落。 (?![\*\-|>`0-9]): markdown special char
{ reg: /(\n{2,})/g, maxLen: chunkLen * 1.6 },
{ reg: /([\n])/g, maxLen: chunkLen * 1.2 },
// ------ There's no overlap on the top
{ reg: /([。]|([a-zA-Z])\.\s)/g, maxLen: chunkLen * 1.2 },
@@ -136,8 +140,9 @@ const commonSplit = (props: SplitProps): SplitResponse => {
const customRegLen = customReg.length;
const checkIsCustomStep = (step: number) => step < customRegLen;
const checkIsMarkdownSplit = (step: number) => step >= customRegLen && step <= 3 + customRegLen;
const checkForbidOverlap = (step: number) => step <= 6 + customRegLen;
const checkIsMarkdownSplit = (step: number) => step >= customRegLen && step <= markdownIndex + customReg.length;
const checkForbidOverlap = (step: number) => step <= forbidOverlapIndex + customReg.length;
// if use markdown title split, Separate record title
const getSplitTexts = ({ text, step }: { text: string; step: number }) => {
@@ -231,7 +236,7 @@ const commonSplit = (props: SplitProps): SplitResponse => {
// use slice-chunkLen to split text
const chunks: string[] = [];
for (let i = 0; i < text.length; i += chunkLen - overlapLen) {
chunks.push(`${parentTitle}${text.slice(i, i + chunkLen)}`);
chunks.push(text.slice(i, i + chunkLen));
}
return chunks;
}
@@ -241,7 +246,6 @@ const commonSplit = (props: SplitProps): SplitResponse => {
const maxLen = splitTexts.length > 1 ? stepReges[step].maxLen : chunkLen;
const minChunkLen = chunkLen * 0.7;
// console.log(splitTexts, stepReges[step].reg);
const chunks: string[] = [];
for (let i = 0; i < splitTexts.length; i++) {
@@ -249,12 +253,34 @@ const commonSplit = (props: SplitProps): SplitResponse => {
const lastTextLen = lastText.length;
const currentText = item.text;
const currentTextLen = currentText.length;
const newText = lastText + currentText;
const newTextLen = lastTextLen + currentTextLen;
const newTextLen = newText.length;
// Markdown 模式下,会强制向下拆分最小块,并在最后一个标题时候,给小块都补充上所有标题(包含父级标题)
if (isMarkdownStep) {
// split new Text, split chunks must will greater 1 (small lastText)
const innerChunks = splitTextRecursively({
text: newText,
step: step + 1,
lastText: '',
parentTitle: parentTitle + item.title
});
const lastChunk = innerChunks[innerChunks.length - 1];
if (!lastChunk) continue;
chunks.push(
...innerChunks.map(
(chunk) =>
step === markdownIndex + customRegLen ? `${parentTitle}${item.title}${chunk}` : chunk // 合并进 Markdown 分块时,需要补标题
)
);
continue;
}
// newText is too large(now, The lastText must be smaller than chunkLen)
if (newTextLen > maxLen || isMarkdownStep) {
if (newTextLen > maxLen) {
// lastText greater minChunkLen, direct push it to chunks, not add to next chunk. (large lastText)
if (lastTextLen > minChunkLen) {
chunks.push(lastText);
@@ -278,15 +304,6 @@ const commonSplit = (props: SplitProps): SplitResponse => {
if (!lastChunk) continue;
if (forbidConcat) {
chunks.push(
...innerChunks.map(
(chunk) => (step === 3 + customRegLen ? `${parentTitle}${chunk}` : chunk) // 合并进 Markdown 分块时,需要补标题
)
);
continue;
}
// last chunk is too small, concat it to lastText(next chunk start)
if (lastChunk.length < minChunkLen) {
chunks.push(...innerChunks.slice(0, -1));
@@ -304,11 +321,11 @@ const commonSplit = (props: SplitProps): SplitResponse => {
continue;
}
// new text is small
// New text is small
// Not overlap
if (forbidConcat) {
chunks.push(`${parentTitle}${item.title}${item.text}`);
chunks.push(item.text);
continue;
}
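The fallback slicing loop above (`i += chunkLen - overlapLen`) is a plain sliding window: fixed-size chunks that step by less than their length so adjacent chunks share `overlapLen` characters. A minimal standalone sketch, using the diff's new 0.15 default overlap ratio (the helper name is illustrative):

```typescript
// Sliding-window chunking: adjacent chunks overlap by overlapLen characters.
function sliceChunks(text: string, chunkLen: number, overlapRatio = 0.15): string[] {
  const overlapLen = Math.floor(chunkLen * overlapRatio);
  const chunks: string[] = [];
  for (let i = 0; i < text.length; i += chunkLen - overlapLen) {
    chunks.push(text.slice(i, i + chunkLen));
  }
  return chunks;
}
```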


@@ -56,6 +56,7 @@ export type FastGPTFeConfigsType = {
github?: string;
google?: string;
wechat?: string;
microsoft?: string;
};
limit?: {
exportDatasetLimitMinutes?: number;


@@ -78,11 +78,15 @@ export const getHistoryPreview = (
};
export const filterPublicNodeResponseData = ({
flowResponses = []
flowResponses = [],
responseDetail = false
}: {
flowResponses?: ChatHistoryItemResType[];
responseDetail?: boolean;
}) => {
const filedList = ['quoteList', 'moduleType', 'pluginOutput', 'runningTime'];
const filedList = responseDetail
? ['quoteList', 'moduleType', 'pluginOutput', 'runningTime']
: ['moduleType', 'pluginOutput', 'runningTime'];
const filterModuleTypeList: any[] = [
FlowNodeTypeEnum.pluginModule,
FlowNodeTypeEnum.datasetSearchNode,

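The `filterPublicNodeResponseData` change above whitelists which response fields a public (share-page) caller may see, dropping `quoteList` unless `responseDetail` is set. A simplified, hypothetical sketch of that whitelist idea (field names from the diff; the helper shape is not the actual implementation):

```typescript
type NodeResponse = Record<string, unknown>;

// Keep only the whitelisted fields; quoteList is visible only when
// responseDetail is true, so cited source text is not leaked by default.
function pickPublicFields(resp: NodeResponse, responseDetail = false): NodeResponse {
  const fieldList = responseDetail
    ? ['quoteList', 'moduleType', 'pluginOutput', 'runningTime']
    : ['moduleType', 'pluginOutput', 'runningTime'];
  return Object.fromEntries(Object.entries(resp).filter(([key]) => fieldList.includes(key)));
}
```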

@@ -95,10 +95,10 @@ export const DatasetSearchModule: FlowNodeTemplateType = {
},
{
key: NodeInputKeyEnum.collectionFilterMatch,
renderTypeList: [FlowNodeInputTypeEnum.JSONEditor, FlowNodeInputTypeEnum.reference],
renderTypeList: [FlowNodeInputTypeEnum.textarea, FlowNodeInputTypeEnum.reference],
label: i18nT('workflow:collection_metadata_filter'),
valueType: WorkflowIOValueTypeEnum.object,
valueType: WorkflowIOValueTypeEnum.string,
isPro: true,
description: i18nT('workflow:filter_description')
}


@@ -14,5 +14,6 @@ export const userStatusMap = {
export enum OAuthEnum {
github = 'github',
google = 'google',
wechat = 'wechat'
wechat = 'wechat',
microsoft = 'microsoft'
}


@@ -5,18 +5,7 @@ import { cloneDeep } from 'lodash';
import { WorkerNameEnum, runWorker } from '@fastgpt/service/worker/utils';
// Run in main thread
const staticPluginList = [
'getTime',
'fetchUrl',
'Doc2X',
'Doc2X/URLPDF2text',
'Doc2X/URLImg2text',
`Doc2X/FilePDF2text`,
`Doc2X/FileImg2text`,
'feishu',
'google',
'bing'
];
const staticPluginList = ['getTime', 'fetchUrl', 'feishu', 'google', 'bing'];
// Run in worker thread (Have npm packages)
const packagePluginList = [
'mathExprVal',
@@ -28,7 +17,9 @@ const packagePluginList = [
'drawing',
'drawing/baseChart',
'wiki',
'databaseConnection'
'databaseConnection',
'Doc2X',
'Doc2X/PDF2text'
];
export const list = [...staticPluginList, ...packagePluginList];
@@ -55,6 +46,8 @@ export const getCommunityPlugins = () => {
};
export const getSystemPluginTemplates = () => {
if (!global.systemPlugins) return [];
const oldPlugins = global.communityPlugins ?? [];
return [...oldPlugins, ...cloneDeep(global.systemPlugins)];
};
@@ -96,7 +89,3 @@ export const getCommunityCb = async () => {
{}
);
};
export const getSystemPluginCb = async () => {
return global.systemPluginCb;
};


@@ -1,172 +0,0 @@
import { delay } from '@fastgpt/global/common/system/utils';
import { addLog } from '@fastgpt/service/common/system/log';
type Props = {
apikey: string;
files: Array<string>;
img_correction: boolean;
formula: boolean;
};
type Response = Promise<{
result: string;
failreason: string;
success: boolean;
}>;
const main = async ({ apikey, files, img_correction, formula }: Props): Response => {
// Check the apikey
if (!apikey) {
return {
result: '',
failreason: `API key is required`,
success: false
};
}
let real_api_key = apikey;
if (!apikey.startsWith('sk-')) {
const response = await fetch('https://api.doc2x.noedgeai.com/api/token/refresh', {
method: 'POST',
headers: {
Authorization: `Bearer ${apikey}`
}
});
if (response.status !== 200) {
return {
result: '',
failreason: `Get token failed: ${await response.text()}`,
success: false
};
}
const data = await response.json();
real_api_key = data.data.token;
}
let final_result = '';
let fail_reason = '';
let flag = false;
//Process each file one by one
for await (const url of files) {
// Fetch the image and check its content type
const imageResponse = await fetch(url);
if (!imageResponse.ok) {
fail_reason += `\n---\nFile:${url} \n<Content>\nFailed to fetch image from URL\n</Content>\n`;
flag = true;
continue;
}
const contentType = imageResponse.headers.get('content-type');
const fileName = url.match(/read\?filename=([^&]+)/)?.[1] || 'unknown.png';
if (!contentType || !contentType.startsWith('image/')) {
fail_reason += `\n---\nFile:${url} \n<Content>\nThe provided URL does not point to an image: ${contentType}\n</Content>\n`;
flag = true;
continue;
}
const blob = await imageResponse.blob();
const formData = new FormData();
formData.append('file', blob, fileName);
formData.append('img_correction', img_correction ? '1' : '0');
formData.append('equation', formula ? '1' : '0');
let upload_url = 'https://api.doc2x.noedgeai.com/api/platform/async/img';
if (real_api_key.startsWith('sk-')) {
upload_url = 'https://api.doc2x.noedgeai.com/api/v1/async/img';
}
let uuid;
let upload_flag = true;
const uploadAttempts = [1, 2, 3];
for await (const attempt of uploadAttempts) {
const upload_response = await fetch(upload_url, {
method: 'POST',
headers: {
Authorization: `Bearer ${real_api_key}`
},
body: formData
});
if (!upload_response.ok) {
// Rate limit, wait for 10s and retry at most 3 times
if (upload_response.status === 429 && attempt < 3) {
await delay(10000);
continue;
}
fail_reason += `\n---\nFile:${fileName}\n<Content>\nFailed to upload file: ${await upload_response.text()}\n</Content>\n`;
flag = true;
upload_flag = false;
break;
}
if (!upload_flag) {
continue;
}
const upload_data = await upload_response.json();
uuid = upload_data.data.uuid;
break;
}
// Get the result by uuid
let result_url = 'https://api.doc2x.noedgeai.com/api/platform/async/status?uuid=' + uuid;
if (real_api_key.startsWith('sk-')) {
result_url = 'https://api.doc2x.noedgeai.com/api/v1/async/status?uuid=' + uuid;
}
let required_flag = true;
const maxAttempts = 100;
// Wait for the result, at most 100s
for await (const _ of Array(maxAttempts).keys()) {
const result_response = await fetch(result_url, {
headers: {
Authorization: `Bearer ${real_api_key}`
}
});
if (!result_response.ok) {
fail_reason += `\n---\nFile:${fileName}\n<Content>\nFailed to get result: ${await result_response.text()}\n</Content>\n`;
flag = true;
required_flag = false;
break;
}
const result_data = await result_response.json();
if (['ready', 'processing'].includes(result_data.data.status)) {
await delay(1000);
} else if (result_data.data.status === 'pages limit exceeded') {
fail_reason += `\n---\nFile:${fileName}\n<Content>\nFailed to get result: pages limit exceeded\n</Content>\n`;
flag = true;
required_flag = false;
break;
} else if (result_data.data.status === 'success') {
let result;
try {
result = result_data.data.result.pages[0].md;
result = result.replace(/\\[\(\)]/g, '$').replace(/\\[\[\]]/g, '$$');
} catch {
// No pages in the result: record an empty content block and stop polling
final_result += `\n---\nFile:${fileName}\n<Content>\n \n</Content>\n`;
required_flag = false;
break;
}
final_result += `\n---\nFile:${fileName}\n<Content>\n${result}\n</Content>\n`;
required_flag = false;
break;
} else {
fail_reason += `\n---\nFile:${fileName}\n<Content>\nFailed to get result: ${result_data.data.status}\n</Content>\n`;
flag = true;
required_flag = false;
break;
}
}
if (required_flag) {
fail_reason += `\n---\nFile:${fileName}\n<Content>\nTimeout waiting for result\n</Content>\n`;
flag = true;
}
}
return {
result: final_result,
failreason: fail_reason,
success: !flag
};
};
export default main;
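The delimiter rewrite applied to each page's markdown above (`.replace(/\\[\(\)]/g, '$').replace(/\\[\[\]]/g, '$$')`) is easy to misread: in a JavaScript replacement string, `$$` is an escaped literal `$`, so both inline `\(...\)` and display `\[...\]` delimiters end up as a single `$`. A minimal sketch isolating that behavior (the helper name is ours, not from the plugin):

```typescript
// Normalize LaTeX delimiters the same way the plugin code does.
// Note: in String.prototype.replace, '$$' in the replacement string is an
// escaped literal '$', so '\[' / '\]' become '$' here, not '$$'.
const normalizeLatexDelimiters = (md: string): string =>
  md.replace(/\\[\(\)]/g, '$').replace(/\\[\[\]]/g, '$$');

console.log(normalizeLatexDelimiters('\\(x\\) and \\[y\\]')); // → "$x$ and $y$"
```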


@@ -1,500 +0,0 @@
{
"author": "Menghuan1918",
"version": "488",
"name": "Doc2X 图像(文件)识别",
"avatar": "plugins/doc2x",
"intro": "将上传的图片文件发送至Doc2X进行解析返回带LaTeX公式的markdown格式的文本",
"courseUrl": "https://fael3z0zfze.feishu.cn/wiki/Rkc5witXWiJoi5kORd2cofh6nDg?fromScene=spaceOverview",
"showStatus": true,
"weight": 10,
"isTool": true,
"templateType": "tools",
"workflow": {
"nodes": [
{
"nodeId": "pluginConfig",
"name": "common:core.module.template.system_config",
"intro": "",
"avatar": "core/workflow/template/systemConfig",
"flowNodeType": "pluginConfig",
"position": {
"x": -90.53591960393504,
"y": -17.580286776561252
},
"version": "4811",
"inputs": [],
"outputs": []
},
{
"nodeId": "pluginInput",
"name": "插件开始",
"intro": "可以配置插件需要哪些输入,利用这些输入来运行插件",
"avatar": "core/workflow/template/workflowStart",
"flowNodeType": "pluginInput",
"showStatus": false,
"position": {
"x": 368.6800424053505,
"y": -17.580286776561252
},
"version": "481",
"inputs": [
{
"renderTypeList": ["input"],
"selectedTypeIndex": 0,
"valueType": "string",
"canEdit": true,
"key": "apikey",
"label": "apikey",
"description": "Doc2X的验证密匙对于个人用户可以从Doc2X官网 - 个人信息 - 身份令牌获得",
"required": true,
"toolDescription": "",
"defaultValue": ""
},
{
"renderTypeList": ["reference"],
"selectedTypeIndex": 0,
"valueType": "arrayString",
"canEdit": true,
"key": "files",
"label": "files",
"description": "待处理图片文件",
"required": true,
"toolDescription": "待处理图片文件"
},
{
"renderTypeList": ["switch"],
"selectedTypeIndex": 0,
"valueType": "boolean",
"canEdit": true,
"key": "img_correction",
"label": "img_correction",
"description": "是否启用图形矫正功能",
"required": true,
"toolDescription": "",
"defaultValue": false
},
{
"renderTypeList": ["switch"],
"selectedTypeIndex": 0,
"valueType": "boolean",
"canEdit": true,
"key": "formula",
"label": "formula",
"description": "是否开启纯公式识别(仅适用于图片内容仅有公式时)",
"required": true,
"toolDescription": "",
"defaultValue": false
}
],
"outputs": [
{
"id": "apikey",
"valueType": "string",
"key": "apikey",
"label": "apikey",
"type": "hidden"
},
{
"id": "url",
"valueType": "arrayString",
"key": "files",
"label": "files",
"type": "hidden"
},
{
"id": "img_correction",
"valueType": "boolean",
"key": "img_correction",
"label": "img_correction",
"type": "hidden"
},
{
"id": "formula",
"valueType": "boolean",
"key": "formula",
"label": "formula",
"type": "hidden"
}
]
},
{
"nodeId": "pluginOutput",
"name": "插件输出",
"intro": "自定义配置外部输出,使用插件时,仅暴露自定义配置的输出",
"avatar": "core/workflow/template/pluginOutput",
"flowNodeType": "pluginOutput",
"showStatus": false,
"position": {
"x": 1796.2235867744578,
"y": 6.419713223438748
},
"version": "481",
"inputs": [
{
"renderTypeList": ["reference"],
"valueType": "string",
"canEdit": true,
"key": "result",
"label": "result",
"description": "处理结果(或者是报错信息)",
"value": ["zHG5jJBkXmjB", "xWQuEf50F3mr"]
},
{
"renderTypeList": ["reference"],
"valueType": "string",
"canEdit": true,
"key": "failreason",
"label": "failreason",
"description": "文件处理失败原因,由文件名以及报错组成,多个文件之间由横线分隔开",
"value": ["zHG5jJBkXmjB", "jbv4nVZvmFXm"]
},
{
"renderTypeList": ["reference"],
"valueType": "boolean",
"canEdit": true,
"key": "success",
"label": "success",
"description": "是否全部文件都处理成功如有没有处理成功的文件失败原因将会输出在failreason中",
"value": ["zHG5jJBkXmjB", "k46cjNulVk5Y"]
}
],
"outputs": []
},
{
"nodeId": "zHG5jJBkXmjB",
"name": "HTTP 请求",
"intro": "可以发出一个 HTTP 请求,实现更为复杂的操作(联网搜索、数据库查询等)",
"avatar": "core/workflow/template/httpRequest",
"flowNodeType": "httpRequest468",
"showStatus": true,
"position": {
"x": 1081.967607938733,
"y": -426.08028677656125
},
"version": "481",
"inputs": [
{
"key": "system_addInputParam",
"renderTypeList": ["addInputParam"],
"valueType": "dynamic",
"label": "",
"required": false,
"description": "common:core.module.input.description.HTTP Dynamic Input",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpMethod",
"renderTypeList": ["custom"],
"valueType": "string",
"label": "",
"value": "POST",
"required": true,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpTimeout",
"renderTypeList": ["custom"],
"valueType": "number",
"label": "",
"value": 30,
"min": 5,
"max": 600,
"required": true,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpReqUrl",
"renderTypeList": ["hidden"],
"valueType": "string",
"label": "",
"description": "common:core.module.input.description.Http Request Url",
"placeholder": "https://api.ai.com/getInventory",
"required": false,
"value": "Doc2X/FileImg2text",
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpHeader",
"renderTypeList": ["custom"],
"valueType": "any",
"value": [],
"label": "",
"description": "common:core.module.input.description.Http Request Header",
"placeholder": "common:core.module.input.description.Http Request Header",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpParams",
"renderTypeList": ["hidden"],
"valueType": "any",
"value": [],
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpJsonBody",
"renderTypeList": ["hidden"],
"valueType": "any",
"value": "{\n \"apikey\": \"{{apikey}}\",\n \"files\": {{files}},\n \"img_correction\": {{img_correction}},\n \"formula\": {{formula}}\n}",
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpFormBody",
"renderTypeList": ["hidden"],
"valueType": "any",
"value": [],
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpContentType",
"renderTypeList": ["hidden"],
"valueType": "string",
"value": "json",
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"renderTypeList": ["reference"],
"valueType": "string",
"canEdit": true,
"key": "apikey",
"label": "apikey",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "apikey"]
},
{
"renderTypeList": ["reference"],
"valueType": "arrayString",
"canEdit": true,
"key": "files",
"label": "files",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "url"]
},
{
"renderTypeList": ["reference"],
"valueType": "boolean",
"canEdit": true,
"key": "img_correction",
"label": "img_correction",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "img_correction"]
},
{
"renderTypeList": ["reference"],
"valueType": "boolean",
"canEdit": true,
"key": "formula",
"label": "formula",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "formula"]
}
],
"outputs": [
{
"id": "error",
"key": "error",
"label": "workflow:request_error",
"description": "HTTP请求错误信息成功时返回空",
"valueType": "object",
"type": "static"
},
{
"id": "httpRawResponse",
"key": "httpRawResponse",
"required": true,
"label": "workflow:raw_response",
"description": "HTTP请求的原始响应。只能接受字符串或JSON类型响应数据。",
"valueType": "any",
"type": "static"
},
{
"id": "system_addOutputParam",
"key": "system_addOutputParam",
"type": "dynamic",
"valueType": "dynamic",
"label": "",
"customFieldConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": false
}
},
{
"id": "xWQuEf50F3mr",
"valueType": "string",
"type": "dynamic",
"key": "result",
"label": "result"
},
{
"id": "jbv4nVZvmFXm",
"valueType": "string",
"type": "dynamic",
"key": "failreason",
"label": "failreason"
},
{
"id": "k46cjNulVk5Y",
"valueType": "boolean",
"type": "dynamic",
"key": "success",
"label": "success"
}
]
}
],
"edges": [
{
"source": "pluginInput",
"target": "zHG5jJBkXmjB",
"sourceHandle": "pluginInput-source-right",
"targetHandle": "zHG5jJBkXmjB-target-left"
},
{
"source": "zHG5jJBkXmjB",
"target": "pluginOutput",
"sourceHandle": "zHG5jJBkXmjB-source-right",
"targetHandle": "pluginOutput-target-left"
}
]
}
}
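The `system_httpJsonBody` value above is a `{{key}}` template that the workflow engine fills in with node inputs before sending the request. As a rough sketch of that substitution (a hypothetical helper, not FastGPT's actual renderer, which also handles types and escaping):

```typescript
// Fill "{{key}}" placeholders in a template body with input values.
// Hypothetical sketch only; a function replacer avoids '$' pitfalls in
// String.prototype.replace replacement strings.
const renderTemplate = (tpl: string, vars: Record<string, string>): string =>
  tpl.replace(/\{\{(\w+)\}\}/g, (_match, key: string) => vars[key] ?? '');

renderTemplate('{"apikey": "{{apikey}}", "files": {{files}}}', {
  apikey: 'abc',
  files: '["https://example.com/a.png"]'
});
```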


@@ -1,165 +0,0 @@
import { delay } from '@fastgpt/global/common/system/utils';
import { addLog } from '@fastgpt/service/common/system/log';
type Props = {
apikey: string;
files: Array<string>;
ocr: boolean;
};
// Response type same as HTTP outputs
type Response = Promise<{
result: string;
failreason: string;
success: boolean;
}>;
const main = async ({ apikey, files, ocr }: Props): Response => {
// Check the apikey
if (!apikey) {
return {
result: '',
failreason: `API key is required`,
success: false
};
}
let real_api_key = apikey;
if (!apikey.startsWith('sk-')) {
const response = await fetch('https://api.doc2x.noedgeai.com/api/token/refresh', {
method: 'POST',
headers: {
Authorization: `Bearer ${apikey}`
}
});
if (response.status !== 200) {
return {
result: '',
failreason: `Get token failed: ${await response.text()}`,
success: false
};
}
const data = await response.json();
real_api_key = data.data.token;
}
let final_result = '';
let fail_reason = '';
let flag = false;
// Process each file one by one
for await (const url of files) {
// Fetch the PDF and check its content type
const PDFResponse = await fetch(url);
if (!PDFResponse.ok) {
fail_reason += `\n---\nFile:${url} \n<Content>\nFailed to fetch PDF from URL\n</Content>\n`;
flag = true;
continue;
}
const contentType = PDFResponse.headers.get('content-type');
const file_name = url.match(/read\?filename=([^&]+)/)?.[1] || 'unknown.pdf';
if (!contentType || !contentType.startsWith('application/pdf')) {
fail_reason += `\n---\nFile:${file_name}\n<Content>\nThe provided file does not point to a PDF: ${contentType}\n</Content>\n`;
flag = true;
continue;
}
const blob = await PDFResponse.blob();
const formData = new FormData();
formData.append('file', blob, file_name);
formData.append('ocr', ocr ? '1' : '0');
let upload_url = 'https://api.doc2x.noedgeai.com/api/platform/async/pdf';
if (real_api_key.startsWith('sk-')) {
upload_url = 'https://api.doc2x.noedgeai.com/api/v1/async/pdf';
}
let uuid;
let upload_flag = true;
const uploadAttempts = [1, 2, 3];
for await (const attempt of uploadAttempts) {
const upload_response = await fetch(upload_url, {
method: 'POST',
headers: {
Authorization: `Bearer ${real_api_key}`
},
body: formData
});
if (!upload_response.ok) {
// Rate limit, wait for 10s and retry at most 3 times
if (upload_response.status === 429 && attempt < 3) {
await delay(10000);
continue;
}
fail_reason += `\n---\nFile:${file_name}\n<Content>\nFailed to upload file: ${await upload_response.text()}\n</Content>\n`;
flag = true;
upload_flag = false;
break;
}
if (!upload_flag) {
continue;
}
const upload_data = await upload_response.json();
uuid = upload_data.data.uuid;
break;
}
// Get the result by uuid
let result_url = 'https://api.doc2x.noedgeai.com/api/platform/async/status?uuid=' + uuid;
if (real_api_key.startsWith('sk-')) {
result_url = 'https://api.doc2x.noedgeai.com/api/v1/async/status?uuid=' + uuid;
}
let required_flag = true;
let result = '';
// Wait for the result, at most 100s
const maxAttempts = 100;
for await (const _ of Array(maxAttempts).keys()) {
const result_response = await fetch(result_url, {
headers: {
Authorization: `Bearer ${real_api_key}`
}
});
if (!result_response.ok) {
fail_reason += `\n---\nFile:${file_name}\n<Content>\nFailed to get result: ${await result_response.text()}\n</Content>\n`;
flag = true;
required_flag = false;
break;
}
const result_data = await result_response.json();
if (['ready', 'processing'].includes(result_data.data.status)) {
await delay(1000);
} else if (result_data.data.status === 'pages limit exceeded') {
fail_reason += `\n---\nFile:${file_name}\n<Content>\nPages limit exceeded\n</Content>\n`;
flag = true;
required_flag = false;
break;
} else if (result_data.data.status === 'success') {
result = result_data.data.result.pages.map((page: { md: string }) => page.md).join('\n');
result = result.replace(/\\[\(\)]/g, '$').replace(/\\[\[\]]/g, '$$');
final_result += `\n---\nFile:${file_name}\n<Content>\n${result}\n</Content>\n`;
required_flag = false;
break;
} else {
fail_reason += `\n---\nFile:${file_name}\n<Content>\nFailed to get result: ${result_data.data.status}\n</Content>\n`;
flag = true;
required_flag = false;
break;
}
}
if (required_flag) {
fail_reason += `\n---\nFile:${file_name}\n<Content>\nTimeout after 100s for uuid ${uuid}\n</Content>\n`;
flag = true;
}
}
return {
result: final_result,
failreason: fail_reason,
success: !flag
};
};
export default main;
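Both scripts above use the same upload retry shape: attempt at most three times, sleep 10s only when the server answers 429 (rate limit), and fail immediately on anything else. A stripped-down sketch of that pattern (helper names are illustrative, not part of the plugin API):

```typescript
// Retry an upload up to maxAttempts times; back off only on HTTP 429,
// and give up immediately on any other failure.
const delay = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

type UploadResult = { status: number; ok: boolean };

async function postWithRetry(
  doPost: () => Promise<UploadResult>,
  maxAttempts = 3,
  waitMs = 10000
): Promise<UploadResult> {
  let last: UploadResult = { status: 0, ok: false };
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    last = await doPost();
    if (last.ok || !(last.status === 429 && attempt < maxAttempts)) {
      return last; // success, or a non-retryable failure
    }
    await delay(waitMs); // rate limited: wait and retry
  }
  return last;
}
```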


@@ -0,0 +1,157 @@
import { delay } from '@fastgpt/global/common/system/utils';
import axios from 'axios';
import { getErrText } from '@fastgpt/global/common/error/utils';
type Props = {
apikey: string;
files: string[];
};
// Response type same as HTTP outputs
type Response = Promise<{
result: string;
success: boolean;
error?: Record<string, any>;
}>;
const main = async ({ apikey, files }: Props): Response => {
// Check the apikey
if (!apikey) {
return Promise.reject(`API key is required`);
}
const successResult = [];
const failedResult = [];
const axiosInstance = axios.create({
timeout: 30000 // 30 seconds timeout
});
// Process each file one by one
for await (const url of files) {
try {
// Fetch the PDF and check its content type
const PDFResponse = await axiosInstance.get(url, { responseType: 'arraybuffer' });
if (PDFResponse.status !== 200) {
throw new Error(
`File:${url} \n<Content>\nFailed to fetch PDF from URL: ${PDFResponse.statusText}\n</Content>`
);
}
const contentType = PDFResponse.headers['content-type'];
const file_name = url.match(/read\/([^?]+)/)?.[1] || 'unknown.pdf';
if (!contentType || !contentType.startsWith('application/pdf')) {
throw new Error(
`File:${file_name}\n<Content>\nThe provided file does not point to a PDF: ${contentType}\n</Content>`
);
}
const blob = new Blob([PDFResponse.data], { type: 'application/pdf' });
// Get pre-upload URL first
const preupload_response = await axiosInstance.post(
'https://v2.doc2x.noedgeai.com/api/v2/parse/preupload',
null,
{
headers: {
Authorization: `Bearer ${apikey}`
}
}
);
if (preupload_response.status !== 200) {
throw new Error(
`File:${file_name}\n<Content>\nFailed to get pre-upload URL: ${preupload_response.statusText}\n</Content>`
);
}
const preupload_data = preupload_response.data;
if (preupload_data.code !== 'success') {
throw new Error(
`File:${file_name}\n<Content>\nFailed to get pre-upload URL: ${JSON.stringify(preupload_data)}\n</Content>`
);
}
const upload_url = preupload_data.data.url;
const uid = preupload_data.data.uid;
// Upload file to pre-signed URL with binary stream
const response = await axiosInstance.put(upload_url, blob, {
headers: {
'Content-Type': 'application/pdf'
}
});
if (response.status !== 200) {
throw new Error(`Upload failed with status ${response.status}: ${response.statusText}`);
}
// Get the result by uid
// Wait for the result, at most 90s
const checkResult = async (retry = 30) => {
if (retry <= 0)
return Promise.reject(
`File:${file_name}\n<Content>\nFailed to get result (uid: ${uid}): Get result timeout\n</Content>`
);
try {
const result_response = await axiosInstance.get(
`https://v2.doc2x.noedgeai.com/api/v2/parse/status?uid=${uid}`,
{
headers: {
Authorization: `Bearer ${apikey}`
}
}
);
const result_data = result_response.data;
if (!['ok', 'success'].includes(result_data.code)) {
return Promise.reject(
`File:${file_name}\n<Content>\nFailed to get result (uid: ${uid}): ${JSON.stringify(result_data)}\n</Content>`
);
}
if (['ready', 'processing'].includes(result_data.data.status)) {
await delay(3000);
return checkResult(retry - 1);
}
if (result_data.data.status === 'success') {
const result = (
await Promise.all(
result_data.data.result.pages.map((page: { md: any }) => page.md)
).then((pages) => pages.join('\n'))
)
// Do some post-processing
.replace(/\\[\(\)]/g, '$')
.replace(/\\[\[\]]/g, '$$')
.replace(/<img\s+src="([^"]+)"(?:\s*\?[^>]*)?(?:\s*\/>|>)/g, '![img]($1)');
return `File:${file_name}\n<Content>\n${result}\n</Content>`;
}
await delay(100);
return checkResult(retry - 1);
} catch (error) {
await delay(100);
return checkResult(retry - 1);
}
};
const result = await checkResult();
successResult.push(result);
} catch (error) {
failedResult.push(
`File:${url} \n<Content>\nFailed to process PDF from URL: ${getErrText(error)}\n</Content>`
);
}
}
return {
result: successResult.join('\n******\n'),
error: {
message: failedResult.join('\n******\n')
},
success: failedResult.length === 0
};
};
export default main;
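The v2 script replaces the fixed 100-iteration polling loop with a recursive `checkResult` that decrements a retry budget on every attempt. The core shape, separated from the Doc2X specifics (names here are illustrative):

```typescript
// Poll an async status source until it reports completion, failing once
// the retry budget is exhausted. Each recursive call spends one retry.
const delay = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function pollUntilDone<T>(
  getStatus: () => Promise<{ done: boolean; value?: T }>,
  retry = 30,
  intervalMs = 3000
): Promise<T> {
  if (retry <= 0) throw new Error('Get result timeout');
  const s = await getStatus();
  if (s.done) return s.value as T;
  await delay(intervalMs); // still processing: wait, then poll again
  return pollUntilDone(getStatus, retry - 1, intervalMs);
}
```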


@@ -1,9 +1,9 @@
{
"author": "Menghuan1918",
"version": "488",
"name": "Doc2X PDF文件(文件)识别",
"name": "PDF识别",
"avatar": "plugins/doc2x",
"intro": "将上传的PDF文件发送至Doc2X进行解析返回LaTeX公式的markdown格式的文本",
"intro": "将PDF文件发送至Doc2X进行解析返回结构化的LaTeX公式的文本(markdown)支持传入String类型的URL或者流程输出中的文件链接变量",
"courseUrl": "https://fael3z0zfze.feishu.cn/wiki/Rkc5witXWiJoi5kORd2cofh6nDg?fromScene=spaceOverview",
"showStatus": true,
"weight": 10,
@@ -13,30 +13,16 @@
"workflow": {
"nodes": [
{
"nodeId": "pluginConfig",
"name": "common:core.module.template.system_config",
"intro": "",
"avatar": "core/workflow/template/systemConfig",
"flowNodeType": "pluginConfig",
"position": {
"x": -30.474351356537454,
"y": -101.45216221730038
},
"version": "4811",
"inputs": [],
"outputs": []
},
{
"nodeId": "pluginInput",
"name": "插件开始",
"name": "自定义插件输入",
"intro": "可以配置插件需要哪些输入,利用这些输入来运行插件",
"avatar": "core/workflow/template/workflowStart",
"flowNodeType": "pluginInput",
"showStatus": false,
"position": {
"x": 407.2817920483865,
"y": -101.45216221730038
"x": -137.96875104510553,
"y": -90.9968973555371
},
"version": "481",
"inputs": [
@@ -47,33 +33,25 @@
"canEdit": true,
"key": "apikey",
"label": "apikey",
"description": "Doc2X的验证密匙对于个人用户可以从Doc2X官网 - 个人信息 - 身份令牌获得",
"description": "Doc2X的API密匙可以从Doc2X开放平台获得",
"required": true,
"toolDescription": "",
"defaultValue": ""
"defaultValue": "",
"list": []
},
{
"renderTypeList": ["reference"],
"renderTypeList": ["fileSelect"],
"selectedTypeIndex": 0,
"valueType": "arrayString",
"canEdit": true,
"key": "files",
"label": "files",
"description": "处理的PDF文件",
"description": "需要处理的PDF地址",
"required": true,
"toolDescription": "待处理的PDF文件"
},
{
"renderTypeList": ["switch"],
"selectedTypeIndex": 0,
"valueType": "boolean",
"canEdit": true,
"key": "ocr",
"label": "ocr",
"description": "是否开启对PDF文件内图片的OCR识别建议开启",
"required": true,
"toolDescription": "",
"defaultValue": true
"list": [],
"canSelectFile": true,
"canSelectImg": false,
"maxFiles": 14,
"defaultValue": ""
}
],
"outputs": [
@@ -90,26 +68,19 @@
"key": "files",
"label": "files",
"type": "hidden"
},
{
"id": "formula",
"valueType": "boolean",
"key": "ocr",
"label": "ocr",
"type": "hidden"
}
]
},
{
"nodeId": "pluginOutput",
"name": "插件输出",
"name": "自定义插件输出",
"intro": "自定义配置外部输出,使用插件时,仅暴露自定义配置的输出",
"avatar": "core/workflow/template/pluginOutput",
"flowNodeType": "pluginOutput",
"showStatus": false,
"position": {
"x": 1842.070888321717,
"y": -101.45216221730038
"x": 1505.494975310334,
"y": -4.14668564643415
},
"version": "481",
"inputs": [
@@ -124,12 +95,13 @@
},
{
"renderTypeList": ["reference"],
"valueType": "string",
"valueType": "object",
"canEdit": true,
"key": "failreason",
"label": "failreason",
"description": "文件处理失败原因,由文件名以及报错组成,多个文件之间由横线分隔开",
"value": ["zHG5jJBkXmjB", "yDxzW5CFalGw"]
"key": "error",
"label": "error",
"description": "",
"value": ["zHG5jJBkXmjB", "httpRawResponse"],
"isToolOutput": true
},
{
"renderTypeList": ["reference"],
@@ -138,7 +110,8 @@
"key": "success",
"label": "success",
"description": "是否全部文件都处理成功如有没有处理成功的文件失败原因将会输出在failreason中",
"value": ["zHG5jJBkXmjB", "m6CJJj7GFud5"]
"value": ["zHG5jJBkXmjB", "m6CJJj7GFud5"],
"isToolOutput": false
}
],
"outputs": []
@@ -151,8 +124,8 @@
"flowNodeType": "httpRequest468",
"showStatus": true,
"position": {
"x": 1077.7986740892777,
"y": -496.9521622173004
"x": 619.0661933308237,
"y": -472.91377894611503
},
"version": "481",
"inputs": [
@@ -202,7 +175,7 @@
"renderTypeList": ["custom"],
"valueType": "number",
"label": "",
"value": 30,
"value": 300,
"min": 5,
"max": 600,
"required": true,
@@ -217,7 +190,7 @@
"description": "common:core.module.input.description.Http Request Url",
"placeholder": "https://api.ai.com/getInventory",
"required": false,
"value": "Doc2X/FilePDF2text",
"value": "Doc2X/PDF2text",
"debugLabel": "",
"toolDescription": ""
},
@@ -247,7 +220,7 @@
"key": "system_httpJsonBody",
"renderTypeList": ["hidden"],
"valueType": "any",
"value": "{\n \"apikey\": \"{{apikey}}\",\n \"files\": {{files}},\n \"ocr\": {{ocr}}\n}",
"value": "{\n \"apikey\": \"{{apikey}}\",\n \"files\": {{files}}\n}",
"label": "",
"required": false,
"debugLabel": "",
@@ -331,37 +304,7 @@
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "url"]
},
{
"renderTypeList": ["reference"],
"valueType": "boolean",
"canEdit": true,
"key": "ocr",
"label": "ocr",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "formula"]
"value": [["pluginInput", "url"]]
}
],
"outputs": [
@@ -422,30 +365,42 @@
"type": "dynamic",
"key": "success",
"label": "success"
},
{
"id": "yDxzW5CFalGw",
"valueType": "string",
"type": "dynamic",
"key": "failreason",
"label": "failreason"
}
]
}
],
"edges": [
{
"source": "pluginInput",
"target": "zHG5jJBkXmjB",
"sourceHandle": "pluginInput-source-right",
"targetHandle": "zHG5jJBkXmjB-target-left"
},
{
"source": "zHG5jJBkXmjB",
"target": "pluginOutput",
"sourceHandle": "zHG5jJBkXmjB-source-right",
"targetHandle": "pluginOutput-target-left"
},
{
"source": "pluginInput",
"target": "zHG5jJBkXmjB",
"sourceHandle": "pluginInput-source-right",
"targetHandle": "zHG5jJBkXmjB-target-left"
}
]
],
"chatConfig": {
"questionGuide": false,
"ttsConfig": {
"type": "web"
},
"whisperConfig": {
"open": false,
"autoSend": false,
"autoTTSResponse": false
},
"chatInputGuide": {
"open": false,
"textList": [],
"customUrl": ""
},
"instruction": "",
"variables": [],
"welcomeText": ""
}
}
}


@@ -1,166 +0,0 @@
import { delay } from '@fastgpt/global/common/system/utils';
import { addLog } from '@fastgpt/service/common/system/log';
type Props = {
apikey: string;
url: string;
img_correction: boolean;
formula: boolean;
};
type Response = Promise<{
result: string;
success: boolean;
}>;
const main = async ({ apikey, url, img_correction, formula }: Props): Response => {
// Check the apikey
if (!apikey) {
return {
result: `API key is required`,
success: false
};
}
let real_api_key = apikey;
if (!apikey.startsWith('sk-')) {
const response = await fetch('https://api.doc2x.noedgeai.com/api/token/refresh', {
method: 'POST',
headers: {
Authorization: `Bearer ${apikey}`
}
});
if (response.status !== 200) {
return {
result: `Get token failed: ${await response.text()}`,
success: false
};
}
const data = await response.json();
real_api_key = data.data.token;
}
let imageResponse;
// Fetch the image and check its content type
try {
imageResponse = await fetch(url);
} catch (e) {
return {
result: `Failed to fetch image from URL: ${url} with error: ${e}`,
success: false
};
}
if (!imageResponse.ok) {
return {
result: `Failed to fetch image from URL: ${url}`,
success: false
};
}
const contentType = imageResponse.headers.get('content-type');
if (!contentType || !contentType.startsWith('image/')) {
return {
result: `The provided URL does not point to an image: ${contentType}`,
success: false
};
}
const blob = await imageResponse.blob();
const formData = new FormData();
const fileName = url.split('/').pop()?.split('?')[0] || 'image';
formData.append('file', blob, fileName);
formData.append('img_correction', img_correction ? '1' : '0');
formData.append('equation', formula ? '1' : '0');
let upload_url = 'https://api.doc2x.noedgeai.com/api/platform/async/img';
if (real_api_key.startsWith('sk-')) {
upload_url = 'https://api.doc2x.noedgeai.com/api/v1/async/img';
}
let uuid;
const uploadAttempts = [1, 2, 3];
for await (const attempt of uploadAttempts) {
const upload_response = await fetch(upload_url, {
method: 'POST',
headers: {
Authorization: `Bearer ${real_api_key}`
},
body: formData
});
if (!upload_response.ok) {
// Rate limit, wait for 10s and retry at most 3 times
if (upload_response.status === 429 && attempt < 3) {
await delay(10000);
continue;
}
return {
result: `Failed to upload image: ${await upload_response.text()}`,
success: false
};
}
const upload_data = await upload_response.json();
uuid = upload_data.data.uuid;
break;
}
// Get the result by uuid
let result_url = 'https://api.doc2x.noedgeai.com/api/platform/async/status?uuid=' + uuid;
if (real_api_key.startsWith('sk-')) {
result_url = 'https://api.doc2x.noedgeai.com/api/v1/async/status?uuid=' + uuid;
}
const maxAttempts = 100;
// Wait for the result, at most 100s
for await (const _ of Array(maxAttempts).keys()) {
const result_response = await fetch(result_url, {
headers: {
Authorization: `Bearer ${real_api_key}`
}
});
if (!result_response.ok) {
return {
result: `Failed to get result: ${await result_response.text()}`,
success: false
};
}
const result_data = await result_response.json();
if (['ready', 'processing'].includes(result_data.data.status)) {
await delay(1000);
} else if (result_data.data.status === 'pages limit exceeded') {
return {
result: 'Doc2X Pages limit exceeded',
success: false
};
} else if (result_data.data.status === 'success') {
let result;
try {
result = result_data.data.result.pages[0].md;
result = result.replace(/\\[\(\)]/g, '$').replace(/\\[\[\]]/g, '$$');
} catch {
// no pages
return {
result: '',
success: true
};
}
return {
result: result,
success: true
};
} else {
return {
result: `Failed to get result: ${result_data.data.status}`,
success: false
};
}
}
return {
result: 'Timeout waiting for result',
success: false
};
};
export default main;


@@ -1,484 +0,0 @@
{
"author": "Menghuan1918",
"version": "488",
"name": "Doc2X 图像(URL)识别",
"avatar": "plugins/doc2x",
"intro": "从URL下载图片并发送至Doc2X进行解析返回带LaTeX公式的markdown格式的文本",
"courseUrl": "https://fael3z0zfze.feishu.cn/wiki/Rkc5witXWiJoi5kORd2cofh6nDg?fromScene=spaceOverview",
"showStatus": true,
"weight": 10,
"isTool": true,
"templateType": "tools",
"workflow": {
"nodes": [
{
"nodeId": "pluginInput",
"name": "插件开始",
"intro": "可以配置插件需要哪些输入,利用这些输入来运行插件",
"avatar": "core/workflow/template/workflowStart",
"flowNodeType": "pluginInput",
"showStatus": false,
"position": {
"x": 353.91678143999377,
"y": -75.09744210499466
},
"version": "481",
"inputs": [
{
"renderTypeList": ["input"],
"selectedTypeIndex": 0,
"valueType": "string",
"canEdit": true,
"key": "apikey",
"label": "apikey",
"description": "Doc2X的验证密匙对于个人用户可以从Doc2X官网 - 个人信息 - 身份令牌获得",
"required": true,
"toolDescription": "",
"defaultValue": ""
},
{
"renderTypeList": ["reference"],
"selectedTypeIndex": 0,
"valueType": "string",
"canEdit": true,
"key": "url",
"label": "url",
"description": "待处理图片的URL",
"required": true,
"toolDescription": "待处理图片的URL"
},
{
"renderTypeList": ["switch"],
"selectedTypeIndex": 0,
"valueType": "boolean",
"canEdit": true,
"key": "img_correction",
"label": "img_correction",
"description": "是否启用图形矫正功能",
"required": true,
"toolDescription": "",
"defaultValue": false
},
{
"renderTypeList": ["switch"],
"selectedTypeIndex": 0,
"valueType": "boolean",
"canEdit": true,
"key": "formula",
"label": "formula",
"description": "是否开启纯公式识别(仅适用于图片内容仅有公式时)",
"required": true,
"toolDescription": "",
"defaultValue": false
}
],
"outputs": [
{
"id": "apikey",
"valueType": "string",
"key": "apikey",
"label": "apikey",
"type": "hidden"
},
{
"id": "url",
"valueType": "string",
"key": "url",
"label": "url",
"type": "hidden"
},
{
"id": "img_correction",
"valueType": "boolean",
"key": "img_correction",
"label": "img_correction",
"type": "hidden"
},
{
"id": "formula",
"valueType": "boolean",
"key": "formula",
"label": "formula",
"type": "hidden"
}
]
},
{
"nodeId": "pluginOutput",
"name": "插件输出",
"intro": "自定义配置外部输出,使用插件时,仅暴露自定义配置的输出",
"avatar": "core/workflow/template/pluginOutput",
"flowNodeType": "pluginOutput",
"showStatus": false,
"position": {
"x": 1703.581616889916,
"y": -14.097442104994656
},
"version": "481",
"inputs": [
{
"renderTypeList": ["reference"],
"valueType": "string",
"canEdit": true,
"key": "result",
"label": "result",
"description": "处理结果(或者是报错信息)",
"value": ["zHG5jJBkXmjB", "xWQuEf50F3mr"]
},
{
"renderTypeList": ["reference"],
"valueType": "boolean",
"canEdit": true,
"key": "success",
"label": "success",
"description": "是否处理成功",
"value": ["zHG5jJBkXmjB", "m6CJJj7GFud5"]
}
],
"outputs": []
},
{
"nodeId": "zHG5jJBkXmjB",
"name": "HTTP 请求",
"intro": "可以发出一个 HTTP 请求,实现更为复杂的操作(联网搜索、数据库查询等)",
"avatar": "core/workflow/template/httpRequest",
"flowNodeType": "httpRequest468",
"showStatus": true,
"position": {
"x": 1000.6685388413375,
"y": -457.0974421049947
},
"version": "481",
"inputs": [
{
"key": "system_addInputParam",
"renderTypeList": ["addInputParam"],
"valueType": "dynamic",
"label": "",
"required": false,
"description": "common:core.module.input.description.HTTP Dynamic Input",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpMethod",
"renderTypeList": ["custom"],
"valueType": "string",
"label": "",
"value": "POST",
"required": true,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpTimeout",
"renderTypeList": ["custom"],
"valueType": "number",
"label": "",
"value": 30,
"min": 5,
"max": 600,
"required": true,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpReqUrl",
"renderTypeList": ["hidden"],
"valueType": "string",
"label": "",
"description": "common:core.module.input.description.Http Request Url",
"placeholder": "https://api.ai.com/getInventory",
"required": false,
"value": "Doc2X/URLImg2text",
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpHeader",
"renderTypeList": ["custom"],
"valueType": "any",
"value": [],
"label": "",
"description": "common:core.module.input.description.Http Request Header",
"placeholder": "common:core.module.input.description.Http Request Header",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpParams",
"renderTypeList": ["hidden"],
"valueType": "any",
"value": [],
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpJsonBody",
"renderTypeList": ["hidden"],
"valueType": "any",
"value": "{\n \"apikey\": \"{{apikey}}\",\n \"url\": \"{{url}}\",\n \"img_correction\": {{img_correction}},\n \"formula\": {{formula}}\n}",
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpFormBody",
"renderTypeList": ["hidden"],
"valueType": "any",
"value": [],
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpContentType",
"renderTypeList": ["hidden"],
"valueType": "string",
"value": "json",
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"renderTypeList": ["reference"],
"valueType": "string",
"canEdit": true,
"key": "apikey",
"label": "apikey",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "apikey"]
},
{
"renderTypeList": ["reference"],
"valueType": "string",
"canEdit": true,
"key": "url",
"label": "url",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "url"]
},
{
"renderTypeList": ["reference"],
"valueType": "boolean",
"canEdit": true,
"key": "img_correction",
"label": "img_correction",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "img_correction"]
},
{
"renderTypeList": ["reference"],
"valueType": "boolean",
"canEdit": true,
"key": "formula",
"label": "formula",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "formula"]
}
],
"outputs": [
{
"id": "error",
"key": "error",
"label": "workflow:request_error",
"description": "HTTP请求错误信息成功时返回空",
"valueType": "object",
"type": "static"
},
{
"id": "httpRawResponse",
"key": "httpRawResponse",
"required": true,
"label": "workflow:raw_response",
"description": "HTTP请求的原始响应。只能接受字符串或JSON类型响应数据。",
"valueType": "any",
"type": "static"
},
{
"id": "system_addOutputParam",
"key": "system_addOutputParam",
"type": "dynamic",
"valueType": "dynamic",
"label": "",
"customFieldConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": false
}
},
{
"id": "xWQuEf50F3mr",
"valueType": "string",
"type": "dynamic",
"key": "result",
"label": "result"
},
{
"id": "m6CJJj7GFud5",
"valueType": "boolean",
"type": "dynamic",
"key": "success",
"label": "success"
}
]
},
{
"nodeId": "sWEDDSeuI9ar",
"name": "系统配置",
"intro": "",
"avatar": "core/workflow/template/systemConfig",
"flowNodeType": "pluginConfig",
"position": {
"x": -117.03701176267538,
"y": -75.09744210499466
},
"version": "4811",
"inputs": [],
"outputs": []
}
],
"edges": [
{
"source": "pluginInput",
"target": "zHG5jJBkXmjB",
"sourceHandle": "pluginInput-source-right",
"targetHandle": "zHG5jJBkXmjB-target-left"
},
{
"source": "zHG5jJBkXmjB",
"target": "pluginOutput",
"sourceHandle": "zHG5jJBkXmjB-source-right",
"targetHandle": "pluginOutput-target-left"
}
]
}
}
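The `system_httpJsonBody` value above builds the request body from `{{variable}}` placeholders. As a minimal sketch of how such a template could be interpolated (the helper name and exact semantics are assumptions, not FastGPT's actual implementation):

```typescript
// Hypothetical sketch: substitute {{key}} placeholders in an HTTP JSON body
// template. Booleans are emitted unquoted, so the result stays valid JSON.
function interpolate(template: string, vars: Record<string, string | boolean>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (_, key: string) => String(vars[key] ?? ''));
}

const template =
  '{\n  "apikey": "{{apikey}}",\n  "url": "{{url}}",\n  "img_correction": {{img_correction}},\n  "formula": {{formula}}\n}';

const body = JSON.parse(
  interpolate(template, {
    apikey: 'sk-demo',
    url: 'https://example.com/page.png',
    img_correction: true,
    formula: false
  })
);
```

Because the boolean placeholders are unquoted in the template, they parse back as real JSON booleans rather than strings.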

View File

@@ -1,156 +0,0 @@
import { delay } from '@fastgpt/global/common/system/utils';
import { addLog } from '@fastgpt/service/common/system/log';
type Props = {
apikey: string;
url: string;
ocr: boolean;
};
// Response type same as HTTP outputs
type Response = Promise<{
result: string;
success: boolean;
}>;
const main = async ({ apikey, url, ocr }: Props): Response => {
// Check the apikey
if (!apikey) {
return {
result: `API key is required`,
success: false
};
}
let real_api_key = apikey;
if (!apikey.startsWith('sk-')) {
const response = await fetch('https://api.doc2x.noedgeai.com/api/token/refresh', {
method: 'POST',
headers: {
Authorization: `Bearer ${apikey}`
}
});
if (response.status !== 200) {
return {
result: `Get token failed: ${await response.text()}`,
success: false
};
}
const data = await response.json();
real_api_key = data.data.token;
}
// Fetch the PDF and check its content type
let PDFResponse;
try {
PDFResponse = await fetch(url);
} catch (e) {
return {
result: `Failed to fetch PDF from URL: ${url} with error: ${e}`,
success: false
};
}
if (!PDFResponse.ok) {
return {
result: `Failed to fetch PDF from URL: ${url}`,
success: false
};
}
const contentType = PDFResponse.headers.get('content-type');
if (!contentType || !contentType.startsWith('application/pdf')) {
return {
result: `The provided URL does not point to a PDF: ${contentType}`,
success: false
};
}
const blob = await PDFResponse.blob();
const formData = new FormData();
const fileName = url.split('/').pop()?.split('?')[0] || 'pdf';
formData.append('file', blob, fileName);
formData.append('ocr', ocr ? '1' : '0');
let upload_url = 'https://api.doc2x.noedgeai.com/api/platform/async/pdf';
if (real_api_key.startsWith('sk-')) {
upload_url = 'https://api.doc2x.noedgeai.com/api/v1/async/pdf';
}
let uuid;
const uploadAttempts = [1, 2, 3];
for await (const attempt of uploadAttempts) {
const upload_response = await fetch(upload_url, {
method: 'POST',
headers: {
Authorization: `Bearer ${real_api_key}`
},
body: formData
});
if (!upload_response.ok) {
if (upload_response.status === 429 && attempt < 3) {
await delay(10000);
continue;
}
return {
result: `Failed to upload file: ${await upload_response.text()}`,
success: false
};
}
const upload_data = await upload_response.json();
uuid = upload_data.data.uuid;
break;
}
// Get the result by uuid
let result_url = 'https://api.doc2x.noedgeai.com/api/platform/async/status?uuid=' + uuid;
if (real_api_key.startsWith('sk-')) {
result_url = 'https://api.doc2x.noedgeai.com/api/v1/async/status?uuid=' + uuid;
}
let result = '';
// Wait for the result, at most 100s
const maxAttempts = 100;
for await (const _ of Array(maxAttempts).keys()) {
const result_response = await fetch(result_url, {
headers: {
Authorization: `Bearer ${real_api_key}`
}
});
if (!result_response.ok) {
return {
result: `Failed to get result: ${await result_response.text()}`,
success: false
};
}
const result_data = await result_response.json();
if (['ready', 'processing'].includes(result_data.data.status)) {
await delay(1000);
} else if (result_data.data.status === 'pages limit exceeded') {
return {
result: 'Doc2X Pages limit exceeded',
success: false
};
} else if (result_data.data.status === 'success') {
result = await Promise.all(
result_data.data.result.pages.map((page: { md: any }) => page.md)
).then((pages) => pages.join('\n'));
// Convert \( \) to $ and \[ \] to $$ ('$$$$' is needed to emit a literal '$$' in a replacement string)
result = result.replace(/\\[\(\)]/g, '$').replace(/\\[\[\]]/g, '$$$$');
return {
result: result,
success: true
};
} else {
return {
result: `Failed to get result: ${JSON.stringify(result_data)}`,
success: false
};
}
}
return {
result: 'Timeout waiting for result',
success: false
};
};
export default main;
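The delimiter normalisation near the end of `main` can be isolated as a small helper. Note that `$` is special in a `String.replace` replacement string, so producing a literal `$$` requires writing `'$$$$'` (a sketch, not the shipped code):

```typescript
// Sketch: convert LaTeX \( \) delimiters to $ and \[ \] to $$.
// '$$$$' in the replacement string is required to produce a literal '$$'.
const normalizeMath = (md: string): string =>
  md.replace(/\\[()]/g, '$').replace(/\\[\[\]]/g, '$$$$');
```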

View File

@@ -1,435 +0,0 @@
{
"author": "Menghuan1918",
"version": "488",
"name": "Doc2X PDF文件(URL)识别",
"avatar": "plugins/doc2x",
"intro": "从URL下载PDF文件并发送至Doc2X进行解析返回带LaTeX公式的markdown格式的文本",
"courseUrl": "https://fael3z0zfze.feishu.cn/wiki/Rkc5witXWiJoi5kORd2cofh6nDg?fromScene=spaceOverview",
"showStatus": true,
"weight": 10,
"isTool": true,
"templateType": "tools",
"workflow": {
"nodes": [
{
"nodeId": "pluginInput",
"name": "插件开始",
"intro": "可以配置插件需要哪些输入,利用这些输入来运行插件",
"avatar": "core/workflow/template/workflowStart",
"flowNodeType": "pluginInput",
"showStatus": false,
"position": {
"x": 388.243055058894,
"y": -75.09744210499466
},
"version": "481",
"inputs": [
{
"renderTypeList": ["input"],
"selectedTypeIndex": 0,
"valueType": "string",
"canEdit": true,
"key": "apikey",
"label": "apikey",
"description": "Doc2X的验证密匙对于个人用户可以从Doc2X官网 - 个人信息 - 身份令牌获得",
"required": true,
"toolDescription": "",
"defaultValue": ""
},
{
"renderTypeList": ["reference"],
"selectedTypeIndex": 0,
"valueType": "string",
"canEdit": true,
"key": "url",
"label": "url",
"description": "待处理PDF文件的URL",
"required": true,
"toolDescription": "待处理PDF文件的URL"
},
{
"renderTypeList": ["switch"],
"selectedTypeIndex": 0,
"valueType": "boolean",
"canEdit": true,
"key": "ocr",
"label": "ocr",
"description": "是否开启对PDF文件内图片的OCR识别建议开启",
"required": true,
"toolDescription": "",
"defaultValue": true
}
],
"outputs": [
{
"id": "apikey",
"valueType": "string",
"key": "apikey",
"label": "apikey",
"type": "hidden"
},
{
"id": "url",
"valueType": "string",
"key": "url",
"label": "url",
"type": "hidden"
},
{
"id": "formula",
"valueType": "boolean",
"key": "ocr",
"label": "ocr",
"type": "hidden"
}
]
},
{
"nodeId": "pluginOutput",
"name": "插件输出",
"intro": "自定义配置外部输出,使用插件时,仅暴露自定义配置的输出",
"avatar": "core/workflow/template/pluginOutput",
"flowNodeType": "pluginOutput",
"showStatus": false,
"position": {
"x": 1665.6420513111314,
"y": -40.597442104994656
},
"version": "481",
"inputs": [
{
"renderTypeList": ["reference"],
"valueType": "string",
"canEdit": true,
"key": "result",
"label": "result",
"description": "处理结果(或者是报错信息)",
"value": ["zHG5jJBkXmjB", "xWQuEf50F3mr"]
},
{
"renderTypeList": ["reference"],
"valueType": "boolean",
"canEdit": true,
"key": "success",
"label": "success",
"description": "是否处理成功",
"value": ["zHG5jJBkXmjB", "m6CJJj7GFud5"]
}
],
"outputs": []
},
{
"nodeId": "zHG5jJBkXmjB",
"name": "HTTP 请求",
"intro": "可以发出一个 HTTP 请求,实现更为复杂的操作(联网搜索、数据库查询等)",
"avatar": "core/workflow/template/httpRequest",
"flowNodeType": "httpRequest468",
"showStatus": true,
"position": {
"x": 966.3422652224374,
"y": -446.5974421049947
},
"version": "481",
"inputs": [
{
"key": "system_addInputParam",
"renderTypeList": ["addInputParam"],
"valueType": "dynamic",
"label": "",
"required": false,
"description": "common:core.module.input.description.HTTP Dynamic Input",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpMethod",
"renderTypeList": ["custom"],
"valueType": "string",
"label": "",
"value": "POST",
"required": true,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpTimeout",
"renderTypeList": ["custom"],
"valueType": "number",
"label": "",
"value": 30,
"min": 5,
"max": 600,
"required": true,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpReqUrl",
"renderTypeList": ["hidden"],
"valueType": "string",
"label": "",
"description": "common:core.module.input.description.Http Request Url",
"placeholder": "https://api.ai.com/getInventory",
"required": false,
"value": "Doc2X/URLPDF2text",
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpHeader",
"renderTypeList": ["custom"],
"valueType": "any",
"value": [],
"label": "",
"description": "common:core.module.input.description.Http Request Header",
"placeholder": "common:core.module.input.description.Http Request Header",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpParams",
"renderTypeList": ["hidden"],
"valueType": "any",
"value": [],
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpJsonBody",
"renderTypeList": ["hidden"],
"valueType": "any",
"value": "{\n \"apikey\": \"{{apikey}}\",\n \"url\": \"{{url}}\",\n \"ocr\": {{ocr}}\n}",
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpFormBody",
"renderTypeList": ["hidden"],
"valueType": "any",
"value": [],
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"key": "system_httpContentType",
"renderTypeList": ["hidden"],
"valueType": "string",
"value": "json",
"label": "",
"required": false,
"debugLabel": "",
"toolDescription": ""
},
{
"renderTypeList": ["reference"],
"valueType": "string",
"canEdit": true,
"key": "apikey",
"label": "apikey",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "apikey"]
},
{
"renderTypeList": ["reference"],
"valueType": "string",
"canEdit": true,
"key": "url",
"label": "url",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "url"]
},
{
"renderTypeList": ["reference"],
"valueType": "boolean",
"canEdit": true,
"key": "ocr",
"label": "ocr",
"customInputConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"arrayAny",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": true
},
"required": true,
"value": ["pluginInput", "formula"]
}
],
"outputs": [
{
"id": "error",
"key": "error",
"label": "workflow:request_error",
"description": "HTTP请求错误信息成功时返回空",
"valueType": "object",
"type": "static"
},
{
"id": "httpRawResponse",
"key": "httpRawResponse",
"required": true,
"label": "workflow:raw_response",
"description": "HTTP请求的原始响应。只能接受字符串或JSON类型响应数据。",
"valueType": "any",
"type": "static"
},
{
"id": "system_addOutputParam",
"key": "system_addOutputParam",
"type": "dynamic",
"valueType": "dynamic",
"label": "",
"customFieldConfig": {
"selectValueTypeList": [
"string",
"number",
"boolean",
"object",
"arrayString",
"arrayNumber",
"arrayBoolean",
"arrayObject",
"any",
"chatHistory",
"datasetQuote",
"dynamic",
"selectApp",
"selectDataset"
],
"showDescription": false,
"showDefaultValue": false
}
},
{
"id": "xWQuEf50F3mr",
"valueType": "string",
"type": "dynamic",
"key": "result",
"label": "result"
},
{
"id": "m6CJJj7GFud5",
"valueType": "boolean",
"type": "dynamic",
"key": "success",
"label": "success"
}
]
},
{
"nodeId": "rZmLfANEyyJe",
"name": "系统配置",
"intro": "",
"avatar": "core/workflow/template/systemConfig",
"flowNodeType": "pluginConfig",
"position": {
"x": -93.55061402342784,
"y": -55.907069101622824
},
"version": "4811",
"inputs": [],
"outputs": []
}
],
"edges": [
{
"source": "pluginInput",
"target": "zHG5jJBkXmjB",
"sourceHandle": "pluginInput-source-right",
"targetHandle": "zHG5jJBkXmjB-target-left"
},
{
"source": "zHG5jJBkXmjB",
"target": "pluginOutput",
"sourceHandle": "zHG5jJBkXmjB-source-right",
"targetHandle": "pluginOutput-target-left"
}
]
}
}

View File

@@ -36,6 +36,7 @@ export async function uploadFile({
path,
filename,
contentType,
encoding,
metadata = {}
}: {
bucketName: `${BucketNameEnum}`;
@@ -44,6 +45,7 @@ export async function uploadFile({
path: string;
filename: string;
contentType?: string;
encoding: string;
metadata?: Record<string, any>;
}) {
if (!path) return Promise.reject(`filePath is empty`);
@@ -52,7 +54,7 @@ export async function uploadFile({
const stats = await fsp.stat(path);
if (!stats.isFile()) return Promise.reject(`${path} is not a file`);
const { stream: readStream, encoding } = await stream2Encoding(fs.createReadStream(path));
const readStream = fs.createReadStream(path);
// Add default metadata
metadata.teamId = teamId;

View File

@@ -4,16 +4,17 @@ import FormData from 'form-data';
import { WorkerNameEnum, runWorker } from '../../../worker/utils';
import fs from 'fs';
import { detectFileEncoding } from '@fastgpt/global/common/file/tools';
import type { ReadFileResponse } from '../../../worker/readFile/type';
import axios from 'axios';
import { addLog } from '../../system/log';
import { batchRun } from '@fastgpt/global/common/fn/utils';
import { addHours } from 'date-fns';
import { matchMdImgTextAndUpload } from '@fastgpt/global/common/string/markdown';
export type readRawTextByLocalFileParams = {
teamId: string;
path: string;
encoding: string;
metadata?: Record<string, any>;
};
export const readRawTextByLocalFile = async (params: readRawTextByLocalFileParams) => {
@@ -22,13 +23,12 @@ export const readRawTextByLocalFile = async (params: readRawTextByLocalFileParam
const extension = path?.split('.')?.pop()?.toLowerCase() || '';
const buffer = fs.readFileSync(path);
const encoding = detectFileEncoding(buffer);
const { rawText } = await readRawContentByFileBuffer({
extension,
isQAImport: false,
teamId: params.teamId,
encoding,
encoding: params.encoding,
buffer,
metadata: params.metadata
});
@@ -53,6 +53,7 @@ export const readRawContentByFileBuffer = async ({
encoding: string;
metadata?: Record<string, any>;
}) => {
// Custom read file service
const customReadfileUrl = process.env.CUSTOM_READ_FILE_URL;
const customReadFileExtension = process.env.CUSTOM_READ_FILE_EXTENSION || '';
const ocrParse = process.env.CUSTOM_READ_FILE_OCR || 'false';
@@ -78,6 +79,7 @@ export const readRawContentByFileBuffer = async ({
data: {
page: number;
markdown: string;
duration: number;
};
}>(customReadfileUrl, data, {
timeout: 600000,
@@ -89,10 +91,12 @@ export const readRawContentByFileBuffer = async ({
addLog.info(`Use custom read file service, time: ${Date.now() - start}ms`);
const rawText = response.data.markdown;
const { text, imageList } = matchMdImgTextAndUpload(rawText);
return {
rawText,
formatText: rawText
rawText: text,
formatText: rawText,
imageList
};
};
@@ -119,6 +123,9 @@ export const readRawContentByFileBuffer = async ({
}
});
rawText = rawText.replace(item.uuid, src);
if (formatText) {
formatText = formatText.replace(item.uuid, src);
}
});
}
@@ -127,7 +134,7 @@ export const readRawContentByFileBuffer = async ({
if (isQAImport) {
rawText = rawText || '';
} else {
rawText = formatText || '';
rawText = formatText || rawText;
}
}
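The diff above removes the per-call `detectFileEncoding` in favour of an encoding supplied by the caller. For illustration only, a minimal BOM-based detector might look like the following (the real `detectFileEncoding` in `@fastgpt/global` is assumed to be more thorough; this is not its implementation):

```typescript
// Illustrative sketch of BOM-based encoding detection; not FastGPT's actual
// detectFileEncoding implementation.
function guessEncoding(buf: Buffer): string {
  if (buf.length >= 3 && buf[0] === 0xef && buf[1] === 0xbb && buf[2] === 0xbf) return 'utf-8';
  if (buf.length >= 2 && buf[0] === 0xff && buf[1] === 0xfe) return 'utf-16le';
  if (buf.length >= 2 && buf[0] === 0xfe && buf[1] === 0xff) return 'utf-16be';
  return 'utf-8'; // no BOM: default to UTF-8
}
```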

View File

@@ -1,5 +1,11 @@
import type { UserModelSchema } from '@fastgpt/global/support/user/type';
import OpenAI from '@fastgpt/global/core/ai';
import {
ChatCompletionCreateParamsNonStreaming,
ChatCompletionCreateParamsStreaming
} from '@fastgpt/global/core/ai/type';
import { getErrText } from '@fastgpt/global/common/error/utils';
import { addLog } from '../../common/system/log';
export const openaiBaseUrl = process.env.OPENAI_BASE_URL || 'https://api.openai.com/v1';
@@ -34,3 +40,55 @@ export const getAxiosConfig = (props?: { userKey?: UserModelSchema['openaiAccoun
authorization: `Bearer ${apiKey}`
};
};
type CompletionsBodyType =
| ChatCompletionCreateParamsNonStreaming
| ChatCompletionCreateParamsStreaming;
type InferResponseType<T extends CompletionsBodyType> =
T extends ChatCompletionCreateParamsStreaming
? OpenAI.Chat.Completions.ChatCompletionChunk
: OpenAI.Chat.Completions.ChatCompletion;
export const createChatCompletion = async <T extends CompletionsBodyType>({
body,
userKey,
timeout,
options
}: {
body: T;
userKey?: UserModelSchema['openaiAccount'];
timeout?: number;
options?: OpenAI.RequestOptions;
}): Promise<{
response: InferResponseType<T>;
isStreamResponse: boolean;
}> => {
try {
const formatTimeout = timeout ? timeout : body.stream ? 60000 : 600000;
const ai = getAIApi({
userKey,
timeout: formatTimeout
});
const response = await ai.chat.completions.create(body, options);
const isStreamResponse =
typeof response === 'object' &&
response !== null &&
('iterator' in response || 'controller' in response);
return {
response: response as InferResponseType<T>,
isStreamResponse
};
} catch (error) {
addLog.error(`LLM response error`, error);
addLog.warn(`LLM response error`, {
baseUrl: userKey?.baseUrl,
requestBody: body
});
if (userKey?.baseUrl) {
return Promise.reject(`您的 OpenAI key 出错了: ${getErrText(error)}`);
}
return Promise.reject(error);
}
};
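The stream check inside `createChatCompletion` relies on duck typing: the OpenAI SDK's stream object exposes `iterator`/`controller` members that a plain completion object lacks. Extracted as a standalone sketch:

```typescript
// Sketch of the stream-detection heuristic used in createChatCompletion.
const isStreamResponse = (response: unknown): boolean =>
  typeof response === 'object' &&
  response !== null &&
  ('iterator' in response || 'controller' in response);
```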

View File

@@ -55,7 +55,7 @@ export async function getVectorsByText({ model, input, type }: GetVectorProps) {
return result;
} catch (error) {
console.log(`Embedding Error`, error);
addLog.error(`Embedding Error`, error);
return Promise.reject(error);
}

View File

@@ -1,5 +1,5 @@
import type { ChatCompletionMessageParam } from '@fastgpt/global/core/ai/type.d';
import { getAIApi } from '../config';
import { createChatCompletion } from '../config';
import { countGptMessagesTokens } from '../../../common/string/tiktoken/index';
import { loadRequestMessages } from '../../chat/utils';
import { llmCompletionsBodyFormat } from '../utils';
@@ -29,11 +29,8 @@ export async function createQuestionGuide({
}
];
const ai = getAIApi({
timeout: 480000
});
const data = await ai.chat.completions.create(
llmCompletionsBodyFormat(
const { response: data } = await createChatCompletion({
body: llmCompletionsBodyFormat(
{
model,
temperature: 0.1,
@@ -46,7 +43,7 @@ export async function createQuestionGuide({
},
model
)
);
});
const answer = data.choices?.[0]?.message?.content || '';

View File

@@ -1,8 +1,7 @@
import { replaceVariable } from '@fastgpt/global/common/string/tools';
import { getAIApi } from '../config';
import { createChatCompletion } from '../config';
import { ChatItemType } from '@fastgpt/global/core/chat/type';
import { countGptMessagesTokens } from '../../../common/string/tiktoken/index';
import { ChatCompletion, ChatCompletionMessageParam } from '@fastgpt/global/core/ai/type';
import { chatValue2RuntimePrompt } from '@fastgpt/global/core/chat/adapt';
import { getLLMModel } from '../model';
import { llmCompletionsBodyFormat } from '../utils';
@@ -138,10 +137,6 @@ A: ${chatBg}
const modelData = getLLMModel(model);
const ai = getAIApi({
timeout: 480000
});
const messages = [
{
role: 'user',
@@ -150,20 +145,19 @@ A: ${chatBg}
histories: concatFewShot
})
}
] as ChatCompletionMessageParam[];
] as any;
const result = (await ai.chat.completions.create(
llmCompletionsBodyFormat(
const { response: result } = await createChatCompletion({
body: llmCompletionsBodyFormat(
{
stream: false,
model: modelData.model,
temperature: 0.01,
// @ts-ignore
messages
},
modelData
)
)) as ChatCompletion;
});
let answer = result.choices?.[0]?.message?.content || '';
if (!answer) {

View File

@@ -48,14 +48,17 @@ export const computedTemperature = ({
type CompletionsBodyType =
| ChatCompletionCreateParamsNonStreaming
| ChatCompletionCreateParamsStreaming;
type InferCompletionsBody<T> = T extends { stream: true }
? ChatCompletionCreateParamsStreaming
: ChatCompletionCreateParamsNonStreaming;
export const llmCompletionsBodyFormat = <T extends CompletionsBodyType>(
body: T,
model: string | LLMModelItemType
) => {
): InferCompletionsBody<T> => {
const modelData = typeof model === 'string' ? getLLMModel(model) : model;
if (!modelData) {
return body;
return body as InferCompletionsBody<T>;
}
const requestBody: T = {
@@ -81,5 +84,5 @@ export const llmCompletionsBodyFormat = <T extends CompletionsBodyType>(
// console.log(requestBody);
return requestBody;
return requestBody as InferCompletionsBody<T>;
};
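The `InferCompletionsBody` conditional type lets the compiler pick the streaming or non-streaming request type from the literal `stream` flag on the body. A reduced, self-contained sketch of the same pattern (type names here are illustrative, not the library's):

```typescript
// Reduced sketch of the conditional-type pattern in llmCompletionsBodyFormat.
type StreamingBody = { stream: true; model: string };
type NonStreamingBody = { stream?: false; model: string };
type AnyBody = StreamingBody | NonStreamingBody;
type InferBody<T> = T extends { stream: true } ? StreamingBody : NonStreamingBody;

const formatBody = <T extends AnyBody>(body: T): InferBody<T> =>
  ({ ...body }) as InferBody<T>;

// With stream: true the result is typed StreamingBody; otherwise NonStreamingBody.
const s = formatBody({ model: 'demo-model', stream: true });
const n = formatBody({ model: 'demo-model' });
```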

View File

@@ -109,7 +109,7 @@ export const loadRequestMessages = async ({
}
return Promise.all(
messages.map(async (item) => {
if (item.type === 'image_url' && process.env.MULTIPLE_DATA_TO_BASE64 === 'true') {
if (item.type === 'image_url') {
// Remove url origin
const imgUrl = (() => {
if (origin && item.image_url.url.startsWith(origin)) {

View File

@@ -118,7 +118,7 @@ try {
{
unique: true,
partialFilterExpression: {
externalFileId: { $exists: true }
externalFileId: { $exists: true, $ne: '' }
}
}
);
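Adding `$ne: ''` to the partial filter means documents whose `externalFileId` is an empty string are excluded from the unique index, not just documents missing the field entirely. Expressed as a plain predicate for illustration (MongoDB evaluates this server-side; this is only the semantics):

```typescript
// Sketch of the partial-filter semantics: only documents matching this
// predicate participate in the unique index on externalFileId.
const inUniqueIndex = (doc: { externalFileId?: string }): boolean =>
  doc.externalFileId !== undefined && doc.externalFileId !== '';
```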

View File

@@ -118,7 +118,10 @@ export async function searchDatasetData(props: SearchDatasetDataProps) {
let createTimeCollectionIdList: string[] | undefined = undefined;
try {
const jsonMatch = json5.parse(collectionFilterMatch);
const jsonMatch =
typeof collectionFilterMatch === 'object'
? collectionFilterMatch
: json5.parse(collectionFilterMatch);
// Tag
let andTags = jsonMatch?.tags?.$and as (string | null)[] | undefined;
@@ -347,7 +350,7 @@ export async function searchDatasetData(props: SearchDatasetDataProps) {
teamId: new Types.ObjectId(teamId),
datasetId: new Types.ObjectId(id),
$text: { $search: jiebaSplit({ text: query }) },
...(filterCollectionIdList && filterCollectionIdList.length > 0
...(filterCollectionIdList
? {
collectionId: {
$in: filterCollectionIdList.map((id) => new Types.ObjectId(id))
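The first hunk above makes `collectionFilterMatch` parsing tolerant: an already-parsed object is used as-is, while a string goes through `json5.parse`. A minimal sketch of that guard (using `JSON.parse` here so the snippet is dependency-free; the real code uses the more permissive `json5`):

```typescript
// Sketch: accept either an object or a JSON string for the filter match.
function parseFilterMatch(input: unknown): Record<string, unknown> {
  return typeof input === 'object' && input !== null
    ? (input as Record<string, unknown>)
    : JSON.parse(String(input));
}
```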

View File

@@ -51,6 +51,11 @@ const TrainingDataSchema = new Schema({
type: Date,
default: () => new Date('2000/1/1')
},
retryCount: {
type: Number,
default: 5
},
model: {
// ai model
type: String,
@@ -97,7 +102,7 @@ try {
// lock training data(teamId); delete training data
TrainingDataSchema.index({ teamId: 1, datasetId: 1 });
// get training data and sort
TrainingDataSchema.index({ mode: 1, lockTime: 1, weight: -1 });
TrainingDataSchema.index({ mode: 1, retryCount: 1, lockTime: 1, weight: -1 });
TrainingDataSchema.index({ expireAt: 1 }, { expireAfterSeconds: 7 * 24 * 60 * 60 }); // 7 days
} catch (error) {
console.log(error);
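With `retryCount` added to the compound index, the training queue can skip records that have exhausted their retries instead of repeatedly re-locking them. A hypothetical in-memory sketch of that selection (field names taken from the schema diff above; the query logic itself is an assumption, not FastGPT's actual dispatcher):

```typescript
// Hypothetical sketch of picking the next trainable record: positive
// retryCount, expired lock, highest weight first.
type TrainingRecord = { retryCount: number; lockTime: Date; weight: number };

function pickNext(
  records: TrainingRecord[],
  now: Date,
  lockMs: number
): TrainingRecord | undefined {
  return records
    .filter((r) => r.retryCount > 0 && now.getTime() - r.lockTime.getTime() > lockMs)
    .sort((a, b) => b.weight - a.weight)[0];
}
```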

View File

@@ -2,7 +2,7 @@ import { chats2GPTMessages } from '@fastgpt/global/core/chat/adapt';
import { countMessagesTokens } from '../../../../common/string/tiktoken/index';
import type { ChatItemType } from '@fastgpt/global/core/chat/type.d';
import { ChatItemValueTypeEnum, ChatRoleEnum } from '@fastgpt/global/core/chat/constants';
import { getAIApi } from '../../../ai/config';
import { createChatCompletion } from '../../../ai/config';
import type { ClassifyQuestionAgentItemType } from '@fastgpt/global/core/workflow/template/system/classifyQuestion/type';
import { NodeInputKeyEnum, NodeOutputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { DispatchNodeResponseKeyEnum } from '@fastgpt/global/core/workflow/runtime/constants';
@@ -120,13 +120,8 @@ const completions = async ({
useVision: false
});
const ai = getAIApi({
userKey: user.openaiAccount,
timeout: 480000
});
const data = await ai.chat.completions.create(
llmCompletionsBodyFormat(
const { response: data } = await createChatCompletion({
body: llmCompletionsBodyFormat(
{
model: cqModel.model,
temperature: 0.01,
@@ -134,8 +129,9 @@ const completions = async ({
stream: false
},
cqModel
)
);
),
userKey: user.openaiAccount
});
const answer = data.choices?.[0].message?.content || '';
// console.log(JSON.stringify(chats2GPTMessages({ messages, reserveId: false }), null, 2));

View File

@@ -6,7 +6,7 @@ import {
countGptMessagesTokens
} from '../../../../common/string/tiktoken/index';
import { ChatItemValueTypeEnum, ChatRoleEnum } from '@fastgpt/global/core/chat/constants';
import { getAIApi } from '../../../ai/config';
import { createChatCompletion } from '../../../ai/config';
import type { ContextExtractAgentItemType } from '@fastgpt/global/core/workflow/template/system/contextExtract/type';
import { NodeInputKeyEnum, NodeOutputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { DispatchNodeResponseKeyEnum } from '@fastgpt/global/core/workflow/runtime/constants';
@@ -222,13 +222,8 @@ const toolChoice = async (props: ActionProps) => {
}
];
const ai = getAIApi({
userKey: user.openaiAccount,
timeout: 480000
});
const response = await ai.chat.completions.create(
llmCompletionsBodyFormat(
const { response } = await createChatCompletion({
body: llmCompletionsBodyFormat(
{
model: extractModel.model,
temperature: 0.01,
@@ -237,8 +232,9 @@ const toolChoice = async (props: ActionProps) => {
tool_choice: { type: 'function', function: { name: agentFunName } }
},
extractModel
)
);
),
userKey: user.openaiAccount
});
const arg: Record<string, any> = (() => {
try {
@@ -272,13 +268,8 @@ const functionCall = async (props: ActionProps) => {
const { agentFunction, filterMessages } = await getFunctionCallSchema(props);
const functions: ChatCompletionCreateParams.Function[] = [agentFunction];
const ai = getAIApi({
userKey: user.openaiAccount,
timeout: 480000
});
const response = await ai.chat.completions.create(
llmCompletionsBodyFormat(
const { response } = await createChatCompletion({
body: llmCompletionsBodyFormat(
{
model: extractModel.model,
temperature: 0.01,
@@ -289,8 +280,9 @@ const functionCall = async (props: ActionProps) => {
functions
},
extractModel
)
);
),
userKey: user.openaiAccount
});
try {
const arg = JSON.parse(response?.choices?.[0]?.message?.function_call?.arguments || '');
@@ -358,12 +350,8 @@ Human: ${content}`
useVision: false
});
const ai = getAIApi({
userKey: user.openaiAccount,
timeout: 480000
});
const data = await ai.chat.completions.create(
llmCompletionsBodyFormat(
const { response: data } = await createChatCompletion({
body: llmCompletionsBodyFormat(
{
model: extractModel.model,
temperature: 0.01,
@@ -371,8 +359,9 @@ Human: ${content}`
stream: false
},
extractModel
)
);
),
userKey: user.openaiAccount
});
const answer = data.choices?.[0].message?.content || '';
// parse response

View File

@@ -1,5 +1,4 @@
import { LLMModelItemType } from '@fastgpt/global/core/ai/model.d';
import { getAIApi } from '../../../../ai/config';
import { createChatCompletion } from '../../../../ai/config';
import { filterGPTMessageByMaxTokens, loadRequestMessages } from '../../../../chat/utils';
import {
ChatCompletion,
@@ -22,12 +21,12 @@ import { DispatchFlowResponse, WorkflowResponseType } from '../../type';
import { countGptMessagesTokens } from '../../../../../common/string/tiktoken/index';
import { getNanoid, sliceStrStartEnd } from '@fastgpt/global/common/string/tools';
import { AIChatItemType } from '@fastgpt/global/core/chat/type';
import { chats2GPTMessages, GPTMessages2Chats } from '@fastgpt/global/core/chat/adapt';
import { GPTMessages2Chats } from '@fastgpt/global/core/chat/adapt';
import { formatToolResponse, initToolCallEdges, initToolNodes } from './utils';
import { computedMaxToken, llmCompletionsBodyFormat } from '../../../../ai/utils';
import { toolValueTypeList } from '@fastgpt/global/core/workflow/constants';
import { WorkflowInteractiveResponseType } from '@fastgpt/global/core/workflow/template/system/interactive/type';
import { ChatItemValueTypeEnum, ChatRoleEnum } from '@fastgpt/global/core/chat/constants';
import { ChatItemValueTypeEnum } from '@fastgpt/global/core/chat/constants';
import { i18nT } from '../../../../../../web/i18n/utils';
type FunctionRunResponseType = {
@@ -45,7 +44,7 @@ export const runToolWithFunctionCall = async (
requestOrigin,
runtimeNodes,
runtimeEdges,
node,
user,
stream,
workflowStreamResponse,
params: { temperature = 0, maxToken = 4000, aiChatVision }
@@ -217,17 +216,18 @@ export const runToolWithFunctionCall = async (
// console.log(JSON.stringify(requestMessages, null, 2));
/* Run llm */
const ai = getAIApi({
timeout: 480000
});
const aiResponse = await ai.chat.completions.create(requestBody, {
headers: {
Accept: 'application/json, text/plain, */*'
const { response: aiResponse, isStreamResponse } = await createChatCompletion({
body: requestBody,
userKey: user.openaiAccount,
options: {
headers: {
Accept: 'application/json, text/plain, */*'
}
}
});
const { answer, functionCalls } = await (async () => {
if (res && stream) {
if (res && isStreamResponse) {
return streamResponse({
res,
toolNodes,

View File

@@ -29,6 +29,7 @@ import { getFileContentFromLinks, getHistoryFileLinks } from '../../tools/readFi
import { parseUrlToFileType } from '@fastgpt/global/common/file/tools';
import { Prompt_DocumentQuote } from '@fastgpt/global/core/ai/prompt/AIChat';
import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
import { postTextCensor } from '../../../../../common/api/requestPlusApi';
type Response = DispatchNodeResultType<{
[NodeOutputKeyEnum.answerText]: string;
@@ -45,6 +46,7 @@ export const dispatchRunTools = async (props: DispatchToolModuleProps): Promise<
requestOrigin,
chatConfig,
runningAppInfo: { teamId },
user,
params: {
model,
systemPrompt,
@@ -150,6 +152,15 @@ export const dispatchRunTools = async (props: DispatchToolModuleProps): Promise<
return value;
})();
// censor model and system key
if (toolModel.censor && !user.openaiAccount?.key) {
await postTextCensor({
text: `${systemPrompt}
${userChatInput}
`
});
}
const {
toolWorkflowInteractiveResponse,
dispatchFlowResponse, // tool flow response
@@ -217,13 +228,14 @@ export const dispatchRunTools = async (props: DispatchToolModuleProps): Promise<
tokens: toolNodeTokens,
modelType: ModelTypeEnum.llm
});
const toolAIUsage = user.openaiAccount?.key ? 0 : totalPoints;
// flat child tool response
const childToolResponse = dispatchFlowResponse.map((item) => item.flowResponses).flat();
// concat tool usage
const totalPointsUsage =
totalPoints +
toolAIUsage +
dispatchFlowResponse.reduce((sum, item) => {
const childrenTotal = item.flowUsages.reduce((sum, item) => sum + item.totalPoints, 0);
return sum + childrenTotal;
@@ -240,6 +252,7 @@ export const dispatchRunTools = async (props: DispatchToolModuleProps): Promise<
.join(''),
[DispatchNodeResponseKeyEnum.assistantResponses]: previewAssistantResponses,
[DispatchNodeResponseKeyEnum.nodeResponse]: {
// points consumption shown to the user
totalPoints: totalPointsUsage,
toolCallTokens: toolNodeTokens,
childTotalPoints: flatUsages.reduce((sum, item) => sum + item.totalPoints, 0),
@@ -254,12 +267,14 @@ export const dispatchRunTools = async (props: DispatchToolModuleProps): Promise<
mergeSignId: nodeId
},
[DispatchNodeResponseKeyEnum.nodeDispatchUsages]: [
// points consumed by the tool call itself
{
moduleName: name,
totalPoints,
totalPoints: toolAIUsage,
model: modelName,
tokens: toolNodeTokens
},
// points consumed by the tools
...flatUsages
],
[DispatchNodeResponseKeyEnum.interactive]: toolWorkflowInteractiveResponse
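The hunk above changes the points accounting so that a user-supplied OpenAI key zeroes out the tool call's own AI usage, while child-tool usage still counts. A dependency-free sketch of that aggregation (function and parameter names here are illustrative, not from the codebase):

```typescript
// Aggregate workflow points: the tool call's own AI usage is free when the
// user brings their own key; child tool usages always count.
function computeTotalPointsUsage(
  totalPoints: number,
  hasOwnOpenaiKey: boolean,
  childFlowUsages: { totalPoints: number }[][]
): number {
  const toolAIUsage = hasOwnOpenaiKey ? 0 : totalPoints;
  return (
    toolAIUsage +
    childFlowUsages.reduce(
      (sum, usages) => sum + usages.reduce((s, u) => s + u.totalPoints, 0),
      0
    )
  );
}
```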

View File

@@ -1,4 +1,4 @@
import { getAIApi } from '../../../../ai/config';
import { createChatCompletion } from '../../../../ai/config';
import { filterGPTMessageByMaxTokens, loadRequestMessages } from '../../../../chat/utils';
import {
ChatCompletion,
@@ -52,7 +52,7 @@ export const runToolWithPromptCall = async (
requestOrigin,
runtimeNodes,
runtimeEdges,
node,
user,
stream,
workflowStreamResponse,
params: { temperature = 0, maxToken = 4000, aiChatVision }
@@ -225,18 +225,15 @@ export const runToolWithPromptCall = async (
// console.log(JSON.stringify(requestMessages, null, 2));
/* Run llm */
const ai = getAIApi({
timeout: 480000
});
const aiResponse = await ai.chat.completions.create(requestBody, {
headers: {
Accept: 'application/json, text/plain, */*'
const { response: aiResponse, isStreamResponse } = await createChatCompletion({
body: requestBody,
userKey: user.openaiAccount,
options: {
headers: {
Accept: 'application/json, text/plain, */*'
}
}
});
const isStreamResponse =
typeof aiResponse === 'object' &&
aiResponse !== null &&
('iterator' in aiResponse || 'controller' in aiResponse);
const answer = await (async () => {
if (res && isStreamResponse) {

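The removed lines above show how call sites used to detect a streamed response before `createChatCompletion` absorbed that check and started returning `isStreamResponse` directly. A standalone sketch of the predicate (an assumption about the wrapper's internals, not the project's actual implementation):

```typescript
// An OpenAI SDK stream object exposes an async iterator and an abort
// controller; a plain (non-stream) completion object has neither.
const isStreamResponse = (response: unknown): boolean =>
  typeof response === 'object' &&
  response !== null &&
  ('iterator' in response || 'controller' in response);
```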
View File

@@ -1,4 +1,4 @@
import { getAIApi } from '../../../../ai/config';
import { createChatCompletion } from '../../../../ai/config';
import { filterGPTMessageByMaxTokens, loadRequestMessages } from '../../../../chat/utils';
import {
ChatCompletion,
@@ -92,6 +92,7 @@ export const runToolWithToolChoice = async (
runtimeNodes,
runtimeEdges,
stream,
user,
workflowStreamResponse,
params: { temperature = 0, maxToken = 4000, aiChatVision }
} = workflowProps;
@@ -271,277 +272,265 @@ export const runToolWithToolChoice = async (
);
// console.log(JSON.stringify(requestBody, null, 2), '==requestBody');
/* Run llm */
const ai = getAIApi({
timeout: 480000
});
try {
const aiResponse = await ai.chat.completions.create(requestBody, {
const { response: aiResponse, isStreamResponse } = await createChatCompletion({
body: requestBody,
userKey: user.openaiAccount,
options: {
headers: {
Accept: 'application/json, text/plain, */*'
}
});
const isStreamResponse =
typeof aiResponse === 'object' &&
aiResponse !== null &&
('iterator' in aiResponse || 'controller' in aiResponse);
}
});
const { answer, toolCalls } = await (async () => {
if (res && isStreamResponse) {
return streamResponse({
res,
workflowStreamResponse,
toolNodes,
stream: aiResponse
});
} else {
const result = aiResponse as ChatCompletion;
const calls = result.choices?.[0]?.message?.tool_calls || [];
const answer = result.choices?.[0]?.message?.content || '';
const { answer, toolCalls } = await (async () => {
if (res && isStreamResponse) {
return streamResponse({
res,
workflowStreamResponse,
toolNodes,
stream: aiResponse
});
} else {
const result = aiResponse as ChatCompletion;
const calls = result.choices?.[0]?.message?.tool_calls || [];
const answer = result.choices?.[0]?.message?.content || '';
// attach tool name and avatar
const toolCalls = calls.map((tool) => {
const toolNode = toolNodes.find((item) => item.nodeId === tool.function?.name);
return {
...tool,
toolName: toolNode?.name || '',
toolAvatar: toolNode?.avatar || ''
};
});
// attach tool name and avatar
const toolCalls = calls.map((tool) => {
const toolNode = toolNodes.find((item) => item.nodeId === tool.function?.name);
return {
...tool,
toolName: toolNode?.name || '',
toolAvatar: toolNode?.avatar || ''
};
});
// simulate a streamed response for models that do not support stream mode
toolCalls.forEach((tool) => {
workflowStreamResponse?.({
event: SseResponseEventEnum.toolCall,
data: {
tool: {
id: tool.id,
toolName: tool.toolName,
toolAvatar: tool.toolAvatar,
functionName: tool.function.name,
params: tool.function?.arguments ?? '',
response: ''
}
// simulate a streamed response for models that do not support stream mode
toolCalls.forEach((tool) => {
workflowStreamResponse?.({
event: SseResponseEventEnum.toolCall,
data: {
tool: {
id: tool.id,
toolName: tool.toolName,
toolAvatar: tool.toolAvatar,
functionName: tool.function.name,
params: tool.function?.arguments ?? '',
response: ''
}
});
}
});
});
if (answer) {
workflowStreamResponse?.({
event: SseResponseEventEnum.fastAnswer,
data: textAdaptGptResponse({
text: answer
})
});
}
return {
answer,
toolCalls: toolCalls
};
}
})();
// Run the tools selected by the LLM.
const toolsRunResponse = (
await Promise.all(
toolCalls.map(async (tool) => {
const toolNode = toolNodes.find((item) => item.nodeId === tool.function?.name);
if (!toolNode) return;
const startParams = (() => {
try {
return json5.parse(tool.function.arguments);
} catch (error) {
return {};
}
})();
initToolNodes(runtimeNodes, [toolNode.nodeId], startParams);
const toolRunResponse = await dispatchWorkFlow({
...workflowProps,
isToolCall: true
});
const stringToolResponse = formatToolResponse(toolRunResponse.toolResponses);
const toolMsgParams: ChatCompletionToolMessageParam = {
tool_call_id: tool.id,
role: ChatCompletionRequestMessageRoleEnum.Tool,
name: tool.function.name,
content: stringToolResponse
};
workflowStreamResponse?.({
event: SseResponseEventEnum.toolResponse,
data: {
tool: {
id: tool.id,
toolName: '',
toolAvatar: '',
params: '',
response: sliceStrStartEnd(stringToolResponse, 5000, 5000)
}
}
});
if (answer) {
workflowStreamResponse?.({
event: SseResponseEventEnum.fastAnswer,
data: textAdaptGptResponse({
text: answer
})
});
}
return {
answer,
toolCalls: toolCalls
toolRunResponse,
toolMsgParams
};
})
)
).filter(Boolean) as ToolRunResponseType;
const flatToolsResponseData = toolsRunResponse.map((item) => item.toolRunResponse).flat();
// concat tool responses
const dispatchFlowResponse = response
? response.dispatchFlowResponse.concat(flatToolsResponseData)
: flatToolsResponseData;
if (toolCalls.length > 0 && !res?.closed) {
// Run the tool, combine its results, and perform another round of AI calls
const assistantToolMsgParams: ChatCompletionAssistantMessageParam[] = [
...(answer
? [
{
role: ChatCompletionRequestMessageRoleEnum.Assistant as 'assistant',
content: answer
}
]
: []),
{
role: ChatCompletionRequestMessageRoleEnum.Assistant,
tool_calls: toolCalls
}
})();
];
// Run the tools selected by the LLM.
const toolsRunResponse = (
await Promise.all(
toolCalls.map(async (tool) => {
const toolNode = toolNodes.find((item) => item.nodeId === tool.function?.name);
if (!toolNode) return;
const startParams = (() => {
try {
return json5.parse(tool.function.arguments);
} catch (error) {
return {};
}
})();
initToolNodes(runtimeNodes, [toolNode.nodeId], startParams);
const toolRunResponse = await dispatchWorkFlow({
...workflowProps,
isToolCall: true
});
const stringToolResponse = formatToolResponse(toolRunResponse.toolResponses);
const toolMsgParams: ChatCompletionToolMessageParam = {
tool_call_id: tool.id,
role: ChatCompletionRequestMessageRoleEnum.Tool,
name: tool.function.name,
content: stringToolResponse
};
workflowStreamResponse?.({
event: SseResponseEventEnum.toolResponse,
data: {
tool: {
id: tool.id,
toolName: '',
toolAvatar: '',
params: '',
response: sliceStrStartEnd(stringToolResponse, 5000, 5000)
}
}
});
return {
toolRunResponse,
toolMsgParams
};
})
)
).filter(Boolean) as ToolRunResponseType;
const flatToolsResponseData = toolsRunResponse.map((item) => item.toolRunResponse).flat();
// concat tool responses
const dispatchFlowResponse = response
? response.dispatchFlowResponse.concat(flatToolsResponseData)
: flatToolsResponseData;
if (toolCalls.length > 0 && !res?.closed) {
// Run the tool, combine its results, and perform another round of AI calls
const assistantToolMsgParams: ChatCompletionAssistantMessageParam[] = [
...(answer
? [
{
role: ChatCompletionRequestMessageRoleEnum.Assistant as 'assistant',
content: answer
}
]
: []),
{
role: ChatCompletionRequestMessageRoleEnum.Assistant,
tool_calls: toolCalls
}
];
/*
/*
...
user
assistant: tool data
*/
const concatToolMessages = [
...requestMessages,
...assistantToolMsgParams
] as ChatCompletionMessageParam[];
const concatToolMessages = [
...requestMessages,
...assistantToolMsgParams
] as ChatCompletionMessageParam[];
// Only toolCall tokens are counted here; tool response tokens count towards the next reply
const tokens = await countGptMessagesTokens(concatToolMessages, tools);
/*
// Only toolCall tokens are counted here; tool response tokens count towards the next reply
const tokens = await countGptMessagesTokens(concatToolMessages, tools);
/*
...
user
assistant: tool data
tool: tool response
*/
const completeMessages = [
...concatToolMessages,
...toolsRunResponse.map((item) => item?.toolMsgParams)
];
const completeMessages = [
...concatToolMessages,
...toolsRunResponse.map((item) => item?.toolMsgParams)
];
/*
/*
Get tool node assistant response
history assistant
current tool assistant
tool child assistant
*/
const toolNodeAssistant = GPTMessages2Chats([
...assistantToolMsgParams,
...toolsRunResponse.map((item) => item?.toolMsgParams)
])[0] as AIChatItemType;
const toolChildAssistants = flatToolsResponseData
.map((item) => item.assistantResponses)
.flat()
.filter((item) => item.type !== ChatItemValueTypeEnum.interactive); // keep interactive nodes for the next round
const toolNodeAssistants = [
...assistantResponses,
...toolNodeAssistant.value,
...toolChildAssistants
];
const toolNodeAssistant = GPTMessages2Chats([
...assistantToolMsgParams,
...toolsRunResponse.map((item) => item?.toolMsgParams)
])[0] as AIChatItemType;
const toolChildAssistants = flatToolsResponseData
.map((item) => item.assistantResponses)
.flat()
.filter((item) => item.type !== ChatItemValueTypeEnum.interactive); // keep interactive nodes for the next round
const toolNodeAssistants = [
...assistantResponses,
...toolNodeAssistant.value,
...toolChildAssistants
];
const runTimes =
(response?.runTimes || 0) +
flatToolsResponseData.reduce((sum, item) => sum + item.runTimes, 0);
const toolNodeTokens = response ? response.toolNodeTokens + tokens : tokens;
const runTimes =
(response?.runTimes || 0) +
flatToolsResponseData.reduce((sum, item) => sum + item.runTimes, 0);
const toolNodeTokens = response ? response.toolNodeTokens + tokens : tokens;
// Check stop signal
const hasStopSignal = flatToolsResponseData.some(
(item) => !!item.flowResponses?.find((item) => item.toolStop)
);
// Check interactive response (only 1 interaction is reserved)
const workflowInteractiveResponseItem = toolsRunResponse.find(
(item) => item.toolRunResponse.workflowInteractiveResponse
);
if (hasStopSignal || workflowInteractiveResponseItem) {
// Get interactive tool data
const workflowInteractiveResponse =
workflowInteractiveResponseItem?.toolRunResponse.workflowInteractiveResponse;
// Check stop signal
const hasStopSignal = flatToolsResponseData.some(
(item) => !!item.flowResponses?.find((item) => item.toolStop)
);
// Check interactive response (only 1 interaction is reserved)
const workflowInteractiveResponseItem = toolsRunResponse.find(
(item) => item.toolRunResponse.workflowInteractiveResponse
);
if (hasStopSignal || workflowInteractiveResponseItem) {
// Get interactive tool data
const workflowInteractiveResponse =
workflowInteractiveResponseItem?.toolRunResponse.workflowInteractiveResponse;
// Traverse completeMessages backwards and keep only the messages after the last user message
const firstUserIndex = completeMessages.findLastIndex((item) => item.role === 'user');
const newMessages = completeMessages.slice(firstUserIndex + 1);
// Traverse completeMessages backwards and keep only the messages after the last user message
const firstUserIndex = completeMessages.findLastIndex((item) => item.role === 'user');
const newMessages = completeMessages.slice(firstUserIndex + 1);
const toolWorkflowInteractiveResponse: WorkflowInteractiveResponseType | undefined =
workflowInteractiveResponse
? {
...workflowInteractiveResponse,
toolParams: {
entryNodeIds: workflowInteractiveResponse.entryNodeIds,
toolCallId: workflowInteractiveResponseItem?.toolMsgParams.tool_call_id,
memoryMessages: newMessages
}
const toolWorkflowInteractiveResponse: WorkflowInteractiveResponseType | undefined =
workflowInteractiveResponse
? {
...workflowInteractiveResponse,
toolParams: {
entryNodeIds: workflowInteractiveResponse.entryNodeIds,
toolCallId: workflowInteractiveResponseItem?.toolMsgParams.tool_call_id,
memoryMessages: newMessages
}
: undefined;
return {
dispatchFlowResponse,
toolNodeTokens,
completeMessages,
assistantResponses: toolNodeAssistants,
runTimes,
toolWorkflowInteractiveResponse
};
}
return runToolWithToolChoice(
{
...props,
maxRunToolTimes: maxRunToolTimes - 1,
messages: completeMessages
},
{
dispatchFlowResponse,
toolNodeTokens,
assistantResponses: toolNodeAssistants,
runTimes
}
);
} else {
// No tool is invoked, indicating that the process is over
const gptAssistantResponse: ChatCompletionAssistantMessageParam = {
role: ChatCompletionRequestMessageRoleEnum.Assistant,
content: answer
};
const completeMessages = filterMessages.concat(gptAssistantResponse);
const tokens = await countGptMessagesTokens(completeMessages, tools);
// concat tool assistant
const toolNodeAssistant = GPTMessages2Chats([gptAssistantResponse])[0] as AIChatItemType;
}
: undefined;
return {
dispatchFlowResponse: response?.dispatchFlowResponse || [],
toolNodeTokens: response ? response.toolNodeTokens + tokens : tokens,
dispatchFlowResponse,
toolNodeTokens,
completeMessages,
assistantResponses: [...assistantResponses, ...toolNodeAssistant.value],
runTimes: (response?.runTimes || 0) + 1
assistantResponses: toolNodeAssistants,
runTimes,
toolWorkflowInteractiveResponse
};
}
} catch (error) {
console.log(error);
addLog.warn(`LLM response error`, {
requestBody
});
return Promise.reject(error);
return runToolWithToolChoice(
{
...props,
maxRunToolTimes: maxRunToolTimes - 1,
messages: completeMessages
},
{
dispatchFlowResponse,
toolNodeTokens,
assistantResponses: toolNodeAssistants,
runTimes
}
);
} else {
// No tool is invoked, indicating that the process is over
const gptAssistantResponse: ChatCompletionAssistantMessageParam = {
role: ChatCompletionRequestMessageRoleEnum.Assistant,
content: answer
};
const completeMessages = filterMessages.concat(gptAssistantResponse);
const tokens = await countGptMessagesTokens(completeMessages, tools);
// concat tool assistant
const toolNodeAssistant = GPTMessages2Chats([gptAssistantResponse])[0] as AIChatItemType;
return {
dispatchFlowResponse: response?.dispatchFlowResponse || [],
toolNodeTokens: response ? response.toolNodeTokens + tokens : tokens,
completeMessages,
assistantResponses: [...assistantResponses, ...toolNodeAssistant.value],
runTimes: (response?.runTimes || 0) + 1
};
}
};
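`runToolWithToolChoice` above re-invokes itself with `maxRunToolTimes - 1` until the model answers without calling a tool. A minimal synchronous sketch of that bounded loop (names and the `step` callback are illustrative):

```typescript
// Run the model step by step; stop when it answers without a tool call or
// when the round budget is exhausted. Returns the number of rounds taken.
function runToolLoop(step: () => { toolCalled: boolean }, maxRunToolTimes: number): number {
  let rounds = 0;
  while (maxRunToolTimes-- > 0) {
    rounds++;
    if (!step().toolCalled) break; // model answered directly: done
  }
  return rounds;
}
```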

View File

@@ -4,7 +4,7 @@ import type { ChatItemType, UserChatItemValueItemType } from '@fastgpt/global/co
import { ChatRoleEnum } from '@fastgpt/global/core/chat/constants';
import { SseResponseEventEnum } from '@fastgpt/global/core/workflow/runtime/constants';
import { textAdaptGptResponse } from '@fastgpt/global/core/workflow/runtime/utils';
import { getAIApi } from '../../../ai/config';
import { createChatCompletion } from '../../../ai/config';
import type { ChatCompletion, StreamChatType } from '@fastgpt/global/core/ai/type.d';
import { formatModelChars2Points } from '../../../../support/wallet/usage/utils';
import type { LLMModelItemType } from '@fastgpt/global/core/ai/model.d';
@@ -138,7 +138,6 @@ export const dispatchChatCompletion = async (props: ChatProps): Promise<ChatResp
if (modelConstantsData.censor && !user.openaiAccount?.key) {
return postTextCensor({
text: `${systemPrompt}
${datasetQuoteText}
${userChatInput}
`
});
@@ -171,21 +170,16 @@ export const dispatchChatCompletion = async (props: ChatProps): Promise<ChatResp
);
// console.log(JSON.stringify(requestBody, null, 2), '===');
try {
const ai = getAIApi({
const { response, isStreamResponse } = await createChatCompletion({
body: requestBody,
userKey: user.openaiAccount,
timeout: 480000
});
const response = await ai.chat.completions.create(requestBody, {
headers: {
Accept: 'application/json, text/plain, */*'
options: {
headers: {
Accept: 'application/json, text/plain, */*'
}
}
});
const isStreamResponse =
typeof response === 'object' &&
response !== null &&
('iterator' in response || 'controller' in response);
const { answerText } = await (async () => {
if (res && isStreamResponse) {
// sse response
@@ -262,11 +256,6 @@ export const dispatchChatCompletion = async (props: ChatProps): Promise<ChatResp
history: chatCompleteMessages
};
} catch (error) {
addLog.warn(`LLM response error`, {
baseUrl: user.openaiAccount?.baseUrl,
requestBody
});
if (user.openaiAccount?.baseUrl) {
return Promise.reject(`您的 OpenAI key 出错了: ${getErrText(error)}`);
}

View File

@@ -65,7 +65,17 @@ export async function dispatchDatasetSearch(
}
if (!userChatInput) {
return Promise.reject(i18nT('common:core.chat.error.User input empty'));
return {
quoteQA: [],
[DispatchNodeResponseKeyEnum.nodeResponse]: {
totalPoints: 0,
query: '',
limit,
searchMode
},
nodeDispatchUsages: [],
[DispatchNodeResponseKeyEnum.toolResponses]: []
};
}
// query extension

View File

@@ -18,7 +18,6 @@ import {
textAdaptGptResponse,
replaceEditorVariable
} from '@fastgpt/global/core/workflow/runtime/utils';
import { getSystemPluginCb } from '../../../../../plugins/register';
import { ContentTypes } from '@fastgpt/global/core/workflow/constants';
import { uploadFileFromBase64Img } from '../../../../common/file/gridfs/controller';
import { ReadFileBaseUrl } from '@fastgpt/global/common/file/constants';
@@ -209,7 +208,8 @@ export const dispatchHttp468Request = async (props: HttpRequestProps): Promise<H
try {
const { formatResponse, rawResponse } = await (async () => {
const systemPluginCb = await getSystemPluginCb();
const systemPluginCb = global.systemPluginCb;
console.log(systemPluginCb, '-=', httpReqUrl);
if (systemPluginCb[httpReqUrl]) {
const pluginResult = await replaceSystemPluginResponse({
response: await systemPluginCb[httpReqUrl](requestBody),

View File

@@ -1,5 +1,6 @@
import TurndownService from 'turndown';
import { ImageType } from '../readFile/type';
import { matchMdImgTextAndUpload } from '@fastgpt/global/common/string/markdown';
// @ts-ignore
const turndownPluginGfm = require('joplin-turndown-plugin-gfm');
@@ -24,23 +25,10 @@ export const html2md = (
turndownService.remove(['i', 'script', 'iframe', 'style']);
turndownService.use(turndownPluginGfm.gfm);
const base64Regex = /"(data:image\/[^;]+;base64[^"]+)"/g;
const imageList: ImageType[] = [];
const images = Array.from(html.match(base64Regex) || []);
for (const image of images) {
const uuid = crypto.randomUUID();
const mime = image.split(';')[0].split(':')[1];
const base64 = image.split(',')[1];
html = html.replace(image, uuid);
imageList.push({
uuid,
base64,
mime
});
}
const { text, imageList } = matchMdImgTextAndUpload(html);
return {
rawText: turndownService.turndown(html),
rawText: turndownService.turndown(text),
imageList
};
} catch (error) {

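The inline base64 handling deleted above is now encapsulated in `matchMdImgTextAndUpload`. A self-contained sketch of what that extraction step does, reconstructed from the removed code (the real helper's signature and name may differ):

```typescript
import { randomUUID } from 'crypto';

type ImageType = { uuid: string; base64: string; mime: string };

// Replace every quoted base64 data URL in the HTML with a UUID placeholder
// and collect the image payloads for later upload.
function extractBase64Images(html: string): { text: string; imageList: ImageType[] } {
  const base64Regex = /"(data:image\/[^;]+;base64[^"]+)"/g;
  const imageList: ImageType[] = [];
  let text = html;
  for (const image of Array.from(html.match(base64Regex) || [])) {
    const uuid = randomUUID();
    const mime = image.split(';')[0].split(':')[1]; // e.g. image/png
    const base64 = image.split(',')[1];
    text = text.replace(image, uuid);
    imageList.push({ uuid, base64, mime });
  }
  return { text, imageList };
}
```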
View File

@@ -18,9 +18,17 @@ const rawEncodingList = [
// 加载源文件内容
export const readFileRawText = ({ buffer, encoding }: ReadRawTextByBuffer): ReadFileResponse => {
const content = rawEncodingList.includes(encoding)
? buffer.toString(encoding as BufferEncoding)
: iconv.decode(buffer, 'gbk');
const content = (() => {
try {
if (rawEncodingList.includes(encoding)) {
return buffer.toString(encoding as BufferEncoding);
}
return iconv.decode(buffer, encoding);
} catch (error) {
return buffer.toString('utf-8');
}
})();
return {
rawText: content
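The server-side hunk above wraps decoding in a try/catch so an unsupported or wrong encoding falls back to UTF-8. A dependency-free sketch of the same pattern (plain `Buffer.toString` stands in for `iconv.decode`, and the encoding list is an assumption):

```typescript
const rawEncodingList = ['ascii', 'utf-8', 'utf8', 'utf16le', 'latin1', 'base64', 'hex'];

// Decode with the requested encoding, falling back to UTF-8 on any failure
// (for example, an encoding Buffer does not know natively).
function decodeWithFallback(buffer: Buffer, encoding: string): string {
  try {
    if (rawEncodingList.includes(encoding)) {
      return buffer.toString(encoding as BufferEncoding);
    }
    // the real code hands non-native encodings such as gbk to iconv here
    return buffer.toString(encoding as BufferEncoding);
  } catch {
    return buffer.toString('utf-8');
  }
}
```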

View File

@@ -61,7 +61,14 @@ export const readFileRawText = ({
reject(getErrText(err, 'Load file error'));
};
reader.readAsText(file);
detectFileEncoding(file).then((encoding) => {
console.log(encoding);
reader.readAsText(
file,
['iso-8859-1', 'windows-1252'].includes(encoding) ? 'gb2312' : 'utf-8'
);
});
} catch (error) {
reject('The browser does not support file content reading');
}
@@ -71,6 +78,24 @@ export const readFileRawText = ({
export const readCsvRawText = async ({ file }: { file: File }) => {
const rawText = await readFileRawText({ file });
const csvArr = Papa.parse(rawText).data as string[][];
return csvArr;
};
async function detectFileEncoding(file: File): Promise<string> {
const buffer = await loadFile2Buffer({ file });
const encoding = (() => {
const encodings = ['utf-8', 'iso-8859-1', 'windows-1252'];
for (let encoding of encodings) {
try {
const decoder = new TextDecoder(encoding, { fatal: true });
decoder.decode(buffer);
return encoding; // decoding succeeded; return this encoding
} catch (e) {
// continue to try next encoding
}
}
return null; // no candidate encoding matched
})();
return encoding || 'utf-8';
}
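The trial-decode in `detectFileEncoding` above relies on `fatal: true`, which makes `TextDecoder.decode` throw on malformed input instead of silently emitting U+FFFD replacement characters. A quick demonstration of the difference:

```typescript
// 0xC3 followed by 0x28 is not a valid UTF-8 sequence.
const badUtf8 = new Uint8Array([0xc3, 0x28]);

// Lenient decoding substitutes U+FFFD and keeps going.
const lenientResult = new TextDecoder('utf-8').decode(badUtf8);

// Fatal decoding throws, which is what lets the detection loop above
// reject an encoding and try the next candidate.
const fatalThrows = (() => {
  try {
    new TextDecoder('utf-8', { fatal: true }).decode(badUtf8);
    return false;
  } catch {
    return true;
  }
})();
```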

View File

@@ -50,6 +50,7 @@ export const iconPaths = {
'common/list': () => import('./icons/common/list.svg'),
'common/loading': () => import('./icons/common/loading.svg'),
'common/logLight': () => import('./icons/common/logLight.svg'),
'common/microsoft': () => import('./icons/common/microsoft.svg'),
'common/monitor': () => import('./icons/common/monitor.svg'),
'common/navbar/pluginFill': () => import('./icons/common/navbar/pluginFill.svg'),
'common/navbar/pluginLight': () => import('./icons/common/navbar/pluginLight.svg'),

View File

@@ -0,0 +1 @@
<svg t="1731513229844" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="5087" width="200" height="200"><path d="M493.0048 492.9536H128.0512V128H493.056v364.9536z" fill="#F1511B" p-id="5088"></path><path d="M895.9488 492.9536H530.944V128H896v364.9536z" fill="#80CC28" p-id="5089"></path><path d="M493.0048 896H128v-364.9024H493.056V896z" fill="#00ADEF" p-id="5090"></path><path d="M895.8976 896h-364.9024v-364.9024h364.9024V896z" fill="#FBBC09" p-id="5091"></path></svg>

After

Size: 512 B

View File

@@ -245,13 +245,7 @@ export const MultipleRowArraySelect = ({
onClick={() => handleSelect(item)}
{...(isSelected ? { color: 'primary.600' } : {})}
>
{showCheckbox && (
<Checkbox
isChecked={isChecked}
icon={<MyIcon name={'common/check'} w={'12px'} />}
mr={1}
/>
)}
{showCheckbox && <Checkbox isChecked={isChecked} mr={1} />}
<Box>{item.label}</Box>
</Flex>
);

View File

@@ -14,7 +14,6 @@ type EditorVariablePickerType = {
};
export type Props = Omit<BoxProps, 'resize' | 'onChange'> & {
height?: number;
resize?: boolean;
defaultValue?: string;
value?: string;
@@ -111,7 +110,7 @@ const MyEditor = ({
borderWidth={'1px'}
borderRadius={'md'}
borderColor={'myGray.200'}
py={2}
py={1}
height={height}
position={'relative'}
pl={2}
@@ -132,8 +131,8 @@ const MyEditor = ({
{resize && (
<Box
position={'absolute'}
right={'-1'}
bottom={'-1'}
right={'-2.5'}
bottom={'-3.5'}
zIndex={10}
cursor={'ns-resize'}
px={'4px'}

View File

@@ -19,9 +19,11 @@ const CodeEditor = (props: Props) => {
iconSrc="modal/edit"
title={t('common:code_editor')}
w={'full'}
h={'85vh'}
isCentered
>
<ModalBody>
<MyEditor {...props} bg={'myGray.50'} defaultHeight={600} />
<ModalBody flex={'1 0 0'} overflow={'auto'}>
<MyEditor {...props} bg={'myGray.50'} height={'100%'} />
</ModalBody>
<ModalFooter>
<Button mr={2} onClick={onClose} px={6}>

View File

@@ -12,25 +12,27 @@ const LANG_KEY = 'NEXT_LOCALE';
export const useI18nLng = () => {
const { i18n } = useTranslation();
const languageMap: Record<string, string> = {
zh: 'zh',
'zh-CN': 'zh',
'zh-Hans': 'zh',
en: 'en',
'en-US': 'en'
};
const onChangeLng = (lng: string) => {
setCookie(LANG_KEY, lng, {
expires: 30,
sameSite: 'None',
secure: true
const lang = languageMap[lng] || 'en';
setCookie(LANG_KEY, lang, {
expires: 30
});
i18n?.changeLanguage(lng);
i18n?.changeLanguage(lang);
};
const setUserDefaultLng = () => {
if (!navigator || !localStorage) return;
if (getCookie(LANG_KEY)) return onChangeLng(getCookie(LANG_KEY) as string);
const languageMap: Record<string, string> = {
zh: 'zh',
'zh-CN': 'zh'
};
const lang = languageMap[navigator.language] || 'en';
// currentLng not in userLang
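The hook above now normalizes browser locale variants before writing the cookie and switching i18next. The mapping logic in isolation:

```typescript
// Map region/script variants onto the two shipped bundles; anything
// unrecognized falls back to English.
const languageMap: Record<string, string> = {
  zh: 'zh',
  'zh-CN': 'zh',
  'zh-Hans': 'zh',
  en: 'en',
  'en-US': 'en'
};

const normalizeLng = (lng: string): string => languageMap[lng] || 'en';
```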

View File

@@ -11,11 +11,14 @@ export const useWidthVariable = <T = any>({
}) => {
const value = useMemo(() => {
// based on the width, find the first breakpoint value the width exceeds
const index = widthList.findLastIndex((item) => width > item);
const reversedWidthList = [...widthList].reverse();
const reversedList = [...list].reverse();
const index = reversedWidthList.findIndex((item) => width > item);
if (index === -1) {
return list[0];
return reversedList[0];
}
return list[index];
return reversedList[index];
}, [list, width, widthList]);
return value;
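The hunk above swaps `findLastIndex` (unavailable in older runtimes) for a reverse-then-`findIndex` scan. The lookup in isolation, assuming `widthList` is ascending and `list` holds the per-breakpoint values (the function name is illustrative):

```typescript
// Scan from the widest breakpoint down and pick the value for the first
// breakpoint the current width exceeds; otherwise return the last entry.
function pickByWidth<T>(width: number, widthList: number[], list: T[]): T {
  const reversedWidthList = [...widthList].reverse();
  const reversedList = [...list].reverse();
  const index = reversedWidthList.findIndex((item) => width > item);
  return index === -1 ? reversedList[0] : reversedList[index];
}
```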

View File

@@ -997,6 +997,7 @@
"support.user.login.Email": "Email",
"support.user.login.Github": "GitHub Login",
"support.user.login.Google": "Google Login",
"support.user.login.Microsoft": "Microsoft Login",
"support.user.login.Password": "Password",
"support.user.login.Password login": "Password Login",
"support.user.login.Phone": "Phone Login",

View File

@@ -18,9 +18,6 @@
"FAQ.switch_package_a": "套餐使用规则为优先使用更高级的套餐,因此,购买的新套餐若比当前套餐更高级,则新套餐立即生效:否则将继续使用当前套餐。",
"FAQ.switch_package_q": "是否切换订阅套餐?",
"Folder": "文件夹",
"just_now": "刚刚",
"yesterday": "昨天",
"yesterday_detail_time": "昨天 {{time}}",
"Login": "登录",
"Move": "移动",
"Name": "名称",
@@ -125,6 +122,7 @@
"common.Documents": "文档",
"common.Done": "完成",
"common.Edit": "编辑",
"common.Error": "错误",
"common.Exit": "退出",
"common.Exit Directly": "直接退出",
"common.Expired Time": "过期时间",
@@ -194,7 +192,6 @@
"common.Update Successful": "更新成功",
"common.Username": "用户名",
"common.Waiting": "等待中",
"common.Error": "错误",
"common.Warning": "警告",
"common.Website": "网站",
"common.all_result": "完整结果",
@@ -551,6 +548,7 @@
"core.dataset.import.Chunk Range": "范围:{{min}}~{{max}}",
"core.dataset.import.Chunk Split": "直接分段",
"core.dataset.import.Chunk Split Tip": "将文本按一定的规则进行分段处理后,转成可进行语义搜索的格式,适合绝大多数场景。不需要调用模型额外处理,成本低。",
"core.dataset.import.Continue upload": "继续上传",
"core.dataset.import.Custom process": "自定义规则",
"core.dataset.import.Custom process desc": "自定义设置数据处理规则",
"core.dataset.import.Custom prompt": "自定义提示词",
@@ -579,11 +577,10 @@
"core.dataset.import.Select source": "选择来源",
"core.dataset.import.Source name": "来源名",
"core.dataset.import.Sources list": "来源列表",
"core.dataset.import.Continue upload": "继续上传",
"core.dataset.import.Upload complete": "完成上传",
"core.dataset.import.Start upload": "开始上传",
"core.dataset.import.Total files": "共 {{total}} 个文件",
"core.dataset.import.Training mode": "训练模式",
"core.dataset.import.Upload complete": "完成上传",
"core.dataset.import.Upload data": "确认上传",
"core.dataset.import.Upload file progress": "文件上传进度",
"core.dataset.import.Upload status": "状态",
@@ -894,6 +891,7 @@
"is_using": "正在使用",
"item_description": "字段描述",
"item_name": "字段名",
"just_now": "刚刚",
"key_repetition": "key 重复",
"move.confirm": "确认移动",
"navbar.Account": "账号",
@@ -998,6 +996,7 @@
"support.user.login.Email": "邮箱",
"support.user.login.Github": "GitHub 登录",
"support.user.login.Google": "Google 登录",
"support.user.login.Microsoft": "微软登录",
"support.user.login.Password": "密码",
"support.user.login.Password login": "密码登录",
"support.user.login.Phone": "手机号登录",
@@ -1214,5 +1213,7 @@
"user.type": "类型",
"verification": "验证",
"xx_search_result": "{{key}} 的搜索结果",
"yes": "是"
"yes": "是",
"yesterday": "昨天",
"yesterday_detail_time": "昨天 {{time}}"
}

pnpm-lock.yaml generated
View File

@@ -22,7 +22,7 @@ importers:
version: 13.3.0
next-i18next:
specifier: 15.3.0
version: 15.3.0(i18next@23.11.5)(next@14.2.5(@babel/core@7.24.9)(babel-plugin-macros@3.1.0)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8))(react-i18next@14.1.2(i18next@23.11.5)(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(react@18.3.1)
version: 15.3.0(i18next@23.11.5)(next@14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8))(react-i18next@14.1.2(i18next@23.11.5)(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(react@18.3.1)
prettier:
specifier: 3.2.4
version: 3.2.4
@@ -61,7 +61,7 @@ importers:
version: 4.0.2
next:
specifier: 14.2.5
version: 14.2.5(@babel/core@7.24.9)(babel-plugin-macros@3.1.0)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8)
version: 14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8)
openai:
specifier: 4.61.0
version: 4.61.0(encoding@0.1.13)
@@ -201,7 +201,7 @@ importers:
version: 1.4.5-lts.1
next:
specifier: 14.2.5
version: 14.2.5(@babel/core@7.24.9)(babel-plugin-macros@3.1.0)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8)
version: 14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8)
nextjs-cors:
specifier: ^2.2.0
version: 2.2.0(next@14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8))
@@ -277,7 +277,7 @@ importers:
version: 2.1.1(@chakra-ui/system@2.6.1(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@emotion/styled@11.11.0(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(react@18.3.1))(react@18.3.1))(react@18.3.1)
'@chakra-ui/next-js':
specifier: 2.1.5
version: 2.1.5(@chakra-ui/react@2.8.1(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@emotion/styled@11.11.0(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(framer-motion@9.1.7(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(next@14.2.5(@babel/core@7.24.9)(babel-plugin-macros@3.1.0)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8))(react@18.3.1)
version: 2.1.5(@chakra-ui/react@2.8.1(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@emotion/styled@11.11.0(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(framer-motion@9.1.7(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(next@14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8))(react@18.3.1)
'@chakra-ui/react':
specifier: 2.8.1
version: 2.8.1(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@emotion/styled@11.11.0(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(framer-motion@9.1.7(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
@@ -340,7 +340,7 @@ importers:
version: 4.17.21
next-i18next:
specifier: 15.3.0
version: 15.3.0(i18next@23.11.5)(next@14.2.5(@babel/core@7.24.9)(babel-plugin-macros@3.1.0)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8))(react-i18next@14.1.2(i18next@23.11.5)(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(react@18.3.1)
version: 15.3.0(i18next@23.11.5)(next@14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8))(react-i18next@14.1.2(i18next@23.11.5)(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(react@18.3.1)
papaparse:
specifier: ^5.4.1
version: 5.4.1
@@ -486,6 +486,9 @@ importers:
json5:
specifier: ^2.2.3
version: 2.2.3
jsondiffpatch:
specifier: ^0.6.0
version: 0.6.0
jsonwebtoken:
specifier: ^9.0.2
version: 9.0.2
@@ -700,7 +703,7 @@ importers:
version: 6.3.4
ts-jest:
specifier: ^29.1.0
version: 29.2.2(@babel/core@7.24.9)(@jest/transform@29.7.0)(@jest/types@29.6.3)(babel-jest@29.7.0(@babel/core@7.24.9))(jest@29.7.0(@types/node@20.14.11)(babel-plugin-macros@3.1.0))(typescript@5.5.3)
version: 29.2.2(@babel/core@7.24.9)(@jest/transform@29.7.0)(@jest/types@29.6.3)(babel-jest@29.7.0(@babel/core@7.24.9))(jest@29.7.0(@types/node@20.14.11)(babel-plugin-macros@3.1.0)(ts-node@10.9.2(@types/node@20.14.11)(typescript@5.5.3)))(typescript@5.5.3)
ts-loader:
specifier: ^9.4.3
version: 9.5.1(typescript@5.5.3)(webpack@5.92.1)
@@ -3177,8 +3180,8 @@ packages:
'@tanstack/react-query@4.36.1':
resolution: {integrity: sha512-y7ySVHFyyQblPl3J3eQBWpXZkliroki3ARnBKsdJchlgt7yJLRDUcf4B8soufgiYt3pEQIkBWBx1N9/ZPIeUWw==}
peerDependencies:
react: ^16.8.0 || ^17.0.0 || ^18.0.0
react-dom: ^16.8.0 || ^17.0.0 || ^18.0.0
react: 18.3.1
react-dom: 18.3.1
react-native: '*'
peerDependenciesMeta:
react-dom:
@@ -3331,6 +3334,9 @@ packages:
'@types/decompress@4.2.7':
resolution: {integrity: sha512-9z+8yjKr5Wn73Pt17/ldnmQToaFHZxK0N1GHysuk/JIPT8RIdQeoInM01wWPgypRcvb6VH1drjuFpQ4zmY437g==}
'@types/diff-match-patch@1.0.36':
resolution: {integrity: sha512-xFdR6tkm0MWvBfO8xXCSsinYxHcqkQUlcHeSpMC2ukzOb6lwQAfDmW+Qt0AvlGd8HpsS28qKsB+oPeJn9I39jg==}
'@types/eslint-scope@3.7.7':
resolution: {integrity: sha512-MzMFlSLBqNF2gcHWO0G1vP/YQyfvrxZ0bF+u7mzUdZ1/xK4A4sru+nraZz5i3iEIk1l1uyicaDVTB4QbbEkAYg==}
@@ -4848,6 +4854,9 @@ packages:
dezalgo@1.0.4:
resolution: {integrity: sha512-rXSP0bf+5n0Qonsb+SVVfNfIsimO4HEtmnIpPHY8Q1UCzKlQrDMfdobr8nJOOsRgWCyMRqeSBQzmWUMq7zvVig==}
diff-match-patch@1.0.5:
resolution: {integrity: sha512-IayShXAgj/QMXgB0IWmKx+rOPuGMhqm5w6jvFxmVenXKIzRqTAAsbBPT3kWQeGANj3jGgvcvv4yK6SxqYmikgw==}
diff-sequences@29.6.3:
resolution: {integrity: sha512-EjePK1srD3P08o2j4f0ExnylqRs5B9tJjcp9t1krH2qRi8CCdsYfwe9JgSLurFBWwq4uOlipzfk5fHNvwFKr8Q==}
engines: {node: ^14.15.0 || ^16.10.0 || >=18.0.0}
@@ -5139,6 +5148,7 @@ packages:
eslint@8.56.0:
resolution: {integrity: sha512-Go19xM6T9puCOWntie1/P997aXxFsOi37JIHRWI514Hc6ZnaHGKY9xFhrU65RT6CcBEzZoGG1e6Nq+DT04ZtZQ==}
engines: {node: ^12.22.0 || ^14.17.0 || >=16.0.0}
deprecated: This version is no longer supported. Please see https://eslint.org/version-support for other options.
hasBin: true
espree@9.6.1:
@@ -6308,6 +6318,11 @@ packages:
jsonc-parser@3.3.1:
resolution: {integrity: sha512-HUgH65KyejrUFPvHFPbqOY0rsFip3Bo5wb4ngvdi1EpCYWUQDC5V+Y7mZws+DLkr4M//zQJoanu1SP+87Dv1oQ==}
jsondiffpatch@0.6.0:
resolution: {integrity: sha512-3QItJOXp2AP1uv7waBkao5nCvhEv+QmJAd38Ybq7wNI74Q+BBmnLn4EDKz6yI9xGAIQoUF87qHt+kc1IVxB4zQ==}
engines: {node: ^18.0.0 || >=20.0.0}
hasBin: true
jsonfile@6.1.0:
resolution: {integrity: sha512-5dgndWOriYSm5cnYaJNhalLNDKOqFwyDB/rr1E9ZsGciGvKPs8R2xYGCacuf3z6K1YKDz182fd+fY3cn3pMqXQ==}
@@ -10462,6 +10477,14 @@ snapshots:
next: 14.2.5(@babel/core@7.24.9)(babel-plugin-macros@3.1.0)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8)
react: 18.3.1
'@chakra-ui/next-js@2.1.5(@chakra-ui/react@2.8.1(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@emotion/styled@11.11.0(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(framer-motion@9.1.7(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(next@14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8))(react@18.3.1)':
dependencies:
'@chakra-ui/react': 2.8.1(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@emotion/styled@11.11.0(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(framer-motion@9.1.7(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
'@emotion/cache': 11.11.0
'@emotion/react': 11.11.1(@types/react@18.3.1)(react@18.3.1)
next: 14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8)
react: 18.3.1
'@chakra-ui/number-input@2.1.1(@chakra-ui/system@2.6.1(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@emotion/styled@11.11.0(@emotion/react@11.11.1(@types/react@18.3.1)(react@18.3.1))(@types/react@18.3.1)(react@18.3.1))(react@18.3.1))(react@18.3.1)':
dependencies:
'@chakra-ui/counter': 2.1.0(react@18.3.1)
@@ -12393,6 +12416,8 @@ snapshots:
dependencies:
'@types/node': 22.7.8
'@types/diff-match-patch@1.0.36': {}
'@types/eslint-scope@3.7.7':
dependencies:
'@types/eslint': 8.56.10
@@ -13203,7 +13228,7 @@ snapshots:
axios@1.7.7:
dependencies:
follow-redirects: 1.15.9(debug@4.3.7)
follow-redirects: 1.15.9
form-data: 4.0.1
proxy-from-env: 1.1.0
transitivePeerDependencies:
@@ -14182,6 +14207,8 @@ snapshots:
asap: 2.0.6
wrappy: 1.0.2
diff-match-patch@1.0.5: {}
diff-sequences@29.6.3: {}
diff@4.0.2: {}
@@ -14982,6 +15009,8 @@ snapshots:
follow-redirects@1.15.6: {}
follow-redirects@1.15.9: {}
follow-redirects@1.15.9(debug@4.3.4):
optionalDependencies:
debug: 4.3.4
@@ -15044,7 +15073,7 @@ snapshots:
dependencies:
react: 18.3.1
react-dom: 18.3.1(react@18.3.1)
tslib: 2.7.0
tslib: 2.8.0
optionalDependencies:
'@emotion/is-prop-valid': 0.8.8
@@ -16141,6 +16170,12 @@ snapshots:
jsonc-parser@3.3.1: {}
jsondiffpatch@0.6.0:
dependencies:
'@types/diff-match-patch': 1.0.36
chalk: 5.3.0
diff-match-patch: 1.0.5
jsonfile@6.1.0:
dependencies:
universalify: 2.0.1
@@ -17323,6 +17358,18 @@ snapshots:
react: 18.3.1
react-i18next: 14.1.2(i18next@23.11.5)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
next-i18next@15.3.0(i18next@23.11.5)(next@14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8))(react-i18next@14.1.2(i18next@23.11.5)(react-dom@18.3.1(react@18.3.1))(react@18.3.1))(react@18.3.1):
dependencies:
'@babel/runtime': 7.24.8
'@types/hoist-non-react-statics': 3.3.5
core-js: 3.37.1
hoist-non-react-statics: 3.3.2
i18next: 23.11.5
i18next-fs-backend: 2.3.1
next: 14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8)
react: 18.3.1
react-i18next: 14.1.2(i18next@23.11.5)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
next@14.2.5(@babel/core@7.24.9)(babel-plugin-macros@3.1.0)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8):
dependencies:
'@next/env': 14.2.5
@@ -17349,10 +17396,36 @@ snapshots:
- '@babel/core'
- babel-plugin-macros
next@14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8):
dependencies:
'@next/env': 14.2.5
'@swc/helpers': 0.5.5
busboy: 1.6.0
caniuse-lite: 1.0.30001669
graceful-fs: 4.2.11
postcss: 8.4.31
react: 18.3.1
react-dom: 18.3.1(react@18.3.1)
styled-jsx: 5.1.1(react@18.3.1)
optionalDependencies:
'@next/swc-darwin-arm64': 14.2.5
'@next/swc-darwin-x64': 14.2.5
'@next/swc-linux-arm64-gnu': 14.2.5
'@next/swc-linux-arm64-musl': 14.2.5
'@next/swc-linux-x64-gnu': 14.2.5
'@next/swc-linux-x64-musl': 14.2.5
'@next/swc-win32-arm64-msvc': 14.2.5
'@next/swc-win32-ia32-msvc': 14.2.5
'@next/swc-win32-x64-msvc': 14.2.5
sass: 1.77.8
transitivePeerDependencies:
- '@babel/core'
- babel-plugin-macros
nextjs-cors@2.2.0(next@14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8)):
dependencies:
cors: 2.8.5
next: 14.2.5(@babel/core@7.24.9)(babel-plugin-macros@3.1.0)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8)
next: 14.2.5(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(sass@1.77.8)
nextjs-node-loader@1.1.5(webpack@5.92.1):
dependencies:
@@ -18410,7 +18483,7 @@ snapshots:
dependencies:
chokidar: 3.6.0
immutable: 4.3.6
source-map-js: 1.2.0
source-map-js: 1.2.1
sax@1.4.1: {}
@@ -18755,6 +18828,11 @@ snapshots:
'@babel/core': 7.24.9
babel-plugin-macros: 3.1.0
styled-jsx@5.1.1(react@18.3.1):
dependencies:
client-only: 0.0.1
react: 18.3.1
stylis@4.2.0: {}
stylis@4.3.2: {}
@@ -18956,6 +19034,25 @@ snapshots:
ts-dedent@2.2.0: {}
ts-jest@29.2.2(@babel/core@7.24.9)(@jest/transform@29.7.0)(@jest/types@29.6.3)(babel-jest@29.7.0(@babel/core@7.24.9))(jest@29.7.0(@types/node@20.14.11)(babel-plugin-macros@3.1.0)(ts-node@10.9.2(@types/node@20.14.11)(typescript@5.5.3)))(typescript@5.5.3):
dependencies:
bs-logger: 0.2.6
ejs: 3.1.10
fast-json-stable-stringify: 2.1.0
jest: 29.7.0(@types/node@20.14.11)(babel-plugin-macros@3.1.0)(ts-node@10.9.2(@types/node@20.14.11)(typescript@5.5.3))
jest-util: 29.7.0
json5: 2.2.3
lodash.memoize: 4.1.2
make-error: 1.3.6
semver: 7.6.3
typescript: 5.5.3
yargs-parser: 21.1.1
optionalDependencies:
'@babel/core': 7.24.9
'@jest/transform': 29.7.0
'@jest/types': 29.6.3
babel-jest: 29.7.0(@babel/core@7.24.9)
ts-jest@29.2.2(@babel/core@7.24.9)(@jest/transform@29.7.0)(@jest/types@29.6.3)(babel-jest@29.7.0(@babel/core@7.24.9))(jest@29.7.0(@types/node@20.14.11)(babel-plugin-macros@3.1.0))(typescript@5.5.3):
dependencies:
bs-logger: 0.2.6


@@ -33,7 +33,7 @@ MILVUS_TOKEN=133964348b00b4b4e4b51bef680a61350950385c8c64a3ec16b1ab92d3c67dcc4e0
SANDBOX_URL=http://localhost:3001
# Commercial (Pro) edition address
PRO_URL=
# The page address, used as the domain when completing relative-path resources
# The page address, used as the domain when completing relative-path resources; note: no trailing /
FE_DOMAIN=http://localhost:3000
# Sub-route (base path); must be fixed at build time
# NEXT_PUBLIC_BASE_URL=/fastai


@@ -42,6 +42,7 @@
"jest": "^29.5.0",
"js-yaml": "^4.1.0",
"json5": "^2.2.3",
"jsondiffpatch": "^0.6.0",
"jsonwebtoken": "^9.0.2",
"lodash": "^4.17.21",
"mermaid": "^10.2.3",
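The `jsondiffpatch` dependency added above backs the snapshot-store refactor (commit 9b2c3b242a): instead of keeping a full app config per history entry, only a diff against a base state is stored. A minimal hand-rolled sketch of the idea (this is NOT the jsondiffpatch API the project actually uses):

```typescript
// Minimal sketch of diff-based snapshot storage: compute a shallow
// diff between two plain objects, then apply it to reconstruct the
// target. Concept illustration only; not the jsondiffpatch API.
type Obj = Record<string, unknown>;

function shallowDiff(base: Obj, next: Obj): Obj {
  const diff: Obj = {};
  for (const key of Object.keys({ ...base, ...next })) {
    if (JSON.stringify(base[key]) !== JSON.stringify(next[key])) {
      diff[key] = next[key]; // a key set to undefined marks a deletion
    }
  }
  return diff;
}

function applyDiff(base: Obj, diff: Obj): Obj {
  const result: Obj = { ...base };
  for (const key of Object.keys(diff)) {
    if (diff[key] === undefined) delete result[key];
    else result[key] = diff[key];
  }
  return result;
}
```

Storing one full `state` plus a small `diff` per later entry is the layout the snapshot refactor introduces via jsondiffpatch.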


@@ -206,12 +206,7 @@ const DatasetParamsModal = ({
</Box>
</Box>
<Box position={'relative'} w={'18px'} h={'18px'}>
<Checkbox
colorScheme="primary"
isChecked={getValues('usingReRank')}
size="lg"
icon={<MyIcon name={'common/check'} w={'12px'} />}
/>
<Checkbox colorScheme="primary" isChecked={getValues('usingReRank')} size="lg" />
<Box position={'absolute'} top={0} right={0} bottom={0} left={0} zIndex={1}></Box>
</Box>
</Flex>


@@ -30,6 +30,7 @@ const ResponseTags = ({
const { t } = useTranslation();
const quoteListRef = React.useRef<HTMLDivElement>(null);
const dataId = historyItem.dataId;
const {
totalQuoteList: quoteList = [],
llmModuleAccount = 0,


@@ -323,7 +323,7 @@ const ChatBox = (
})
};
} else if (event === SseResponseEventEnum.updateVariables && variables) {
variablesForm.reset(variables);
variablesForm.setValue('variables', variables);
} else if (event === SseResponseEventEnum.interactive) {
const val: AIChatItemValueItemType = {
type: ChatItemValueTypeEnum.interactive,
@@ -408,7 +408,7 @@ const ChatBox = (
isInteractivePrompt = false
}) => {
variablesForm.handleSubmit(
async ({ variables }) => {
async ({ variables = {} }) => {
if (!onStartChat) return;
if (isChatting) {
toast({
@@ -435,7 +435,7 @@ const ChatBox = (
// Only declared variables are kept
const requestVariables: Record<string, any> = {};
allVariableList?.forEach((item) => {
requestVariables[item.key] = variables[item.key] || '';
requestVariables[item.key] = variables[item.key];
});
const responseChatId = getNanoid(24);
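The hunk above drops the `|| ''` fallback when collecting request variables: with the fallback, legitimate falsy values (0, false, '') were silently replaced by an empty string. The declared-variables filter can be sketched as follows (names are illustrative, not the exact FastGPT source):

```typescript
// Keep only the variables declared in the app config, passing values
// through untouched so falsy values (0, false, '') survive. Sketch
// only; names are illustrative.
type VariableDef = { key: string };

function pickDeclaredVariables(
  declared: VariableDef[],
  variables: Record<string, unknown>
): Record<string, unknown> {
  const requestVariables: Record<string, unknown> = {};
  declared.forEach((item) => {
    requestVariables[item.key] = variables[item.key];
  });
  return requestVariables;
}
```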


@@ -117,11 +117,12 @@ function AddMemberModal({ onClose, mode = 'member' }: AddModalPropsType) {
<Flex flexDirection="column" mt="2" overflow={'auto'} maxH="400px">
{filterGroups.map((group) => {
const onChange = () => {
if (selectedGroupIdList.includes(group._id)) {
setSelectedGroupIdList(selectedGroupIdList.filter((v) => v !== group._id));
} else {
setSelectedGroupIdList([...selectedGroupIdList, group._id]);
}
setSelectedGroupIdList((state) => {
if (state.includes(group._id)) {
return state.filter((v) => v !== group._id);
}
return [...state, group._id];
});
};
const collaborator = collaboratorList.find((v) => v.groupId === group._id);
return (
@@ -141,10 +142,7 @@ function AddMemberModal({ onClose, mode = 'member' }: AddModalPropsType) {
}}
onClick={onChange}
>
<Checkbox
isChecked={selectedGroupIdList.includes(group._id)}
icon={<MyIcon name={'common/check'} w={'12px'} />}
/>
<Checkbox isChecked={selectedGroupIdList.includes(group._id)} />
<MyAvatar src={group.avatar} w="1.5rem" borderRadius={'50%'} />
<Box ml="2" w="full">
{group.name === DefaultGroupName ? userInfo?.team.teamName : group.name}
@@ -157,11 +155,12 @@ function AddMemberModal({ onClose, mode = 'member' }: AddModalPropsType) {
})}
{filterMembers.map((member) => {
const onChange = () => {
if (selectedMemberIdList.includes(member.tmbId)) {
setSelectedMembers(selectedMemberIdList.filter((v) => v !== member.tmbId));
} else {
setSelectedMembers([...selectedMemberIdList, member.tmbId]);
}
setSelectedMembers((state) => {
if (state.includes(member.tmbId)) {
return state.filter((v) => v !== member.tmbId);
}
return [...state, member.tmbId];
});
};
const collaborator = collaboratorList.find((v) => v.tmbId === member.tmbId);
return (
@@ -205,11 +204,12 @@ function AddMemberModal({ onClose, mode = 'member' }: AddModalPropsType) {
<Flex flexDirection="column" mt="2" overflow={'auto'} maxH="400px">
{selectedGroupIdList.map((groupId) => {
const onChange = () => {
if (selectedGroupIdList.includes(groupId)) {
setSelectedGroupIdList(selectedGroupIdList.filter((v) => v !== groupId));
} else {
setSelectedGroupIdList([...selectedGroupIdList, groupId]);
}
setSelectedGroupIdList((state) => {
if (state.includes(groupId)) {
return state.filter((v) => v !== groupId);
}
return [...state, groupId];
});
};
const group = groups.find((v) => String(v._id) === groupId);
return (
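The three toggle handlers above are rewritten from reading list state directly (which can be stale inside a closure) to React's functional-updater form. The toggle itself is a pure function and can be sketched as:

```typescript
// Pure toggle for use with React's functional updater form:
// setSelectedIds(toggle(id)) operates on the freshest state instead
// of a possibly stale closure value. Handler names are illustrative.
function toggle(id: string) {
  return (state: string[]): string[] =>
    state.includes(id) ? state.filter((v) => v !== id) : [...state, id];
}
```

Used as `setSelectedGroupIdList(toggle(group._id))`, assuming a `useState<string[]>` setter.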


@@ -1,12 +1,12 @@
import React, { useCallback } from 'react';
import MyModal from '@fastgpt/web/components/common/MyModal';
import { useUserStore } from '@/web/support/user/useUserStore';
import { useQuery } from '@tanstack/react-query';
import { Button, ModalBody, ModalFooter, useDisclosure } from '@chakra-ui/react';
import { useTranslation } from 'next-i18next';
import { LOGO_ICON } from '@fastgpt/global/common/system/constants';
import { getSystemMsgModalData } from '@/web/support/user/inform/api';
import dynamic from 'next/dynamic';
import { useRequest2 } from '@fastgpt/web/hooks/useRequest';
const Markdown = dynamic(() => import('@/components/Markdown'), { ssr: false });
const SystemMsgModal = ({}: {}) => {
@@ -15,7 +15,9 @@ const SystemMsgModal = ({}: {}) => {
const { isOpen, onOpen, onClose } = useDisclosure();
const { data } = useQuery(['initSystemMsgModal', systemMsgReadId], getSystemMsgModalData, {
const { data } = useRequest2(getSystemMsgModalData, {
refreshDeps: [systemMsgReadId],
manual: false,
onSuccess(res) {
if (res?.content && (!systemMsgReadId || res.id !== systemMsgReadId)) {
onOpen();


@@ -6,11 +6,15 @@ import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
const isLLMNode = (item: ChatHistoryItemResType) =>
item.moduleType === FlowNodeTypeEnum.chatNode || item.moduleType === FlowNodeTypeEnum.tools;
export function transformPreviewHistories(histories: ChatItemType[]): ChatItemType[] {
export function transformPreviewHistories(
histories: ChatItemType[],
responseDetail: boolean
): ChatItemType[] {
return histories.map((item) => {
return {
...addStatisticalDataToHistoryItem(item),
responseData: undefined
responseData: undefined,
...(responseDetail ? {} : { totalQuoteList: undefined })
};
});
}
@@ -18,6 +22,7 @@ export function transformPreviewHistories(histories: ChatItemType[]): ChatItemTy
export function addStatisticalDataToHistoryItem(historyItem: ChatItemType) {
if (historyItem.obj !== ChatRoleEnum.AI) return historyItem;
if (historyItem.totalQuoteList !== undefined) return historyItem;
if (!historyItem.responseData) return historyItem;
// Flat children
const flatResData: ChatHistoryItemResType[] =


@@ -45,6 +45,7 @@ import { TeamMemberRoleEnum } from '@fastgpt/global/support/user/team/constant';
import QuestionTip from '@fastgpt/web/components/common/MyTooltip/QuestionTip';
import { useSystem } from '@fastgpt/web/hooks/useSystem';
import MyImage from '@fastgpt/web/components/common/Image/MyImage';
import { getWebReqUrl } from '@fastgpt/web/common/system/utils';
const StandDetailModal = dynamic(() => import('./standardDetailModal'));
const TeamMenu = dynamic(() => import('@/components/support/user/team/TeamMenu'));
@@ -494,7 +495,7 @@ const PlanUsage = () => {
</Box>
</Flex>
<Link
href={EXTRA_PLAN_CARD_ROUTE}
href={getWebReqUrl(EXTRA_PLAN_CARD_ROUTE)}
transform={'translateX(15px)'}
display={'flex'}
alignItems={'center'}

View File

@@ -51,6 +51,7 @@ async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
path: file.path,
filename: file.originalname,
contentType: file.mimetype,
encoding: file.encoding,
metadata: metadata
});


@@ -82,11 +82,16 @@ async function handler(
limit: pageSize
});
const responseDetail = !shareChat || shareChat.responseDetail;
// Remove important information
if (shareChat && app.type !== AppTypeEnum.plugin) {
histories.forEach((item) => {
if (item.obj === ChatRoleEnum.AI) {
item.responseData = filterPublicNodeResponseData({ flowResponses: item.responseData });
item.responseData = filterPublicNodeResponseData({
flowResponses: item.responseData,
responseDetail
});
if (shareChat.showNodeStatus === false) {
item.value = item.value.filter((v) => v.type !== ChatItemValueTypeEnum.tool);
@@ -96,7 +101,7 @@ async function handler(
}
return {
list: isPlugin ? histories : transformPreviewHistories(histories),
list: isPlugin ? histories : transformPreviewHistories(histories, responseDetail),
total
};
}


@@ -11,6 +11,7 @@ import { ChatHistoryItemResType } from '@fastgpt/global/core/chat/type';
import { OutLinkChatAuthProps } from '@fastgpt/global/support/permission/chat';
import { authApp } from '@fastgpt/service/support/permission/app/auth';
import { filterPublicNodeResponseData } from '@fastgpt/global/core/chat/utils';
import { MongoOutLink } from '@fastgpt/service/support/outLink/schema';
export type getResDataQuery = OutLinkChatAuthProps & {
chatId?: string;
@@ -26,44 +27,57 @@ async function handler(
req: ApiRequestProps<getResDataBody, getResDataQuery>,
res: ApiResponseType<any>
): Promise<getResDataResponse> {
const { appId, chatId, dataId } = req.query;
const { appId, chatId, dataId, shareId } = req.query;
if (!appId || !chatId || !dataId) {
return {};
}
// 1. Un login api: share chat, team chat
// 2. Login api: account chat, chat log
try {
await authChatCrud({
req,
authToken: true,
authApiKey: true,
...req.query,
per: ReadPermissionVal
});
} catch (error) {
await authApp({
req,
authToken: true,
authApiKey: true,
appId,
per: ManagePermissionVal
});
const authData = await (() => {
try {
return authChatCrud({
req,
authToken: true,
authApiKey: true,
...req.query,
per: ReadPermissionVal
});
} catch (error) {
return authApp({
req,
authToken: true,
authApiKey: true,
appId,
per: ManagePermissionVal
});
}
})();
const [chatData] = await Promise.all([
MongoChatItem.findOne(
{
appId,
chatId,
dataId
},
'obj responseData'
).lean(),
shareId ? MongoOutLink.findOne({ shareId }).lean() : Promise.resolve(null)
]);
if (chatData?.obj !== ChatRoleEnum.AI) {
return {};
}
const chatData = await MongoChatItem.findOne(
{
appId,
chatId,
dataId
},
'obj responseData'
).lean();
if (chatData?.obj === ChatRoleEnum.AI) {
const data = chatData.responseData || {};
return req.query.shareId ? filterPublicNodeResponseData(data) : data;
} else return {};
const flowResponses = chatData.responseData ?? {};
return req.query.shareId
? filterPublicNodeResponseData({
// @ts-ignore
responseDetail: authData.responseDetail,
flowResponses: chatData.responseData
})
: flowResponses;
}
export default NextAPI(handler);


@@ -67,6 +67,7 @@ async function handler(req: NextApiRequest, res: NextApiResponse<any>): CreateCo
const { rawText } = await readRawTextByLocalFile({
teamId,
path: file.path,
encoding: file.encoding,
metadata: {
...fileMetadata,
relatedId: relatedImgId
@@ -81,6 +82,7 @@ async function handler(req: NextApiRequest, res: NextApiResponse<any>): CreateCo
path: file.path,
filename: file.originalname,
contentType: file.mimetype,
encoding: file.encoding,
metadata: fileMetadata
});


@@ -9,7 +9,7 @@ import {
} from '@fastgpt/global/support/permission/constant';
import { CommonErrEnum } from '@fastgpt/global/common/error/code/common';
import type { ApiRequestProps, ApiResponseType } from '@fastgpt/service/type/next';
import { DatasetTypeEnum } from '@fastgpt/global/core/dataset/constants';
import { DatasetTypeEnum, TrainingModeEnum } from '@fastgpt/global/core/dataset/constants';
import { ClientSession } from 'mongoose';
import { parseParentIdInMongo } from '@fastgpt/global/common/parentFolder/utils';
import { mongoSessionRun } from '@fastgpt/service/common/mongo/sessionRun';
@@ -21,6 +21,7 @@ import {
import { authUserPer } from '@fastgpt/service/support/permission/user/auth';
import { TeamWritePermissionVal } from '@fastgpt/global/support/permission/user/constant';
import { DatasetErrEnum } from '@fastgpt/global/common/error/code/dataset';
import { MongoDatasetTraining } from '@fastgpt/service/core/dataset/training/schema';
export type DatasetUpdateQuery = {};
export type DatasetUpdateResponse = any;
@@ -84,6 +85,12 @@ async function handler(
const isFolder = dataset.type === DatasetTypeEnum.folder;
updateTraining({
teamId: dataset.teamId,
datasetId: id,
agentModel: agentModel?.model
});
const onUpdate = async (session?: ClientSession) => {
await MongoDataset.findByIdAndUpdate(
id,
@@ -137,3 +144,29 @@ async function handler(
}
}
export default NextAPI(handler);
async function updateTraining({
teamId,
datasetId,
agentModel
}: {
teamId: string;
datasetId: string;
agentModel?: string;
}) {
if (!agentModel) return;
await MongoDatasetTraining.updateMany(
{
teamId,
datasetId,
mode: { $in: [TrainingModeEnum.qa, TrainingModeEnum.auto] }
},
{
$set: {
model: agentModel,
retryCount: 5
}
}
);
}
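The `updateTraining` helper above retargets queued qa/auto training tasks to the new agent model and resets their retry budget to 5 (commit a9db5b57c5, "add training retry time"). The update it issues can be sketched as plain data; the enum string values here are assumptions:

```typescript
// Shape of the updateMany call issued when the dataset's agent model
// changes: retarget queued qa/auto training tasks to the new model
// and reset retryCount. Sketch only; enum values are assumptions.
function buildTrainingUpdate(teamId: string, datasetId: string, agentModel: string) {
  return {
    filter: { teamId, datasetId, mode: { $in: ['qa', 'auto'] } },
    update: { $set: { model: agentModel, retryCount: 5 } }
  };
}
```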


@@ -363,7 +363,7 @@ async function handler(req: NextApiRequest, res: NextApiResponse) {
/* select fe response field */
const feResponseData = canWrite
? flowResponses
: filterPublicNodeResponseData({ flowResponses });
: filterPublicNodeResponseData({ flowResponses, responseDetail });
if (stream) {
workflowResponseWrite({
@@ -380,12 +380,10 @@ async function handler(req: NextApiRequest, res: NextApiResponse) {
});
if (detail) {
if (responseDetail || isPlugin) {
workflowResponseWrite({
event: SseResponseEventEnum.flowResponses,
data: feResponseData
});
}
workflowResponseWrite({
event: SseResponseEventEnum.flowResponses,
data: feResponseData
});
}
res.end();


@@ -34,6 +34,7 @@ import {
WorkflowInitContext
} from '../WorkflowComponents/context/workflowInitContext';
import { WorkflowEventContext } from '../WorkflowComponents/context/workflowEventContext';
import { getAppConfigByDiff } from '@/web/core/app/diff';
const Header = () => {
const { t } = useTranslation();
@@ -51,16 +52,19 @@ const Header = () => {
const nodes = useContextSelector(WorkflowInitContext, (v) => v.nodes);
const edges = useContextSelector(WorkflowNodeEdgeContext, (v) => v.edges);
const {
flowData2StoreData,
flowData2StoreDataAndCheck,
setWorkflowTestData,
past,
future,
setPast,
onSwitchTmpVersion,
onSwitchCloudVersion
} = useContextSelector(WorkflowContext, (v) => v);
const flowData2StoreData = useContextSelector(WorkflowContext, (v) => v.flowData2StoreData);
const flowData2StoreDataAndCheck = useContextSelector(
WorkflowContext,
(v) => v.flowData2StoreDataAndCheck
);
const setWorkflowTestData = useContextSelector(WorkflowContext, (v) => v.setWorkflowTestData);
const past = useContextSelector(WorkflowContext, (v) => v.past);
const future = useContextSelector(WorkflowContext, (v) => v.future);
const setPast = useContextSelector(WorkflowContext, (v) => v.setPast);
const onSwitchTmpVersion = useContextSelector(WorkflowContext, (v) => v.onSwitchTmpVersion);
const onSwitchCloudVersion = useContextSelector(WorkflowContext, (v) => v.onSwitchCloudVersion);
const showHistoryModal = useContextSelector(WorkflowEventContext, (v) => v.showHistoryModal);
const setShowHistoryModal = useContextSelector(
WorkflowEventContext,
@@ -76,15 +80,20 @@ const Header = () => {
[...future].reverse().find((snapshot) => snapshot.isSaved) ||
past.find((snapshot) => snapshot.isSaved);
const initialState = past[past.length - 1]?.state;
const savedSnapshotState = getAppConfigByDiff(initialState, savedSnapshot?.diff);
const val = compareSnapshot(
// nodes of the saved snapshot
{
nodes: savedSnapshot?.nodes,
edges: savedSnapshot?.edges,
chatConfig: savedSnapshot?.chatConfig
nodes: savedSnapshotState?.nodes,
edges: savedSnapshotState?.edges,
chatConfig: savedSnapshotState?.chatConfig
},
// nodes of the current canvas
{
nodes: nodes,
edges: edges,
nodes,
edges,
chatConfig: appDetail.chatConfig
}
);
@@ -132,8 +141,6 @@ const Header = () => {
const onBack = useCallback(async () => {
try {
localStorage.removeItem(`${appDetail._id}-past`);
localStorage.removeItem(`${appDetail._id}-future`);
router.push({
pathname: '/app/list',
query: {
@@ -142,7 +149,7 @@ const Header = () => {
}
});
} catch (error) {}
}, [appDetail._id, appDetail.parentId, lastAppListRouteType, router]);
}, [appDetail.parentId, lastAppListRouteType, router]);
const Render = useMemo(() => {
return (


@@ -17,16 +17,44 @@ import styles from './styles.module.scss';
import { useSystem } from '@fastgpt/web/hooks/useSystem';
import { useTranslation } from 'next-i18next';
import { onSaveSnapshotFnType, SimpleAppSnapshotType } from './useSnapshots';
import { getAppConfigByDiff, getAppDiffConfig } from '@/web/core/app/diff';
import { formatTime2YMDHMS } from '@fastgpt/global/common/string/time';
const convertOldFormatHistory = (past: SimpleAppSnapshotType[]) => {
const baseState = past[past.length - 1].appForm;
return past.map((item, index) => {
if (index === past.length - 1) {
return {
title: item.title,
isSaved: item.isSaved,
state: baseState
};
}
const currentState = item.appForm;
const diff = getAppDiffConfig(baseState, currentState);
return {
title: item.title || formatTime2YMDHMS(new Date()),
isSaved: item.isSaved,
diff
};
});
};
const Edit = ({
appForm,
setAppForm,
past,
setPast,
saveSnapshot
}: {
appForm: AppSimpleEditFormType;
setAppForm: React.Dispatch<React.SetStateAction<AppSimpleEditFormType>>;
past: SimpleAppSnapshotType[];
setPast: (value: React.SetStateAction<SimpleAppSnapshotType[]>) => void;
saveSnapshot: onSaveSnapshotFnType;
}) => {
const { isPc } = useSystem();
@@ -39,9 +67,26 @@ const Edit = ({
// show selected dataset
loadAllDatasets();
if (appDetail.version !== 'v2') {
return setAppForm(
appWorkflow2Form({
nodes: v1Workflow2V2((appDetail.modules || []) as any)?.nodes,
chatConfig: appDetail.chatConfig
})
);
}
// Get the latest snapshot
if (past?.[0]?.appForm) {
return setAppForm(past[0].appForm);
if (past?.[0]?.diff) {
const pastState = getAppConfigByDiff(past[past.length - 1].state, past[0].diff);
return setAppForm(pastState);
} else if (past && past.length > 0 && past?.every((item) => item.appForm)) {
// Convert old full-snapshot format to diff format
const newPast = convertOldFormatHistory(past);
setPast(newPast);
return setAppForm(getAppConfigByDiff(newPast[newPast.length - 1].state, newPast[0].diff));
}
const appForm = appWorkflow2Form({
@@ -59,15 +104,6 @@ const Edit = ({
}
setAppForm(appForm);
if (appDetail.version !== 'v2') {
setAppForm(
appWorkflow2Form({
nodes: v1Workflow2V2((appDetail.modules || []) as any)?.nodes,
chatConfig: appDetail.chatConfig
})
);
}
});
return (
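The `convertOldFormatHistory` helper above migrates a history of full `appForm` snapshots into the new layout: the oldest (last) entry keeps the full state, and every other entry keeps only a diff against it. Reconstructing any entry then means applying its diff to that base, as `getAppConfigByDiff(past[past.length - 1].state, past[0].diff)` does. A minimal sketch, with a plain key overlay standing in for jsondiffpatch:

```typescript
// Reconstruct a snapshot in the new history layout: the oldest (last)
// entry stores the full state, every other entry only a diff. The
// real code applies the diff via jsondiffpatch (getAppConfigByDiff);
// a shallow key overlay stands in for it here.
type State = Record<string, unknown>;
type Snapshot = { title: string; isSaved?: boolean; state?: State; diff?: State };

function restoreSnapshot(past: Snapshot[], index: number): State {
  const base = past[past.length - 1].state ?? {};
  const diff = past[index].diff;
  return diff ? { ...base, ...diff } : base;
}
```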


@@ -104,7 +104,10 @@ const EditForm = ({
const formatVariables = useMemo(
() =>
formatEditorVariablePickerIcon([
...workflowSystemVariables,
...workflowSystemVariables.filter(
(variable) =>
!['appId', 'chatId', 'responseChatItemId', 'histories'].includes(variable.key)
),
...(appForm.chatConfig.variables || [])
]).map((item) => ({
...item,


@@ -1,4 +1,4 @@
import React, { useCallback, useState } from 'react';
import React, { useCallback, useMemo, useState } from 'react';
import { useContextSelector } from 'use-context-selector';
import { AppContext } from '../context';
import FolderPath from '@/components/common/folder/Path';
@@ -29,6 +29,7 @@ import {
} from './useSnapshots';
import PublishHistories from '../PublishHistoriesSlider';
import { AppVersionSchemaType } from '@fastgpt/global/core/app/version';
import { getAppConfigByDiff } from '@/web/core/app/diff';
const Header = ({
forbiddenSaveSnapshot,
@@ -48,7 +49,11 @@ const Header = ({
const { t } = useTranslation();
const { isPc } = useSystem();
const router = useRouter();
const { appId, onSaveApp, currentTab } = useContextSelector(AppContext, (v) => v);
const appId = useContextSelector(AppContext, (v) => v.appId);
const onSaveApp = useContextSelector(AppContext, (v) => v.onSaveApp);
const currentTab = useContextSelector(AppContext, (v) => v.currentTab);
const appLatestVersion = useContextSelector(AppContext, (v) => v.appLatestVersion);
const { lastAppListRouteType } = useSystemStore();
const { allDatasets } = useDatasetStore();
@@ -102,9 +107,19 @@ const Header = ({
const [isShowHistories, { setTrue: setIsShowHistories, setFalse: closeHistories }] =
useBoolean(false);
const initialAppForm = useMemo(
() =>
appWorkflow2Form({
nodes: appLatestVersion?.nodes || [],
chatConfig: appLatestVersion?.chatConfig || {}
}),
[appLatestVersion]
);
const onSwitchTmpVersion = useCallback(
(data: SimpleAppSnapshotType, customTitle: string) => {
setAppForm(data.appForm);
const pastState = getAppConfigByDiff(initialAppForm, data.diff);
setAppForm(pastState);
// Remove multiple "copy-"
const copyText = t('app:version_copy');
@@ -112,11 +127,11 @@ const Header = ({
const title = customTitle.replace(regex, `$1`);
return saveSnapshot({
appForm: data.appForm,
appForm: pastState,
title
});
},
[saveSnapshot, setAppForm, t]
[initialAppForm, saveSnapshot, setAppForm, t]
);
const onSwitchCloudVersion = useCallback(
(appVersion: AppVersionSchemaType) => {
@@ -143,7 +158,8 @@ const Header = ({
useDebounceEffect(
() => {
const savedSnapshot = past.find((snapshot) => snapshot.isSaved);
const val = compareSimpleAppSnapshot(savedSnapshot?.appForm, appForm);
const pastState = getAppConfigByDiff(initialAppForm, savedSnapshot?.diff);
const val = compareSimpleAppSnapshot(pastState, appForm);
setIsPublished(val);
},
[past, allDatasets],


@@ -49,7 +49,13 @@ const SimpleEdit = () => {
saveSnapshot={saveSnapshot}
/>
{currentTab === TabEnum.appEdit ? (
<Edit appForm={appForm} setAppForm={setAppForm} past={past} saveSnapshot={saveSnapshot} />
<Edit
appForm={appForm}
setAppForm={setAppForm}
past={past}
setPast={setPast}
saveSnapshot={saveSnapshot}
/>
) : (
<Box flex={'1 0 0'} h={0} mt={[4, 0]}>
{currentTab === TabEnum.publish && <PublishChannel />}

Some files were not shown because too many files have changed in this diff.