Compare commits


75 Commits
v2.2 ... v2.7.1

Author SHA1 Message Date
archer
c605964fa8 feat: knowledge base match mode selection 2023-04-12 00:44:01 +08:00
archer
1fe5cd751a perf: knowledge base match mode 2023-04-11 18:17:00 +08:00
archer
488e2f476e fix: highlight for duplicate-named models; perf: output when no question is matched 2023-04-11 17:28:43 +08:00
archer
915b104b8a perf: input guidance; exported-data encoding; hidden list numbers 2023-04-11 16:32:07 +08:00
archer
aaa350a13e fix: response 2023-04-10 21:27:13 +08:00
archer
6a2b34cb92 perf: keep data as-is 2023-04-10 21:08:43 +08:00
archer
7f26b31f53 feat: CSV import de-duplication; docs 2023-04-10 20:58:23 +08:00
archer
2a597964a2 perf: CSV import/export 2023-04-10 20:39:27 +08:00
archer
c1d3a46dc7 perf: CSV file selection 2023-04-10 19:47:03 +08:00
archer
0c55beb72d perf: comment 2023-04-10 14:39:46 +08:00
archer
9b1c0e1a3c perf: openapi. error catch 2023-04-10 13:16:24 +08:00
archer
a7988c164e perf: readme 2023-04-10 01:59:32 +08:00
archer
99e5fbd0f5 perf: markdown import; docker-compose 2023-04-09 22:56:08 +08:00
archer
5e4c4dd79b README 2023-04-09 12:38:36 +08:00
archer
70584783a5 perf: environment variable examples 2023-04-09 12:37:13 +08:00
archer
705ac1c27e perf: dedicated-line proxy configuration 2023-04-08 20:49:15 +08:00
archer
52d00d0562 feat: external API for the knowledge base 2023-04-08 20:27:43 +08:00
archer
9a145f223f fix: chat page failing to load on some phones 2023-04-08 13:21:03 +08:00
archer
b7cd4dec89 fix: model auth 2023-04-08 11:57:13 +08:00
archer
33154a9c19 fix: remove share 2023-04-08 11:51:51 +08:00
archer
e1c7503611 fix: api page hidden 2023-04-08 10:26:34 +08:00
archer
d04c298132 README 2023-04-08 00:47:31 +08:00
archer
eceda01c19 perf: openapi auth and lafgpt 2023-04-08 00:35:35 +08:00
archer
ea1681e1eb feat: auth openapi key 2023-04-07 23:33:59 +08:00
archer
f6c4b4c96d feat: openapi crd 2023-04-07 23:15:30 +08:00
archer
22cc9c85be feat: openapi page 2023-04-07 22:48:21 +08:00
archer
43f8d6008f fix: README.md 2023-04-07 21:51:21 +08:00
archer
29c5554f9e perf: pagination component 2023-04-07 21:34:51 +08:00
archer
9b18a46456 perf: account API structure 2023-04-07 20:58:41 +08:00
archer
d5923bc64f perf: remove testApi 2023-04-07 17:23:52 +08:00
archer
f19c2d2ca1 perf: remove raw content 2023-04-07 16:12:43 +08:00
archer
84d91f3f76 perf: API payload size 2023-04-07 15:46:30 +08:00
archer
7811f7482b fix: first page of billing not shown 2023-04-07 01:20:41 +08:00
archer
9c8ca7dd25 perf: compress context 2023-04-07 01:11:23 +08:00
archer
1409916bd0 perf: knowledge base scope 2023-04-06 23:43:34 +08:00
archer
fc7edcb54f perf: logs and vector chat 2023-04-06 22:24:23 +08:00
archer
87d35042de perf: threshold 2023-04-06 19:44:44 +08:00
archer
77dc961a07 perf: async-loaded components on the account page 2023-04-06 18:30:47 +08:00
archer
9a45fb64c2 perf: allow saving even when nothing was updated 2023-04-06 16:12:36 +08:00
archer
881c36542c perf: consecutive manual data input 2023-04-06 16:02:35 +08:00
archer
f88c6031f5 feat: lafgpt; openapi schema 2023-04-06 15:25:48 +08:00
archer
8a02b3b04a perf: extract response streaming 2023-04-06 11:42:47 +08:00
archer
d460305871 perf: copy improvements 2023-04-06 09:07:07 +08:00
archer
144bed5a77 perf: optimize token counting 2023-04-05 23:43:20 +08:00
archer
96fc917bad perf: payment copy 2023-04-05 22:32:14 +08:00
archer
794a3698ad feat: wx pay 2023-04-05 22:07:02 +08:00
archer
fbbc32361b perf: speed up QA splitting and vector generation; low-balance reminder 2023-04-05 20:37:37 +08:00
archer
dc329041f3 feat: fetch website text from a URL 2023-04-05 16:10:47 +08:00
archer
5feb2e19bf fix: Word parsing failure 2023-04-05 11:16:12 +08:00
archer
ec22cd8320 fix: price table 2023-04-05 10:59:53 +08:00
archer
8c7efcbd1a perf: QR code 2023-04-04 23:54:33 +08:00
archer
afc5947bfb feat: maxtokens 2023-04-04 23:00:01 +08:00
archer
40189a6899 feat: queue tasks exit when balance is insufficient 2023-04-04 22:36:14 +08:00
archer
b73829a25c fix: duplicate vector generation 2023-04-04 22:12:48 +08:00
archer
a7c5d3cc05 Merge branch 'dev2.4' into dev2.5 2023-04-04 22:00:16 +08:00
archer
cc36a13f17 Merge branch 'dev2.4' of https://github.com/c121914yu/FastGPT into dev2.4 2023-04-04 21:59:38 +08:00
archer
943abbe0fb perf: run 5 processes concurrently 2023-04-04 21:41:55 +08:00
archer
b13c3c4da5 fix: billing balance issue 2023-04-04 21:32:51 +08:00
archer
c12aa7fdf7 fix: text too long 2023-04-04 14:20:10 +08:00
archer
e08e8aa00b feat: make the question editable in model data 2023-04-04 13:15:34 +08:00
archer
85e11abc0a perf: file splitting 2023-04-03 21:04:38 +08:00
archer
becee69d6a perf: send-area styling 2023-04-03 17:28:35 +08:00
archer
042b0c535a perf: send button 2023-04-03 17:14:46 +08:00
archer
f97c29b41e feat: lafgpt requests; fix: send button 2023-04-03 16:35:48 +08:00
archer
4d6616cbfa fix: ts 2023-04-03 11:03:51 +08:00
archer
cf37992b5c feat: encapsulate vector generation and billing 2023-04-03 10:59:32 +08:00
archer
6c4026ccef perf: file structure 2023-04-03 10:20:17 +08:00
archer
caf31faf31 perf: QA generation prompt 2023-04-03 01:39:00 +08:00
archer
a0832af14b perf: page jitter on dataset refresh 2023-04-03 00:51:53 +08:00
archer
677e61416d perf: version copy 2023-04-03 00:48:56 +08:00
archer
56ba6fa5f7 feat: custom prompt for data splitting 2023-04-03 00:37:40 +08:00
archer
16a31de1c7 feat: dataset export 2023-04-03 00:18:21 +08:00
archer
05b2e9e99c feat: separate test environment 2023-04-02 23:38:28 +08:00
archer
ae4243b522 perf: knowledge base data structure 2023-04-01 22:31:56 +08:00
archer
5759cbeae0 perf: knowledge base data entry 2023-03-31 18:23:07 +08:00
125 changed files with 4285 additions and 2097 deletions


@@ -8,3 +8,4 @@ README.md
.yalc/
yalc.lock
testApi/


@@ -1,8 +1,9 @@
AXIOS_PROXY_HOST=127.0.0.1
AXIOS_PROXY_PORT=33210
MONGODB_URI=
MY_MAIL=
MAILE_CODE=
TOKEN_KEY=
OPENAIKEY=
REDIS_URL=
AXIOS_PROXY_PORT_FAST=7890
AXIOS_PROXY_PORT_NORMAL=7890
MONGODB_URI=mongodb://username:password@0.0.0.0:27017/?authSource=admin&readPreference=primary&appname=MongoDB%20Compass&ssl=false
MY_MAIL=11111111@qq.com
MAILE_CODE=sdasadasfasfad
TOKEN_KEY=sssssssss
OPENAIKEY=sk-afadfadfadfsd
REDIS_URL=redis://default:password@0.0.0.0:8100

.gitignore (vendored, 3 changed lines)

@@ -36,4 +36,5 @@ yarn-error.log*
next-env.d.ts
/public/trainData/
/.vscode/
platform.json
platform.json
testApi/


@@ -54,13 +54,4 @@ USER nextjs
EXPOSE 3000
ENV PORT 3000
ENV MAX_USER ''
ENV AXIOS_PROXY_HOST ''
ENV AXIOS_PROXY_PORT ''
ENV MONGODB_URI ''
ENV MY_MAIL ''
ENV MAILE_CODE ''
ENV TOKEN_KEY ''
CMD ["node", "server.js"]

README.md (181 changed lines)

@@ -2,48 +2,34 @@
Fast GPT lets you use your own openai API KEY to quickly call the openai API, including GPT3 and its fine-tuning methods, as well as the latest gpt3.5 API.
## Initialization
## Development
Copy .env.template to .env.local and fill in the core parameters
```
AXIOS_PROXY_HOST=axios proxy address; the openai API currently has to go through a proxy (use 127.0.0.1 for a local one)
AXIOS_PROXY_PORT=proxy port
MONGODB_URI=mongo database URI, e.g. mongodb://username:password@ip:27017/?authSource=admin&readPreference=primary&appname=MongoDB%20Compass&directConnection=true&ssl=false
AXIOS_PROXY_PORT_FAST=proxy port 1 (clash defaults to 7890)
AXIOS_PROXY_PORT_NORMAL=proxy port 2
MONGODB_URI=mongo database URI
MY_MAIL=email address used to send verification codes
MAILE_CODE=email secret (a QQ mailbox is configured here; if you don't know how to find this code, search for "nodemailer send mail")
TOKEN_KEY=any string, used to generate and verify tokens
OPENAIKEY=your openai key
REDIS_URL=redis URI
```
```bash
pnpm dev
```
Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.
## Deployment
### docker mode
Prepare docker, mongo, a proxy, and nginx. The image uses the host machine's proxy, so run with network=host and change the port to your proxy's port (clash usually uses 7890).
#### docker build
```bash
docker build -t imageName:tag .
docker push imageName:tag
```
#### Pull and run the image on the server
```bash
# pull and deploy on the server; replace imageName with your image name
docker pull imageName:tag
docker stop fast-gpt || true
docker rm fast-gpt || true
docker run -d --network=host --name fast-gpt \
-e AXIOS_PROXY_HOST=127.0.0.1 \
-e AXIOS_PROXY_PORT=7890 \
-e MY_MAIL="your email" \
-e MAILE_CODE="your email code" \
-e TOKEN_KEY=any_string \
-e MONGODB_URI="mongodb://user:password@127.0.0.0:27017/?authSource=admin&readPreference=primary&appname=MongoDB%20Compass&ssl=false" \
imageName:tag
# or just pull the prebuilt image (see below)
```
#### Software guide: install docker
@@ -53,20 +39,6 @@ curl -sSL https://get.daocloud.io/docker | sh
sudo systemctl start docker
```
#### Software guide: install mongo
```bash
docker pull mongo:6.0.4
docker stop mongo
docker rm mongo
docker run -d --name mongo \
-e MONGO_INITDB_ROOT_USERNAME= \
-e MONGO_INITDB_ROOT_PASSWORD= \
-v /root/service/mongo:/data/db \
mongo:6.0.4
# check that mongo is running; logs showing success mean it is reachable
docker logs mongo
```
#### Software guide: clash proxy
```bash
# download the package
@@ -84,8 +56,7 @@ export https_proxy=http://127.0.0.1:7890
export HTTP_PROXY=http://127.0.0.1:7890
export HTTPS_PROXY=http://127.0.0.1:7890
# run script: kill clash - cd to the clash directory - clear the cache - start it. A nohup.out file will be generated where you can view clash's logs
OLD_PROCESS=$(pgrep clash)
if [ ! -z "$OLD_PROCESS" ]; then
echo "Killing old process: $OLD_PROCESS"
@@ -99,13 +70,133 @@ nohup ./clash-linux-amd64-v1.10.0 -d ./ &
echo "Restart clash"
```
#### Software guide: Nginx
...not written yet; please look this one up yourself.
#### Creating the files
**yml file**
```yml
version: "3.3"
services:
fast-gpt:
image: c121914yu/fast-gpt:latest
environment:
AXIOS_PROXY_HOST: 127.0.0.1
AXIOS_PROXY_PORT: 7890
MY_MAIL: 11111111@qq.com
MAILE_CODE: sdasadasfasfad
TOKEN_KEY: sssssssss
MONGODB_URI: mongodb://username:password@0.0.0.0:27017/?authSource=admin&readPreference=primary&appname=MongoDB%20Compass&ssl=false
OPENAIKEY: sk-afadfadfadfsd
REDIS_URL: redis://default:password@0.0.0.0:8100
network_mode: host
restart: always
container_name: fast-gpt
mongodb:
image: mongo:6.0.4
container_name: mongo
restart: always
environment:
- MONGO_INITDB_ROOT_USERNAME=root
- MONGO_INITDB_ROOT_PASSWORD=ROOT_1234
- MONGO_DATA_DIR=/data/db
- MONGO_LOG_DIR=/data/logs
volumes:
- /root/fastgpt/mongo/data:/data/db
- /root/fastgpt/mongo/logs:/data/logs
ports:
- 27017:27017
nginx:
image: nginx:alpine3.17
container_name: nginx
restart: always
network_mode: host
ports:
- "80:80"
volumes:
- /root/fastgpt/nginx/nginx.conf:/etc/nginx/nginx.conf:ro
redis-stack:
image: redis/redis-stack:6.2.6-v6
container_name: redis-stack
restart: unless-stopped
ports:
- "8100:6379"
- "8101:8001"
environment:
- REDIS_ARGS=--requirepass psw1234
volumes:
- /etc/localtime:/etc/localtime:ro
- /root/fastgpt/redis/redis.conf:/redis.conf
- /root/fastgpt/redis/data:/data
```
**redis.conf**
```
## enable AOF persistence
appendonly yes
# default: persistence file name
appendfilename "appendonly.aof"
# default: fsync once per second
appendfsync everysec
```
**nginx.conf**
```
user nginx;
worker_processes auto;
error_log /var/log/nginx/error.log;
pid /run/nginx.pid;
events {
worker_connections 1024;
}
http {
include /etc/nginx/mime.types;
default_type application/octet-stream;
include /etc/nginx/conf.d/*.conf;
server {
listen 80;
server_name test.com;
gzip on;
gzip_min_length 1k;
gzip_buffers 4 8k;
gzip_http_version 1.1;
gzip_comp_level 6;
gzip_vary on;
gzip_types text/plain application/x-javascript text/css application/javascript application/json application/xml;
gzip_disable "MSIE [1-6]\.";
location / {
proxy_pass http://localhost:3000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
}
}
```
#### Run scripts
**redis: create the index**
```bash
# index
# FT.CREATE idx:model:data ON JSON PREFIX 1 model:data: SCHEMA $.modelId AS modelId TAG $.dataId AS dataId TAG $.vector AS vector VECTOR FLAT 6 DIM 1536 DISTANCE_METRIC COSINE TYPE FLOAT32
FT.CREATE idx:model:data:hash ON HASH PREFIX 1 model:data: SCHEMA modelId TAG userId TAG status TAG q TEXT text TEXT vector VECTOR FLAT 6 DIM 1536 DISTANCE_METRIC COSINE TYPE FLOAT32
```
**run.sh**
```bash
#!/bin/bash
docker-compose up -d
echo "Docker Compose image pull complete!"
# remove old local images
images=$(docker images --format "{{.ID}} {{.Repository}}" | grep fast-gpt)
# read the image IDs and names into an array
IFS=$'\n' read -rd '' -a image_array <<<"$images"
# iterate over the array and delete all the old images
for ((i=1; i<${#image_array[@]}; i++))
do
image=${image_array[$i]}
image_id=${image%% *}
docker rmi $image_id
done
```


@@ -36,6 +36,7 @@
"nodemailer": "^6.9.1",
"nprogress": "^0.2.0",
"openai": "^3.2.1",
"papaparse": "^5.4.1",
"react": "18.2.0",
"react-dom": "18.2.0",
"react-hook-form": "^7.43.1",
@@ -48,6 +49,7 @@
"sass": "^1.58.3",
"sharp": "^0.31.3",
"tunnel": "^0.0.6",
"wxpay-v3": "^3.0.2",
"zustand": "^4.3.5"
},
"devDependencies": {
@@ -57,6 +59,7 @@
"@types/lodash": "^4.14.191",
"@types/node": "18.14.0",
"@types/nodemailer": "^6.4.7",
"@types/papaparse": "^5.3.7",
"@types/react": "18.0.28",
"@types/react-dom": "18.0.11",
"@types/react-syntax-highlighter": "^15.5.6",

pnpm-lock.yaml (generated, 885 changed lines)
File diff suppressed because it is too large

@@ -1,15 +0,0 @@
{"prompt":"什么是百度plato</s>","completion":" 百度PLATO是一种聊天机器人发布于2021年10月参数量为百亿是GPT-3的1/17。</s>"}
{"prompt":"与百度plato相比chatgpt有哪些优点</s>","completion":" 与百度PLATO相比ChatGPT体现出耐心、体贴和妥当被网友们普遍称赞。ChatGPT可以用于个性化聊天机器人例如生成一个很相似的数字人提供一位温柔健谈的朋友指导语言学习等。</s>"}
{"prompt":"replika是什么</s>","completion":" Replika是一个产品使用ChatGPT的技术可以根据自己的需求生成一个虚拟女友、男友或单纯的朋友。目前还引起了不少有关性骚扰、家暴、隐私等问题的争议。</s>"}
{"prompt":"聊天机器人可以用于哪些场景?</s>","completion":" 聊天机器人可以用于很多陪伴的场景,例如亲人去世、抑郁症或自闭症患者、语言学习、小朋友、养老院或独居在家的老人等。</s>"}
{"prompt":"有人反对使用聊天机器人解决孤独困境,为什么?</s>","completion":" 有人反对使用聊天机器人解决孤独困境,认为孤独困境应该用人与人真正的交往解决,机器是冰冷的。</s>"}
{"prompt":"为什么聊天机器人仍然需要解决当下的需求?</s>","completion":" 在社会结构性的困局还没有解开时,长夜漫漫的独身打工人、养老院里卧床的老人、父母在外留守在家的儿童,当下的需求仍需要解决。</s>"}
{"prompt":"ai在未来会被戴上什么样的帽子</s>","completion":" AI在未来各个场景下都会被戴上一个很虚无的帽子“机器是冰冷的”、“没有灵魂”、“它并不是真正在思考”。</s>"}
{"prompt":"怎么回答“ai是否真正思考”的问题</s>","completion":" 引用乔姆斯基老爷子的那句名言“思考是人类特性。AI有天会思考吗那就像是在问潜水艇会不会游泳一样。”</s>"}
{"prompt":"聊天机器人与人类完成任务的方式有何不同?</s>","completion":" 聊天机器人与人类完成任务的方式不同,但并不重要。它们能跟人一样完成任务,这是最重要的。</s>"}
{"prompt":"聊天机器人的发展前景如何?</s>","completion":" 聊天机器人的发展前景广阔,可以应用于很多领域,例如客服、教育、医疗等。</s>"}
{"prompt":"什么是百度plato</s>","completion":" 百度PLATO是一种聊天机器人发布于2021年10月参数量为百亿是GPT-3的1/17。</s>"}
{"prompt":"与百度plato相比chatgpt有哪些优点</s>","completion":" 与百度PLATO相比ChatGPT体现出耐心、体贴和妥当被网友们普遍称赞。ChatGPT还可以用于很多陪伴的场景如亲人去世、抑郁症或自闭症患者、语言学习、小朋友、养老院或独居在家的老人等。</s>"}
{"prompt":"replika是什么产品</s>","completion":" Replika是一个产品用的是ChatGPT的技术。用户可以根据自己的需求生成一个虚拟女友、男友或单纯的朋友。目前还引起了不少有关性骚扰、家暴、隐私等问题的争议。</s>"}
{"prompt":"聊天机器人能够解决哪些现实问题?</s>","completion":" 聊天机器人能够解决现代人面临的孤独困境,如长夜漫漫的独身打工人、养老院里卧床的老人、父母在外留守在家的儿童等。</s>"}
{"prompt":"有人反对使用聊天机器人解决孤独困境,为什么?</s>","completion":" 有人反对使用聊天机器人解决孤独困境,认为孤独困境应该用人与人真正的交往解决,机器是冰冷的。</s>"}


@@ -0,0 +1,9 @@
## FAQ
**Content length**
A single message is capped at 4000 tokens and the full context at 8000 tokens; over-long context is truncated.
**Delete and copy**
Click a message's avatar to copy or delete that message.
**Proxy errors**
The server's proxy is unstable at times; try again in a little while.
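The truncation behavior above can be sketched roughly as follows. The 8000-token budget comes from the text; the message shape, function names, and the 4-characters-per-token estimate are illustrative assumptions, not the project's actual implementation:

```typescript
// Sketch: drop the oldest messages until the estimated token total
// fits the context budget (8000 tokens, per the FAQ above).
type Msg = { role: 'user' | 'assistant'; content: string };

// Crude length-based token estimate (an assumption, not a real tokenizer).
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

function truncateContext(messages: Msg[], maxTokens = 8000): Msg[] {
  const kept: Msg[] = [];
  let total = 0;
  // Walk from newest to oldest so the most recent turns survive.
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = estimateTokens(messages[i].content);
    if (total + cost > maxTokens) break;
    total += cost;
    kept.unshift(messages[i]);
  }
  return kept;
}
```

A real implementation would count tokens with the model's tokenizer rather than by string length.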

public/docs/csvSelect.md (new file, 6 lines)

@@ -0,0 +1,6 @@
Accepts a CSV file whose header row contains question and answer columns: question is the question, answer is the answer.
Rows are de-duplicated before import: a row is skipped if both its question and answer are identical to an existing one, so the imported content may be smaller than the file's content. Content containing line breaks currently cannot be de-duplicated.
| question | answer |
| --- | --- |
| What is laf | laf is a cloud function development platform… |
| What is sealos | Sealos is a cloud operating system distribution with kubernetes at its core; it can… |
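The de-duplication rule described above (skip a row only when both question and answer match exactly) can be sketched like this. The row type and function name are hypothetical, and CSV parsing itself (e.g. via papaparse, which this version adds as a dependency) is omitted:

```typescript
// Sketch: keep only the first occurrence of each (question, answer) pair.
type QARow = { question: string; answer: string };

function dedupeRows(rows: QARow[]): QARow[] {
  const seen = new Set<string>();
  const result: QARow[] = [];
  for (const row of rows) {
    // Join with NUL, a separator unlikely to occur in CSV text.
    const key = `${row.question}\u0000${row.answer}`;
    if (seen.has(key)) continue;
    seen.add(key);
    result.push(row);
  }
  return result;
}
```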

public/docs/intro.md (new file, 40 lines)

@@ -0,0 +1,40 @@
## Welcome to Fast GPT
[Git repository](https://github.com/c121914yu/FastGPT)
### Community / feedback
The group QR code is full; add the backup account and you will be invited periodically
WeChat ID: fastgpt123
![](/imgs/wx300.jpg)
### Quick start
1. Register an account with your email.
2. Go to the account page and add a linked account. Currently only openai accounts can be added: go to the openai website and paste your API Key here.
3. If you fill in your own openai account, it is used directly. If not, you pay to use the platform's account.
4. Go to the model page and create a model; ChatGPT is recommended.
5. Click [Chat] in the model list to start chatting via the API.
### Custom prompts
1. Open the model edit page
2. Adjust the temperature and the prompt
3. Chat with that model. The prompt and temperature are injected automatically on every chat, which makes it easy to manage personal models. It is recommended to preset the 5-10 directions you use most often.
### Knowledge base
1. Choose [Knowledge base] when creating the model
2. Open the model edit page
3. Import data, either manually or from a file. File import automatically calls chatGPT to understand the file content and build the knowledge base.
4. Chat with that model.
Note: chatting with a knowledge base model consumes tokens faster.
### Price list
If you use your own API Key, you are not billed. Detailed bills are available on the account page. Plain chatGPT conversation has a single billing item; using the knowledge base involves two: **chat** and **index** generation.
| Billing item | Price: CNY / 1K tokens (context included) |
| --- | --- |
| chatgpt - chat | 0.03 |
| knowledge base - chat | 0.03 |
| knowledge base - index | 0.004 |
| file splitting | 0.03 |
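Applying the per-1K-token prices from the table is straightforward; a minimal sketch, where the item keys are made-up labels (the prices are the CNY values from the table, not an official API):

```typescript
// CNY per 1K tokens (prompt + completion combined), from the price table.
const PRICES: Record<string, number> = {
  'chatgpt-chat': 0.03,
  'kb-chat': 0.03,
  'kb-index': 0.004,
  'file-split': 0.03
};

// Estimate the charge in CNY for one billing item over a token count.
function billFor(item: string, tokens: number): number {
  const price = PRICES[item];
  if (price === undefined) throw new Error(`unknown billing item: ${item}`);
  return (tokens / 1000) * price;
}
```

For example, indexing 2000 tokens of knowledge base data would cost about 0.008 CNY under this table.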

public/docs/shareHint.md (new file, 3 lines)

@@ -0,0 +1,3 @@
You are about to share a conversation. Make sure the shared link will not be abused, because it uses your API key.
* Share a blank chat: creates an empty chat for this model and shares it.
* Share the current chat: also shares the current chat's content; be careful that multiple people do not use the same chat at once.


@@ -0,0 +1,3 @@
## Fast GPT V2.7
* The FastGpt API lets you expose part of Fast GPT's features as an API and plug the knowledge base into your own applications, e.g. Feishu, WeCom, or a customer service assistant.
* Import and export your Q&A pairs via CSV files. You can keep the CSV file in a Feishu doc so the team can share it.

public/imgs/wx300.jpg (new binary file, 53 KiB; not shown)

Binary file not shown (before: 320 KiB).

Binary file not shown (after: 50 KiB).

File diff suppressed because one or more lines are too long


@@ -1,21 +0,0 @@
{"prompt": "什么是高潜公司\n\n###\n\n", "completion": " 高潜公司是指高潜力的公司,其所处的赛道有发展潜力和前景。###"}
{"prompt": "高潜公司的优势是什么\n\n###\n\n", "completion": " 高潜公司所处的整个行业和赛道有一定的增量空间头部vc通过严格的筛选和加持为高潜公司背书高潜公司的创始者一般都特别优秀。加入高潜公司可以获得很多机会来锻炼自己、强化自己的认知快速成长、展现自己并有可能成为公司扩张后的管理层###"}
{"prompt": "高潜公司的风险是什么\n\n###\n\n", "completion": " 高潜公司的高风险问题也很明显。一旦公司失败,很多边缘人就会被淘汰掉。但宏观来讲,只要赛道是有潜力的,即便这家公司失败了,也可以跳槽到成功的公司去。###"}
{"prompt": "高潜公司的例子有哪些\n\n###\n\n", "completion": " 很多高潜公司在做最近比较火的AIGC包括去年的元宇宙前两年比较火的新消费、企业服务和出海之类的。###"}
{"prompt": "为什么现在加入大厂的前景不广阔?\n\n###\n\n", "completion": " 今年的大形势之下互联网大厂纷纷开始裁员、取消普调、晋升比例严重缩水HC也急剧减少。###"}
{"prompt": "加入高潜公司可以获得什么样的机会?\n\n###\n\n", "completion": " 加入高潜公司可以获得很多机会来锻炼自己、强化自己的认知,快速成长、展现自己,并有可能成为公司扩张后的管理层。###"}
{"prompt": "什么是高潜公司?\n\n###\n\n", "completion": " 高潜公司的定义是“高潜力的公司”所处的行业和赛道有发展潜力和前景头部vc为其背书创始者一般都特别优秀。###"}
{"prompt": "加入高潜公司有哪些优点\n\n###\n\n", "completion": " 加入高潜公司能够获得锻炼自己、强化认知的机会,快速成长、展现自己,在公司扩张后成为团队的管理层。同时,所处的行业还处于上升期,发展前景广阔。###"}
{"prompt": "高潜公司存在哪些风险?\n\n###\n\n", "completion": " 高潜公司存在高风险问题,一旦公司失败,很多边缘人就会被淘汰掉。但宏观来讲,只要赛道是有潜力的,即便这家公司失败了,也可以跳槽到成功的公司去。###"}
{"prompt": "高潜公司和互联网有什么相似之处?\n\n###\n\n", "completion": " 目前来说,高潜公司的风格大部分都比较像互联网大厂,有风格,但又没有那么内卷,因为大家每天都在开拓边疆,突破和探索市场的边界。同时,互联网产业的蓬勃发展也给其他行业,包括新能源汽车、新消费等,带来了用户增长和运营的经验###"}
{"prompt": "互联网大厂的制度和分工如何?\n\n###\n\n", "completion": " 目前的互联网大厂的制度和分工非常明确基本是按照一些标准SOP做事。同时由于已经没有用户增量了内卷和向上管理的问题也比较严重。###"}
{"prompt": "高潜公司对于哪些毕业生是很好的选择\n\n###\n\n", "completion": " 对于追求长期创造价值而非想找份安稳的工作的毕业生来说,高潜公司是很好的选择。###"}
{"prompt": "互联网大厂的晋升方式有哪些?\n\n###\n\n", "completion": " 互联网大厂的晋升方式有两种:老板特别喜欢你或者你的战功支撑你的竞争###"}
{"prompt": "为什么目前更多优秀的人都会加入创业公司?\n\n###\n\n", "completion": " 目前社会的大趋势是更多优秀的人都会加入创业公司,因为真正在创造价值的其实永远是创业公司###"}
{"prompt": "为什么选择高潜公司主要是因为我们迎来了什么三个繁荣?\n\n###\n\n", "completion": " 选择高潜公司主要是因为我们迎来了创新生态的三个繁荣:人才繁荣、资本繁荣和环境繁荣###"}
{"prompt": "资本繁荣是如何推动创业生态的崛起的?\n\n###\n\n", "completion": " 资本繁荣无论中国还是美国创业生态的崛起都是伴随着移动互联网的发展。中国这一代VC的崛起主要是通过投资移动互联网项目所积累的战绩扩大了资金池通过这些成本低、增长快、回报率高的项目才有了底气去推动更多行业发展尝试着投资toB和硬科技这类成本高、增长慢、回报率低、风险大的项目。###"}
{"prompt": "环境繁荣是指什么?\n\n###\n\n", "completion": " 环境繁荣是指互联网带动起经济发展后,各地政府也开始了对于创业进行培育,免费场地、巨额无偿补贴、各地送钱竞赛,大幅度降低了创业成本。###"}
{"prompt": "高潜公司对于追求长期创造价值的毕业生来说为什么是好的选择?\n\n###\n\n", "completion": " 因为在高潜公司,毕业生可以通过参与创业公司的发展,获得更多的机会和成长空间,而不是被固定在某个职位上。此外,高潜公司通常会注重团队建设和文化建设,能够提供更好的工作环境和发展机会。###"}
{"prompt": "如果想要寻找高潜公司,有哪些渠道可以利用?\n\n###\n\n", "completion": " 可以利用领英职场app和知潜微信小程序。领英职场app中入驻了很多高潜公司及高管而知潜小程序现在也已经有100+高潜力创业公司入驻都是经过筛选的、top VC投资过且赛道有前景的公司。此外红杉和真格等VC机构自己的公众号也会经常披露投资的企业。###"}
{"prompt": "为什么现在更多优秀的人才会加入创业公司?\n\n###\n\n", "completion": " 因为在当前社会的大趋势下,创业公司成为了创新生态的重要组成部分。创业公司在创造价值和推动行业发展方面具有更大的空间和潜力,而且随着移动互联网的发展,创业公司的崛起也得到了资本和政策的支持。因此,更多优秀的人才选择加入创业公司,以实现自己的创业梦想和获得更好的发展机会。###"}
{"prompt": "高潜公司的优势在哪些方面?\n\n###\n\n", "completion": " 高潜公司在团队建设、文化建设和发展机会等方面具有优势。高潜公司通常会注重团队建设和文化建设,能够提供更好的工作环境和发展机会。同时,高潜公司的发展潜力和创新能力也很强,可以为毕业生提供更多的机会和成长空间。###"}


@@ -1,21 +0,0 @@
{"prompt":"什么是死锁?如何避免死锁?\n\n###\n\n","completion":" 死锁是指两个或多个进程在执行过程中,由于互相等待对方释放资源而造成的一种僵局。避免死锁的常用方法包括:破坏互斥条件、破坏请求和保持条件、破坏不剥夺条件、破坏循环等待条件。###"}
{"prompt":"什么是虚拟内存?\n\n###\n\n","completion":" 虚拟内存是指一种计算机内存管理技术,它使得操作系统可以将进程所使用的虚拟地址空间映射到物理内存中,从而可以让多个进程共享物理内存。当物理内存不足时,操作系统会将一部分不常用的内存页面暂时存储到硬盘上,这样就可以释放出更多的物理内存供其他进程使用。###"}
{"prompt":"什么是分页和分段?有什么区别?\n\n###\n\n","completion":" 分页和分段是虚拟内存管理中常用的两种技术。分页是将虚拟地址空间划分为固定大小的页面,而分段则是将虚拟地址空间划分为不同大小的段。区别在于,分页的页面大小是固定的,而分段的段大小是可变的;另外,分页是为了实现虚拟内存,而分段则是为了实现逻辑地址空间的管理。###"}
{"prompt":"什么是lru算法\n\n###\n\n","completion":" lru算法是一种页面置换算法它的全称是least recently used即最近最少使用算法。lru算法的基本思想是在物理内存中维护一个页面访问历史记录当需要替换一页时选择最近最少被访问的页面进行替换。###"}
{"prompt":"什么是文件系统?\n\n###\n\n","completion":" 文件系统是一种用于管理计算机存储设备上文件和目录的组织方式。文件系统通常包括存储管理、目录管理、文件访问控制、文件系统维护等功能。常见的文件系统包括fat、ntfs、ext等。###"}
{"prompt":"什么是并发和并行?\n\n###\n\n","completion":" 并发是指多个任务交替执行的情况,而并行是指多个任务同时执行的情况。并发的实现方式是通过时间分片技术,而并行的实现方式是通过多核处理器等硬件设备。###"}
{"prompt":"什么是系统调用?\n\n###\n\n","completion":" 系统调用是操作系统中的一种机制,它允许用户空间程序请求操作系统内核提供一些服务,如读写文件、创建进程等。系统调用通常是通过软中断实现的###"}
{"prompt":"什么是进程间通信?\n\n###\n\n","completion":" 进程间通信是指不同进程之间进行数据交换和通信的机制,常见的方式有管道、共享内存、消息队列、信号量等。###"}
{"prompt":"什么是文件描述符?\n\n###\n\n","completion":" 文件描述符是一个非负整数它是操作系统内核为了管理打开的文件而维护的一种抽象概念。应用程序使用文件描述符来访问文件或其他i\/o设备###"}
{"prompt":"什么是中断? \n\n###\n\n","completion":" 中断是指计算机硬件或软件在执行程序时,暂停当前程序的执行,转而去执行另一个程序或处理器所需的其他任务的过程。中断可以是外部中断、内部中断或软中断。###"}
{"prompt":"什么是页表?\n\n###\n\n","completion":" 页表是一种数据结构,它用于将虚拟地址转换为物理地址。在虚拟内存中,每个进程都有自己的页表,它描述了虚拟地址空间中每个页面对应的物理地址。 ###"}
{"prompt":"什么是僵尸进程?\n\n###\n\n","completion":" 僵尸进程是指一个已经完成执行但其父进程还没有对其进行善后处理的进程,它会一直占用系统资源,直到其父进程将其处理为止。###"}
{"prompt":"什么是守护进程? \n\n###\n\n","completion":" 守护进程是一种在后台运行的特殊进程,它通常用于提供某种服务或执行某种系统任务。守护进程通常不与任何用户界面交互,也不会终止,直到操作系统关机或重启。###"}
{"prompt":"计算机操作系统是什么?\n\n###\n\n","completion":" 计算机操作系统是一种控制计算机硬件和协调应用程序运行的系统软件。###"}
{"prompt":"什么是进程?\n\n###\n\n","completion":" 进程是一个正在执行中的程序实例,它包括了程序计数器、堆栈、数据区等信息。###"}
{"prompt":"什么是线程?\n\n###\n\n","completion":" 线程是操作系统能够进行运算调度的最小单位,它是进程的一个执行流程,共享进程的地址空间。###"}
{"prompt":"什么是虚拟内存?\n\n###\n\n","completion":" 虚拟内存是一种计算机系统内存管理技术,它使得应用程序认为它拥有连续的可用的内存,而实际上它通常是被分成多个物理内存碎片。###"}
{"prompt":"什么是死锁?\n\n###\n\n","completion":" 死锁是指两个或多个进程在执行过程中,因争夺资源而造成的一种互相等待的现象,导致所有进程都无法继续执行。###"}
{"prompt":"什么是缓存?\n\n###\n\n","completion":" 缓存是指在计算机中暂存数据的高速存储器,它可以提高计算机对数据的访问速度,避免频繁地访问较慢的主存储器。###"}
{"prompt":"什么是文件系统?\n\n###\n\n","completion":" 文件系统是计算机中用来管理和组织文件的一种机制,它通过一系列的数据结构来描述文件和目录的组织方式,以及文件如何存储和访问。###"}
{"prompt":"什么是调度算法?\n\n###\n\n","completion":" 调度算法是指操作系统中用来决定进程或线程在cpu上执行顺序的一种算法它的目标是最大化系统吞吐量、最小化响应时间或最大化资源利用率等。###"}


@@ -5,8 +5,7 @@ import type { InitChatResponse } from './response/chat';
/**
* Get the ID of a chat window
*/
export const getChatSiteId = (modelId: string, isShare = false) =>
GET<string>(`/chat/generate?modelId=${modelId}&isShare=${isShare ? 'true' : 'false'}`);
export const getChatSiteId = (modelId: string) => GET<string>(`/chat/generate?modelId=${modelId}`);
/**
* Get the initial chat content

src/api/common.ts (new file, 1 line)

@@ -0,0 +1 @@
import { GET, POST, DELETE } from './request';


@@ -5,15 +5,30 @@ import { TrainingItemType } from '../types/training';
import { RequestPaging } from '../types/index';
import { Obj2Query } from '@/utils/tools';
/**
* Get the model list
*/
export const getMyModels = () => GET<ModelSchema[]>('/model/list');
/**
* Create a model
*/
export const postCreateModel = (data: { name: string; serviceModelName: string }) =>
POST<ModelSchema>('/model/create', data);
/**
* Delete a model by ID
*/
export const delModelById = (id: string) => DELETE(`/model/del?modelId=${id}`);
/**
* Get a model by ID
*/
export const getModelById = (id: string) => GET<ModelSchema>(`/model/detail?modelId=${id}`);
/**
* Update a model by ID
*/
export const putModelById = (id: string, data: ModelUpdateParams) =>
PUT(`/model/update?modelId=${id}`, data);
@@ -35,21 +50,56 @@ export const getModelTrainings = (id: string) =>
type GetModelDataListProps = RequestPaging & {
modelId: string;
};
/**
* Get the model's knowledge base data
*/
export const getModelDataList = (props: GetModelDataListProps) =>
GET(`/model/data/getModelData?${Obj2Query(props)}`);
export const getModelSplitDataList = (modelId: string) =>
GET<ModelSplitDataSchema[]>(`/model/data/getSplitData?modelId=${modelId}`);
/**
* Get export data (no pagination)
*/
export const getExportDataList = (modelId: string) =>
GET<[string, string][]>(`/model/data/exportModelData?modelId=${modelId}`);
/**
* Get the count of data currently being split for the model
*/
export const getModelSplitDataListLen = (modelId: string) =>
GET<number>(`/model/data/getSplitData?modelId=${modelId}`);
/**
* Get web page content
*/
export const getWebContent = (url: string) => POST<string>(`/model/data/fetchingUrlData`, { url });
/**
* Manually input data
*/
export const postModelDataInput = (data: {
modelId: string;
data: { text: ModelDataSchema['text']; q: ModelDataSchema['q'] }[];
}) => POST(`/model/data/pushModelDataInput`, data);
}) => POST<number>(`/model/data/pushModelDataInput`, data);
export const postModelDataFileText = (modelId: string, text: string) =>
POST(`/model/data/splitData`, { modelId, text });
/**
* Split data
*/
export const postModelDataSplitData = (data: { modelId: string; text: string; prompt: string }) =>
POST(`/model/data/splitData`, data);
export const putModelDataById = (data: { dataId: string; text: string }) =>
/**
* Import CSV data
*/
export const postModelDataCsvData = (modelId: string, data: string[][]) =>
POST<number>(`/model/data/pushModelDataCsv`, { modelId, data: data });
/**
* Update model data
*/
export const putModelDataById = (data: { dataId: string; text: string; q?: string }) =>
PUT('/model/data/putModelData', data);
/**
* Delete one model data entry
*/
export const delOneModelData = (dataId: string) =>
DELETE(`/model/data/delModelDataById?dataId=${dataId}`);

src/api/openapi.ts (new file, 16 lines)

@@ -0,0 +1,16 @@
import { GET, POST, DELETE } from './request';
import { UserOpenApiKey } from '@/types/openapi';
/**
* create an API key
*/
export const createAOpenApiKey = () => POST<string>('/openapi/postKey');
/**
* get api keys
*/
export const getOpenApiKeys = () => GET<UserOpenApiKey[]>('/openapi/getKeys');
/**
* delete an API key by id
*/
export const delOpenApiById = (id: string) => DELETE(`/openapi/delKey?id=${id}`);


@@ -0,0 +1 @@
<?xml version="1.0" standalone="no"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg t="1680878351566" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="1173" xmlns:xlink="http://www.w3.org/1999/xlink" width="48" height="48"><path d="M896 771.413333h-768c-51.2 0-93.866667-42.666667-93.866667-93.866666V209.92c0-51.2 42.666667-93.866667 93.866667-93.866667h768c51.2 0 93.866667 42.666667 93.866667 93.866667v465.92c0 52.906667-42.666667 95.573333-93.866667 95.573333zM128 167.253333C104.106667 167.253333 85.333333 186.026667 85.333333 209.92v465.92c0 23.893333 18.773333 42.666667 42.666667 42.666667h768c23.893333 0 42.666667-18.773333 42.666667-42.666667V209.92c0-23.893333-18.773333-42.666667-42.666667-42.666667h-768z" p-id="1174"></path><path d="M512 907.946667c-13.653333 0-25.6-11.946667-25.6-25.6v-136.533334c0-13.653333 11.946667-25.6 25.6-25.6s25.6 11.946667 25.6 25.6v136.533334c0 13.653333-11.946667 25.6-25.6 25.6z" p-id="1175"></path><path d="M680.96 907.946667H343.04c-13.653333 0-25.6-11.946667-25.6-25.6s11.946667-25.6 25.6-25.6h337.92c13.653333 0 25.6 11.946667 25.6 25.6s-11.946667 25.6-25.6 25.6zM776.533333 648.533333h-529.066666c-13.653333 0-25.6-11.946667-25.6-25.6s11.946667-25.6 25.6-25.6h530.773333c13.653333 0 25.6 11.946667 25.6 25.6s-11.946667 25.6-27.306667 25.6z" p-id="1176"></path></svg>


File content not shown (before: 2.9 KiB, after: 2.9 KiB).


@@ -0,0 +1 @@
<?xml version="1.0" standalone="no"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg t="1680878410563" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="2745" xmlns:xlink="http://www.w3.org/1999/xlink" width="48" height="48"><path d="M256 512l81.6 108.8a32 32 0 0 1-51.2 38.4l-96-128a31.968 31.968 0 0 1 0-38.4l96-128a32 32 0 0 1 51.2 38.4L256 512zM670.4 620.8a32 32 0 0 0 51.2 38.4l96-128a31.968 31.968 0 0 0 0-38.4l-96-128a32 32 0 0 0-51.2 38.4L752 512l-81.6 108.8zM503.232 646.944a32 32 0 1 1-62.464-13.888l64-288a32 32 0 1 1 62.464 13.888l-64 288z" p-id="2746"></path><path d="M160 144a32 32 0 0 0-32 32V864a32 32 0 0 0 32 32h688a32 32 0 0 0 32-32V176a32 32 0 0 0-32-32H160z m0-64h688a96 96 0 0 1 96 96V864a96 96 0 0 1-96 96H160a96 96 0 0 1-96-96V176a96 96 0 0 1 96-96z" p-id="2747"></path></svg>



@@ -0,0 +1 @@
<?xml version="1.0" standalone="no"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg t="1680878383832" class="icon" viewBox="0 0 1024 1024" version="1.1" xmlns="http://www.w3.org/2000/svg" p-id="1637" xmlns:xlink="http://www.w3.org/1999/xlink" width="48" height="48"><path d="M511.333 63.333c-247.424 0-448 200.576-448 448s200.576 448 448 448 448-200.576 448-448-200.576-448-448-448z m0 832c-51.868 0-102.15-10.144-149.451-30.15-36.011-15.231-69.123-35.67-98.812-60.897 12.177-31.985 42.226-63.875 84.223-88.903C396.189 686.243 456.222 669.53 512 669.53c55.631 0 115.416 16.658 164.026 45.703 41.762 24.953 71.689 56.812 83.863 88.804-29.764 25.342-62.976 45.865-99.106 61.146-47.299 20.006-97.582 30.15-149.45 30.15z m296.268-139.658c-20.493-35.937-54.353-68.855-98.747-95.381C649.75 624.979 579.839 605.53 512 605.53c-67.964 0-138.094 19.488-197.471 54.875-44.644 26.606-78.656 59.594-99.195 95.586-23.835-28.755-43.234-60.652-57.85-95.208-20.006-47.3-30.15-97.583-30.15-149.451s10.144-102.15 30.15-149.451c19.337-45.719 47.034-86.792 82.321-122.078 35.286-35.287 76.359-62.983 122.078-82.321 47.3-20.006 97.583-30.15 149.451-30.15 51.868 0 102.15 10.144 149.451 30.15 45.719 19.337 86.792 47.034 122.078 82.321 35.287 35.286 62.983 76.359 82.321 122.078 20.006 47.3 30.15 97.583 30.15 149.451s-10.144 102.15-30.15 149.451c-14.563 34.429-33.869 66.22-57.583 94.892z" p-id="1638"></path><path d="M512 220.223c-88.224 0-160 71.776-160 160s71.776 160 160 160c88.225 0 160-71.775 160-160s-71.775-160-160-160z m0 256c-52.935 0-96-43.065-96-96s43.065-96 96-96 96 43.065 96 96-43.065 96-96 96z" p-id="1639"></path></svg>



@@ -1,7 +1,6 @@
import React from 'react';
import type { IconProps } from '@chakra-ui/react';
import { Icon } from '@chakra-ui/react';
import dynamic from 'next/dynamic';
const map = {
model: require('./icons/model.svg').default,
@@ -10,7 +9,11 @@ const map = {
menu: require('./icons/menu.svg').default,
pay: require('./icons/pay.svg').default,
copy: require('./icons/copy.svg').default,
chatSend: require('./icons/chatSend.svg').default
chatSend: require('./icons/chatSend.svg').default,
board: require('./icons/board.svg').default,
develop: require('./icons/develop.svg').default,
user: require('./icons/user.svg').default,
chatting: require('./icons/chatting.svg').default
};
export type IconName = keyof typeof map;


@@ -16,27 +16,27 @@ const unShowLayoutRoute: { [key: string]: boolean } = {
const navbarList = [
{
label: '介绍',
icon: 'icon-gongzuotai-01',
icon: 'board',
link: '/',
activeLink: ['/']
},
{
label: '模型',
icon: 'icon-moxing',
icon: 'model',
link: '/model/list',
activeLink: ['/model/list', '/model/detail']
},
// {
// label: '数据',
// icon: 'icon-datafull',
// link: '/data/list',
// activeLink: ['/data/list', '/data/detail']
// },
{
label: '账号',
icon: 'icon-yonghu-yuan',
icon: 'user',
link: '/number/setting',
activeLink: ['/number/setting']
},
{
label: '开发',
icon: 'develop',
link: '/openapi',
activeLink: ['/openapi']
}
];


@@ -2,8 +2,7 @@ import React from 'react';
import { Box, Flex } from '@chakra-ui/react';
import Image from 'next/image';
import { useRouter } from 'next/router';
import Icon from '../Iconfont';
import MyIcon from '../Icon';
export enum NavbarTypeEnum {
normal = 'normal',
small = 'small'
@@ -66,20 +65,16 @@ const Navbar = ({
backgroundColor: 'transparent'
})}
>
<Icon
name={item.icon}
width={24}
height={24}
color={item.activeLink.includes(router.pathname) ? '#2B6CB0' : '#4A5568'}
<MyIcon
name={item.icon as any}
width={'24px'}
height={'24px'}
fill={item.activeLink.includes(router.pathname) ? '#2B6CB0' : '#4A5568'}
/>
<Box mt={1}>{item.label}</Box>
</Flex>
))}
</Box>
{/* notification icon */}
{/* <Flex className={styles.informIcon} mb={5} justifyContent={'center'}>
<Icon name={'icon-tongzhi'} width={28} height={28} color={'#718096'}></Icon>
</Flex> */}
</Flex>
);
};


@@ -1,6 +1,6 @@
import React from 'react';
import { useRouter } from 'next/router';
import Icon from '../Iconfont';
import MyIcon from '../Icon';
import {
Flex,
Drawer,
@@ -39,9 +39,8 @@ const NavbarPhone = ({
px={7}
>
<Box onClick={onOpen}>
<Icon name="icon-caidan" width={20} height={20}></Icon>
<MyIcon name="menu" width={'20px'} height={'20px'} color={'blackAlpha.600'}></MyIcon>
</Box>
{/* <Icon name="icon-tongzhi" width={20} height={20}></Icon> */}
</Flex>
<Drawer isOpen={isOpen} placement="left" size={'xs'} onClose={onClose}>
<DrawerOverlay />
@@ -74,11 +73,11 @@ const NavbarPhone = ({
backgroundColor: 'transparent'
})}
>
<Icon
name={item.icon}
width={24}
height={24}
color={item.activeLink.includes(router.pathname) ? '#2B6CB0' : '#4A5568'}
<MyIcon
name={item.icon as any}
width={'24px'}
height={'24px'}
fill={item.activeLink.includes(router.pathname) ? '#2B6CB0' : '#4A5568'}
/>
<Box ml={5}>{item.label}</Box>
</Flex>

View File

@@ -160,7 +160,7 @@
}
.markdown ul,
.markdown ol {
padding-left: 1em;
padding-left: 2em;
}
.markdown ul.no-list,
.markdown ol.no-list {

View File

@@ -23,15 +23,15 @@ const WxConcat = ({ onClose }: { onClose: () => void }) => {
<ModalBody textAlign={'center'}>
<Image
style={{ margin: 'auto' }}
src={'/imgs/wxcode.jpg'}
src={'/imgs/wx300.jpg'}
width={200}
height={200}
alt=""
/>
<Box mt={2}>
:{' '}
:
<Box as={'span'} userSelect={'all'}>
YNyiqi
fastgpt123
</Box>
</Box>
</ModalBody>

View File

@@ -4,61 +4,3 @@ export enum EmailTypeEnum {
}
export const PRICE_SCALE = 100000;
export const introPage = `
## 欢迎使用 Fast GPT
[Git 仓库](https://github.com/c121914yu/FastGPT)
### 快速开始
1. 使用邮箱注册账号。
2. 进入账号页面,添加关联账号,目前只有 openai 的账号可以添加,直接去 openai 官网,把 API Key 粘贴过来。
3. 如果填写了自己的 openai 账号,使用时会直接用你的账号。如果没有填写,需要付费使用平台的账号。
4. 进入模型页,创建一个模型,建议直接用 ChatGPT。
5. 在模型列表点击【对话】,即可使用 API 进行聊天。
### 定制 prompt
1. 进入模型编辑页
2. 调整温度和提示词
3. 使用该模型对话。每次对话时,提示词和温度都会自动注入,方便管理个人的模型。建议把自己日常经常需要使用的 5~10 个方向预设好。
### 知识库
1. 创建模型时选择【知识库】
2. 进入模型编辑页
3. 导入数据,可以选择手动导入,或者选择文件导入。文件导入会自动调用 chatGPT 理解文件内容,并生成知识库。
4. 使用该模型对话。
注意使用知识库模型对话时tokens 消耗会加快。
### 其他问题
还有其他问题,可以加我 wx: YNyiqi拉个交流群大家一起聊聊。
`;
export const chatProblem = `
**内容长度**
单次最长 4000 tokens, 上下文最长 8000 tokens, 上下文超长时会被截断。
**模型问题**
一般情况下,请直接选择 chatGPT 模型,价格低效果好。
**代理出错**
服务器代理不稳定,可以过一会儿再尝试。
**API key 问题**
请把 openai 的 API key 粘贴到账号里再创建对话。如果是使用分享的对话,不需要填写 API key。
`;
export const versionIntro = `
## Fast GPT V2.2
* 定制知识库:创建模型时可以选择【知识库】模型, 可以手动导入知识点或者直接导入一个文件自动学习。
* 删除和复制功能:点击对话头像,可以选择复制或删除该条内容。
`;
export const shareHint = `
你正准备分享对话,请确保分享链接不会滥用,因为它是使用的是你的 API key。
* 分享空白对话:为该模型创建一个空白的聊天分享出去。
* 分享当前对话:会把当前聊天的内容也分享出去,但是要注意不要多个人同时用一个聊天内容。
`;

View File

@@ -1,15 +1,16 @@
import type { ServiceName, ModelDataType, ModelSchema } from '@/types/mongoSchema';
import type { RedisModelDataItemType } from '@/types/redis';
export enum ChatModelNameEnum {
GPT35 = 'gpt-3.5-turbo',
VECTOR_GPT = 'VECTOR_GPT',
GPT3 = 'text-davinci-003'
VECTOR = 'text-embedding-ada-002'
}
export const ChatModelNameMap = {
[ChatModelNameEnum.GPT35]: 'gpt-3.5-turbo',
[ChatModelNameEnum.VECTOR_GPT]: 'gpt-3.5-turbo',
[ChatModelNameEnum.GPT3]: 'text-davinci-003'
[ChatModelNameEnum.VECTOR]: 'text-embedding-ada-002'
};
export type ModelConstantsData = {
@@ -20,7 +21,6 @@ export type ModelConstantsData = {
maxToken: number;
contextMaxToken: number;
maxTemperature: number;
trainedMaxToken: number; // max tokens after fine-tuning
price: number; // price per 1 token (unit: 0.00001 CNY)
};
@@ -32,8 +32,7 @@ export const modelList: ModelConstantsData[] = [
trainName: '',
maxToken: 4000,
contextMaxToken: 7500,
trainedMaxToken: 2000,
maxTemperature: 2,
maxTemperature: 1.5,
price: 3
},
{
@@ -42,22 +41,10 @@ export const modelList: ModelConstantsData[] = [
model: ChatModelNameEnum.VECTOR_GPT,
trainName: 'vector',
maxToken: 4000,
contextMaxToken: 7500,
trainedMaxToken: 2000,
contextMaxToken: 7000,
maxTemperature: 1,
price: 3
}
// {
// serviceCompany: 'openai',
// name: 'GPT3',
// model: ChatModelNameEnum.GPT3,
// trainName: 'davinci',
// maxToken: 4000,
// contextMaxToken: 7500,
// trainedMaxToken: 2000,
// maxTemperature: 2,
// price: 30
// }
];
export enum TrainingStatusEnum {
@@ -93,9 +80,37 @@ export const formatModelStatus = {
}
};
export const ModelDataStatusMap = {
0: '训练完成',
1: '训练中'
export const ModelDataStatusMap: Record<RedisModelDataItemType['status'], string> = {
ready: '训练完成',
waiting: '训练中'
};
/* knowledge-base search configuration */
// search modes
export enum ModelVectorSearchModeEnum {
hightSimilarity = 'hightSimilarity', // high similarity + refuse to answer when nothing matches
lowSimilarity = 'lowSimilarity', // low similarity
noContext = 'noContex' // high similarity + reply without extra context when nothing matches
}
}
export const ModelVectorSearchModeMap: Record<
`${ModelVectorSearchModeEnum}`,
{
text: string;
similarity: number;
}
> = {
[ModelVectorSearchModeEnum.hightSimilarity]: {
text: '高相似度, 无匹配时拒绝回复',
similarity: 0.2
},
[ModelVectorSearchModeEnum.noContext]: {
text: '高相似度,无匹配时直接回复',
similarity: 0.2
},
[ModelVectorSearchModeEnum.lowSimilarity]: {
text: '低相似度匹配',
similarity: 0.8
}
};
export const defaultModel: ModelSchema = {
@@ -109,6 +124,9 @@ export const defaultModel: ModelSchema = {
systemPrompt: '',
intro: '',
temperature: 5,
search: {
mode: ModelVectorSearchModeEnum.hightSimilarity
},
service: {
company: 'openai',
trainId: '',
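The new `ModelVectorSearchModeMap` in this file pairs each knowledge-base search mode with the `VECTOR_RANGE` radius the chat route uses, falling back to `0.22` for models saved before `search.mode` existed. A minimal sketch of that lookup (`getSimilarity` is a hypothetical extraction; the chat handler performs it inline):

```typescript
// Search modes and thresholds as defined in this diff; getSimilarity is a
// hypothetical name for the lookup the chat handler performs inline.
enum ModelVectorSearchModeEnum {
  hightSimilarity = 'hightSimilarity', // high similarity + refuse when nothing matches
  lowSimilarity = 'lowSimilarity', // low similarity
  noContext = 'noContex' // high similarity + reply without extra context
}

const ModelVectorSearchModeMap: Record<string, { similarity: number }> = {
  [ModelVectorSearchModeEnum.hightSimilarity]: { similarity: 0.2 },
  [ModelVectorSearchModeEnum.noContext]: { similarity: 0.2 },
  [ModelVectorSearchModeEnum.lowSimilarity]: { similarity: 0.8 }
};

// VECTOR_RANGE radius for a model's search mode, with the 0.22 legacy fallback.
function getSimilarity(mode?: string): number {
  return (mode && ModelVectorSearchModeMap[mode]?.similarity) || 0.22;
}

console.log(getSimilarity('lowSimilarity')); // 0.8
console.log(getSimilarity(undefined)); // 0.22
```

A smaller radius means a stricter match, which is why both "high similarity" modes share `0.2` while the permissive mode uses `0.8`.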

View File

@@ -1 +1,6 @@
export const VecModelDataIndex = 'model:data';
export const VecModelDataPrefix = 'model:data';
export const VecModelDataIdx = `idx:${VecModelDataPrefix}:hash`;
export enum ModelDataStatusEnum {
ready = 'ready',
waiting = 'waiting'
}
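The renamed constants above store every data item as a redis hash under a shared key prefix, with the RediSearch index name derived from it. A small sketch (`makeDataKey` is an illustrative helper; the import endpoints build keys the same way with a nanoid):

```typescript
// Constants as defined in this diff.
const VecModelDataPrefix = 'model:data';
const VecModelDataIdx = `idx:${VecModelDataPrefix}:hash`;

enum ModelDataStatusEnum {
  ready = 'ready',
  waiting = 'waiting'
}

// Each data item lives at "model:data:<id>"; makeDataKey is a hypothetical
// helper shown for illustration only.
function makeDataKey(id: string): string {
  return `${VecModelDataPrefix}:${id}`;
}

console.log(VecModelDataIdx); // idx:model:data:hash
console.log(makeDataKey('abc123xyz456')); // model:data:abc123xyz456
```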

View File

@@ -3,6 +3,7 @@ export enum BillTypeEnum {
splitData = 'splitData',
QA = 'QA',
abstract = 'abstract',
vector = 'vector',
return = 'return'
}
export enum PageTypeEnum {
@@ -16,5 +17,6 @@ export const BillTypeMap: Record<`${BillTypeEnum}`, string> = {
[BillTypeEnum.splitData]: 'QA拆分',
[BillTypeEnum.QA]: 'QA拆分',
[BillTypeEnum.abstract]: '摘要总结',
[BillTypeEnum.vector]: '索引生成',
[BillTypeEnum.return]: '退款'
};

src/hooks/useMarkdown.ts (new file, 15 lines)
View File

@@ -0,0 +1,15 @@
import { useQuery } from '@tanstack/react-query';
export const getMd = async (url: string) => {
const response = await fetch(`/docs/${url}`);
const textContent = await response.text();
return textContent;
};
export const useMarkdown = ({ url }: { url: string }) => {
const { data = '' } = useQuery([url], () => getMd(url));
return {
data
};
};

View File

@@ -1,8 +1,8 @@
import { useState, useCallback, useMemo } from 'react';
import { useState, useCallback, useMemo, useEffect } from 'react';
import type { PagingData } from '../types/index';
import { IconButton, Flex, Box } from '@chakra-ui/react';
import { IconButton, Flex, Box, Input } from '@chakra-ui/react';
import { ArrowBackIcon, ArrowForwardIcon } from '@chakra-ui/icons';
import { useQuery, useMutation } from '@tanstack/react-query';
import { useMutation } from '@tanstack/react-query';
import { useToast } from './useToast';
export const usePagination = <T = any,>({
@@ -17,13 +17,10 @@ export const usePagination = <T = any,>({
const { toast } = useToast();
const [pageNum, setPageNum] = useState(1);
const [total, setTotal] = useState(0);
const [data, setData] = useState<T[]>([]);
const maxPage = useMemo(() => Math.ceil(total / pageSize), [pageSize, total]);
const {
mutate,
data = [],
isLoading
} = useMutation({
const { mutate, isLoading } = useMutation({
mutationFn: async (num: number = pageNum) => {
try {
const res: PagingData<T> = await api({
@@ -33,7 +30,7 @@ export const usePagination = <T = any,>({
});
setPageNum(num);
setTotal(res.total);
return res.data;
setData(res.data);
} catch (error: any) {
toast({
title: error?.message || '获取数据异常',
@@ -44,10 +41,9 @@ export const usePagination = <T = any,>({
}
});
useQuery(['init'], () => {
useEffect(() => {
mutate(1);
return null;
});
}, []);
const Pagination = useCallback(() => {
return (
@@ -57,16 +53,40 @@ export const usePagination = <T = any,>({
icon={<ArrowBackIcon />}
aria-label={'left'}
size={'sm'}
w={'28px'}
h={'28px'}
onClick={() => mutate(pageNum - 1)}
/>
<Box mx={2}>
{pageNum}/{maxPage}
</Box>
<Flex mx={2} alignItems={'center'}>
<Input
defaultValue={pageNum}
w={'50px'}
size={'xs'}
type={'number'}
min={1}
max={maxPage}
onBlur={(e) => {
const val = +e.target.value;
if (val === pageNum) return;
if (val >= maxPage) {
mutate(maxPage);
} else if (val < 1) {
mutate(1);
} else {
mutate(+e.target.value);
}
}}
/>
<Box mx={2}>/</Box>
{maxPage}
</Flex>
<IconButton
isDisabled={pageNum === maxPage}
icon={<ArrowForwardIcon />}
aria-label={'right'}
size={'sm'}
w={'28px'}
h={'28px'}
onClick={() => mutate(pageNum + 1)}
/>
</Flex>
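The page-number `Input` added above clamps the blurred value into `[1, maxPage]` before calling `mutate`. The same logic as a standalone helper (the function name is hypothetical; `null` stands for "value unchanged, skip the request"):

```typescript
// Hypothetical extraction of the onBlur handler in the new Pagination input:
// returns the page to fetch, or null when the value did not change.
function clampPage(raw: string, pageNum: number, maxPage: number): number | null {
  const val = +raw;
  if (val === pageNum) return null; // unchanged: skip the request
  if (val >= maxPage) return maxPage;
  if (val < 1) return 1;
  return val; // like the component, non-numeric input is not guarded here
}

console.log(clampPage('0', 3, 10)); // 1
console.log(clampPage('99', 3, 10)); // 10
console.log(clampPage('3', 3, 10)); // null
console.log(clampPage('7', 3, 10)); // 7
```

The component leans on `type={'number'}` plus `min`/`max` to keep the raw value numeric, so the clamp only has to handle out-of-range pages.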

View File

@@ -1,6 +1,5 @@
import { useState, useCallback } from 'react';
import { useState, useCallback, useEffect } from 'react';
import type { PagingData } from '../types/index';
import { useQuery } from '@tanstack/react-query';
import { useToast } from './useToast';
export const usePaging = <T = any>({
@@ -64,7 +63,9 @@ export const usePaging = <T = any>({
getData(pageNum + 1);
}, [getData, isLoadAll, pageNum, requesting]);
useQuery(['init'], () => getData(1, true));
useEffect(() => {
getData(1, true);
}, []);
return {
pageNum,

View File

@@ -1,4 +1,5 @@
import type { AppProps, NextWebVitalsMetric } from 'next/app';
import { useEffect } from 'react';
import type { AppProps } from 'next/app';
import Script from 'next/script';
import Head from 'next/head';
import { ChakraProvider, ColorModeScript } from '@chakra-ui/react';
@@ -9,6 +10,7 @@ import NProgress from 'nprogress'; //nprogress module
import Router from 'next/router';
import 'nprogress/nprogress.css';
import '../styles/reset.scss';
import { useToast } from '@/hooks/useToast';
//Binding events.
Router.events.on('routeChangeStart', () => NProgress.start());
@@ -27,6 +29,17 @@ const queryClient = new QueryClient({
});
export default function App({ Component, pageProps }: AppProps) {
const { toast } = useToast();
// check that the browser supports the click event
useEffect(() => {
if (typeof document.createElement('div').click !== 'function') {
toast({
title: '你的浏览器版本过低',
status: 'warning'
});
}
}, [toast]);
return (
<>
<Head>
@@ -34,11 +47,10 @@ export default function App({ Component, pageProps }: AppProps) {
<meta name="description" content="Generated by Fast GPT" />
<meta
name="viewport"
content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=0;"
content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=0"
/>
<link rel="icon" href="/favicon.ico" />
</Head>
<Script src="/js/iconfont.js" strategy="afterInteractive"></Script>
<Script src="/js/qrcode.min.js" strategy="afterInteractive"></Script>
<Script src="/js/pdf.js" strategy="afterInteractive"></Script>
<QueryClientProvider client={queryClient}>

View File

@@ -1,8 +1,7 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { createParser, ParsedEvent, ReconnectInterval } from 'eventsource-parser';
import { connectToDatabase } from '@/service/mongo';
import { getOpenAIApi, authChat } from '@/service/utils/chat';
import { httpsAgent } from '@/service/utils/tools';
import { httpsAgent, openaiChatFilter } from '@/service/utils/tools';
import { ChatCompletionRequestMessage, ChatCompletionRequestMessageRoleEnum } from 'openai';
import { ChatItemType } from '@/types/chat';
import { jsonRes } from '@/service/response';
@@ -10,7 +9,7 @@ import type { ModelSchema } from '@/types/mongoSchema';
import { PassThrough } from 'stream';
import { modelList } from '@/constants/model';
import { pushChatBill } from '@/service/events/pushBill';
import { openaiChatFilter } from '@/service/utils/tools';
import { gpt35StreamResponse } from '@/service/utils/openai';
/* send the prompt */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
@@ -97,57 +96,21 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
{
timeout: 40000,
responseType: 'stream',
httpsAgent
httpsAgent: httpsAgent(!userApiKey)
}
);
console.log('api response time:', `${(Date.now() - startTime) / 1000}s`);
// create the response stream
res.setHeader('Content-Type', 'text/event-stream;charset=utf-8');
res.setHeader('Access-Control-Allow-Origin', '*');
res.setHeader('X-Accel-Buffering', 'no');
res.setHeader('Cache-Control', 'no-cache, no-transform');
step = 1;
let responseContent = '';
stream.pipe(res);
const onParse = async (event: ParsedEvent | ReconnectInterval) => {
if (event.type !== 'event') return;
const data = event.data;
if (data === '[DONE]') return;
try {
const json = JSON.parse(data);
const content: string = json?.choices?.[0].delta.content || '';
if (!content || (responseContent === '' && content === '\n')) return;
responseContent += content;
// console.log('content:', content)
!stream.destroyed && stream.push(content.replace(/\n/g, '<br/>'));
} catch (error) {
error;
}
};
const decoder = new TextDecoder();
try {
for await (const chunk of chatResponse.data as any) {
if (stream.destroyed) {
// the stream was interrupted; ignore the remaining chunks
break;
}
const parser = createParser(onParse);
parser.feed(decoder.decode(chunk));
}
} catch (error) {
console.log('pipe error', error);
}
// close stream
!stream.destroyed && stream.push(null);
stream.destroy();
const { responseContent } = await gpt35StreamResponse({
res,
stream,
chatResponse
});
const promptsContent = formatPrompts.map((item) => item.content).join('');
// only bill when the platform's key is used
pushChatBill({
isPay: !userApiKey,

View File

@@ -7,9 +7,8 @@ import type { ModelSchema } from '@/types/mongoSchema';
/* get my model */
export default async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
try {
const { modelId, isShare = 'false' } = req.query as {
const { modelId } = req.query as {
modelId: string;
isShare?: 'true' | 'false';
};
const { authorization } = req.headers;
@@ -26,23 +25,20 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
await connectToDatabase();
// fetch the model config
// verify the model belongs to the user
const model = await Model.findOne<ModelSchema>({
_id: modelId,
userId
});
if (!model) {
throw new Error('模型不存在');
throw new Error('无权使用该模型');
}
// create the chat record
const response = await Chat.create({
userId,
modelId,
expiredTime: Date.now() + model.security.expiredTime,
loadAmount: model.security.maxLoadAmount,
isShare: isShare === 'true',
content: []
});

View File

@@ -1,173 +0,0 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { createParser, ParsedEvent, ReconnectInterval } from 'eventsource-parser';
import { connectToDatabase } from '@/service/mongo';
import { getOpenAIApi, authChat } from '@/service/utils/chat';
import { httpsAgent } from '@/service/utils/tools';
import { ChatItemType } from '@/types/chat';
import { jsonRes } from '@/service/response';
import type { ModelSchema } from '@/types/mongoSchema';
import { PassThrough } from 'stream';
import { modelList } from '@/constants/model';
import { pushChatBill } from '@/service/events/pushBill';
/* send the prompt */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
let step = 0; // step=1 means the stream response has started
const stream = new PassThrough();
stream.on('error', () => {
console.log('error: ', 'stream error');
stream.destroy();
});
res.on('close', () => {
stream.destroy();
});
res.on('error', () => {
console.log('error: ', 'request error');
stream.destroy();
});
try {
const { chatId, prompt } = req.body as {
prompt: ChatItemType;
chatId: string;
};
const { authorization } = req.headers;
if (!chatId || !prompt) {
throw new Error('缺少参数');
}
await connectToDatabase();
const { chat, userApiKey, systemKey, userId } = await authChat(chatId, authorization);
const model: ModelSchema = chat.modelId;
// read the conversation history
const prompts = [...chat.content, prompt];
// trim the context to the max length
const maxContext = model.security.contextMaxLen;
const filterPrompts =
prompts.length > maxContext ? prompts.slice(prompts.length - maxContext) : prompts;
// format the text content
const formatPrompts: string[] = filterPrompts.map((item: ChatItemType) => item.value);
// auto-insert the system prompt if present
if (model.systemPrompt) {
formatPrompts.unshift(`${model.systemPrompt}`);
}
const promptText = formatPrompts.join('</s>');
// compute the temperature
const modelConstantsData = modelList.find((item) => item.model === model.service.modelName);
if (!modelConstantsData) {
throw new Error('模型异常,请用 chatgpt 模型');
}
const temperature = modelConstantsData.maxTemperature * (model.temperature / 10);
// get the chat API client
const chatAPI = getOpenAIApi(userApiKey || systemKey);
let startTime = Date.now();
// console.log({
// model: model.service.chatModel,
// temperature: temperature,
// prompt: promptText,
// stream: true,
// max_tokens:
// model.trainingTimes > 0 ? modelConstantsData.trainedMaxToken : modelConstantsData.maxToken,
// presence_penalty: -0.5, // higher values encourage new content
// frequency_penalty: 0.5, // higher values reduce repetition
// stop: [`###`]
// });
// send the request
const chatResponse = await chatAPI.createCompletion(
{
model: model.service.chatModel,
temperature: temperature,
prompt: promptText,
stream: true,
max_tokens:
model.trainingTimes > 0
? modelConstantsData.trainedMaxToken
: modelConstantsData.maxToken,
presence_penalty: -0.5, // higher values encourage new content
frequency_penalty: 0.5, // higher values reduce repetition
stop: [`###`, '。!?.!.']
},
{
timeout: 40000,
responseType: 'stream',
httpsAgent
}
);
console.log('api response time:', `${(Date.now() - startTime) / 1000}s`);
// create the response stream
res.setHeader('Content-Type', 'text/event-stream;charset=utf-8');
res.setHeader('Access-Control-Allow-Origin', '*');
res.setHeader('X-Accel-Buffering', 'no');
res.setHeader('Cache-Control', 'no-cache, no-transform');
step = 1;
let responseContent = '';
stream.pipe(res);
const onParse = async (event: ParsedEvent | ReconnectInterval) => {
if (event.type !== 'event') return;
const data = event.data;
if (data === '[DONE]') return;
try {
const json = JSON.parse(data);
const content: string = json?.choices?.[0].text || '';
// console.log('content:', content);
if (!content || (responseContent === '' && content === '\n')) return;
responseContent += content;
!stream.destroyed && stream.push(content.replace(/\n/g, '<br/>'));
} catch (error) {
error;
}
};
const decoder = new TextDecoder();
try {
for await (const chunk of chatResponse.data as any) {
if (stream.destroyed) {
// the stream was interrupted; ignore the remaining chunks
break;
}
const parser = createParser(onParse);
parser.feed(decoder.decode(chunk));
}
} catch (error) {
console.log('pipe error', error);
}
// close stream
!stream.destroyed && stream.push(null);
stream.destroy();
// only bill when the platform's key is used
pushChatBill({
isPay: !userApiKey,
modelName: model.service.modelName,
userId,
chatId,
text: promptText + responseContent
});
} catch (err: any) {
// console.log(err?.response);
if (step === 1) {
// end the stream directly
console.log('error结束');
stream.destroy();
} else {
res.status(500);
jsonRes(res, {
code: 500,
error: err
});
}
}
}

View File

@@ -1,18 +1,19 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { createParser, ParsedEvent, ReconnectInterval } from 'eventsource-parser';
import { connectToDatabase, ModelData } from '@/service/mongo';
import { getOpenAIApi, authChat } from '@/service/utils/chat';
import { connectToDatabase } from '@/service/mongo';
import { authChat } from '@/service/utils/chat';
import { httpsAgent, openaiChatFilter, systemPromptFilter } from '@/service/utils/tools';
import { ChatCompletionRequestMessage, ChatCompletionRequestMessageRoleEnum } from 'openai';
import { ChatItemType } from '@/types/chat';
import { jsonRes } from '@/service/response';
import type { ModelSchema } from '@/types/mongoSchema';
import { PassThrough } from 'stream';
import { modelList } from '@/constants/model';
import { modelList, ModelVectorSearchModeMap, ModelVectorSearchModeEnum } from '@/constants/model';
import { pushChatBill } from '@/service/events/pushBill';
import { connectRedis } from '@/service/redis';
import { VecModelDataIndex } from '@/constants/redis';
import { VecModelDataPrefix } from '@/constants/redis';
import { vectorToBuffer } from '@/utils/tools';
import { openaiCreateEmbedding, gpt35StreamResponse } from '@/service/utils/openai';
import dayjs from 'dayjs';
/* send the prompt */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
@@ -56,34 +57,25 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
// read the conversation history
const prompts = [...chat.content, prompt];
// get the chat API client
const chatAPI = getOpenAIApi(userApiKey || systemKey);
// embed the prompt to get its vector
const { vector: promptVector, chatAPI } = await openaiCreateEmbedding({
isPay: !userApiKey,
apiKey: userApiKey || systemKey,
userId,
text: prompt.value
});
// convert the input into a vector
const promptVector = await chatAPI
.createEmbedding(
{
model: 'text-embedding-ada-002',
input: prompt.value
},
{
timeout: 120000,
httpsAgent
}
)
.then((res) => res?.data?.data?.[0]?.embedding || []);
// search system prompts: take the top 3 entries with distinct dataIds from redis by similarity
const similarity = ModelVectorSearchModeMap[model.search.mode]?.similarity || 0.22;
// search system prompts: pull the related q and text from redis by similarity
const redisData: any[] = await redis.sendCommand([
'FT.SEARCH',
`idx:${VecModelDataIndex}:hash`,
`idx:${VecModelDataPrefix}:hash`,
`@modelId:{${String(
chat.modelId._id
)}} @vector:[VECTOR_RANGE 0.15 $blob]=>{$YIELD_DISTANCE_AS: score}`,
// `@modelId:{${String(chat.modelId._id)}}=>[KNN 10 @vector $blob AS score]`,
)}} @vector:[VECTOR_RANGE ${similarity} $blob]=>{$YIELD_DISTANCE_AS: score}`,
'RETURN',
'1',
'dataId',
'text',
'SORTBY',
'score',
'PARAMS',
@@ -92,48 +84,48 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
vectorToBuffer(promptVector),
'LIMIT',
'0',
'20',
'30',
'DIALECT',
'2'
]);
// format the reply and collect the deduplicated ids
let formatIds = [2, 4, 6, 8, 10, 12, 14, 16, 18, 20]
.map((i) => {
if (!redisData[i] || !redisData[i][1]) return '';
return redisData[i][1];
})
.filter((item) => item);
formatIds = Array.from(new Set(formatIds));
if (formatIds.length === 0) {
throw new Error('对不起,我没有找到你的问题');
const formatRedisPrompt: string[] = [];
// format the reply and collect the QA text
for (let i = 2; i < 61; i += 2) {
const text = redisData[i]?.[1];
if (text) {
formatRedisPrompt.push(text);
}
}
// fetch the original text from mongo as the prompt
const textArr = (
await Promise.all(
[2, 4, 6, 8, 10, 12, 14, 16, 18, 20].map((i) => {
if (!redisData[i] || !redisData[i][1]) return '';
return ModelData.findById(redisData[i][1])
.select('text q')
.then((res) => {
if (!res) return '';
const questions = res.q.map((item) => item.text).join(' ');
const answer = res.text;
return `${questions} ${answer}`;
});
})
)
).filter((item) => item);
/* high similarity + exit: bail out when nothing matches */
if (
formatRedisPrompt.length === 0 &&
model.search.mode === ModelVectorSearchModeEnum.hightSimilarity
) {
return res.send('对不起,你的问题不在知识库中。');
}
/* high similarity + no context: skip adding extra knowledge */
if (
formatRedisPrompt.length === 0 &&
model.search.mode === ModelVectorSearchModeEnum.noContext
) {
prompts.unshift({
obj: 'SYSTEM',
value: model.systemPrompt
});
} else {
// when there are matches, add the knowledge-base content
// filter the system prompt, at most 2800 tokens
const systemPrompt = systemPromptFilter(formatRedisPrompt, 2800);
// filter textArr, at most 3000 tokens
const systemPrompt = systemPromptFilter(textArr, 2800);
prompts.unshift({
obj: 'SYSTEM',
value: `根据下面的知识回答问题: ${systemPrompt}`
});
prompts.unshift({
obj: 'SYSTEM',
value: `${model.systemPrompt} 用知识库内容回答,知识库内容为: "当前时间:${dayjs().format(
'YYYY/MM/DD HH:mm:ss'
)} ${systemPrompt}"`
});
}
// cap the token count to avoid exceeding the limit
const filterPrompts = openaiChatFilter(prompts, modelConstantsData.contextMaxToken);
@@ -168,55 +160,19 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
{
timeout: 40000,
responseType: 'stream',
httpsAgent
httpsAgent: httpsAgent(!userApiKey)
}
);
console.log('api response time:', `${(Date.now() - startTime) / 1000}s`);
// create the response stream
res.setHeader('Content-Type', 'text/event-stream;charset=utf-8');
res.setHeader('Access-Control-Allow-Origin', '*');
res.setHeader('X-Accel-Buffering', 'no');
res.setHeader('Cache-Control', 'no-cache, no-transform');
step = 1;
let responseContent = '';
stream.pipe(res);
const onParse = async (event: ParsedEvent | ReconnectInterval) => {
if (event.type !== 'event') return;
const data = event.data;
if (data === '[DONE]') return;
try {
const json = JSON.parse(data);
const content: string = json?.choices?.[0].delta.content || '';
if (!content || (responseContent === '' && content === '\n')) return;
responseContent += content;
// console.log('content:', content)
!stream.destroyed && stream.push(content.replace(/\n/g, '<br/>'));
} catch (error) {
error;
}
};
const decoder = new TextDecoder();
try {
for await (const chunk of chatResponse.data as any) {
if (stream.destroyed) {
// the stream was interrupted; ignore the remaining chunks
break;
}
const parser = createParser(onParse);
parser.feed(decoder.decode(chunk));
}
} catch (error) {
console.log('pipe error', error);
}
// close stream
!stream.destroyed && stream.push(null);
stream.destroy();
const { responseContent } = await gpt35StreamResponse({
res,
stream,
chatResponse
});
const promptsContent = formatPrompts.map((item) => item.content).join('');
// only bill when the platform's key is used
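The raw `redis.sendCommand(['FT.SEARCH', ...])` reply above is a flat array: the total count, then alternating document key and field/value array. That is why the new loop reads even indices starting at 2 and takes element `[1]` of each field array (`LIMIT 0 30` gives the `i < 61` bound). A sketch of that parsing, plus an assumed `vectorToBuffer` (RediSearch FLOAT32 blobs are raw little-endian float bytes; the real helper lives in `@/utils/tools` and is not shown in this diff):

```typescript
// Reply shape: [total, key1, [field, value], key2, [field, value], ...]
// collectField mirrors the `for (let i = 2; i < 61; i += 2)` loop in this hunk.
function collectField(redisData: any[], limit = 30): string[] {
  const out: string[] = [];
  for (let i = 2; i < 1 + limit * 2; i += 2) {
    const text = redisData[i]?.[1];
    if (text) out.push(text);
  }
  return out;
}

// Assumed implementation of vectorToBuffer for the PARAMS blob.
function vectorToBuffer(vector: number[]): Buffer {
  return Buffer.from(new Float32Array(vector).buffer);
}

const reply = [2, 'model:data:a', ['text', 'Q1 A1'], 'model:data:b', ['text', 'Q2 A2']];
console.log(collectField(reply)); // [ 'Q1 A1', 'Q2 A2' ]
console.log(vectorToBuffer([1, 0]).length); // 8 bytes = 2 float32 values
```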

View File

@@ -1,9 +1,7 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, ModelData } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { connectRedis } from '@/service/redis';
import { VecModelDataIndex } from '@/constants/redis';
export default async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
try {
@@ -23,25 +21,15 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
// credential check
const userId = await authToken(authorization);
await connectToDatabase();
const redis = await connectRedis();
const data = await ModelData.findById(dataId);
await ModelData.deleteOne({
_id: dataId,
userId
});
// delete the redis data
data?.q.forEach(async (item) => {
try {
await redis.json.del(`${VecModelDataIndex}:${item.id}`);
} catch (error) {
console.log(error);
}
});
// verify the data belongs to this user
const dataItemUserId = await redis.hGet(dataId, 'userId');
if (dataItemUserId !== userId) {
throw new Error('无权操作');
}
// delete
await redis.del(dataId);
jsonRes(res);
} catch (err) {
console.log(err);

View File

@@ -0,0 +1,60 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { connectRedis } from '@/service/redis';
import { VecModelDataIdx } from '@/constants/redis';
export default async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
try {
let { modelId } = req.query as {
modelId: string;
};
const { authorization } = req.headers;
if (!authorization) {
throw new Error('无权操作');
}
if (!modelId) {
throw new Error('缺少参数');
}
// credential check
const userId = await authToken(authorization);
await connectToDatabase();
const redis = await connectRedis();
// fetch the data from redis
const searchRes = await redis.ft.search(
VecModelDataIdx,
`@modelId:{${modelId}} @userId:{${userId}}`,
{
RETURN: ['q', 'text'],
LIMIT: {
from: 0,
size: 10000
}
}
);
const data: [string, string][] = [];
searchRes.documents.forEach((item: any) => {
if (item.value.q && item.value.text) {
data.push([item.value.q.replace(/\n/g, '\\n'), item.value.text.replace(/\n/g, '\\n')]);
}
});
jsonRes(res, {
data
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}
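The export above flattens each stored `q`/`text` into a single CSV cell by escaping newlines, and the CSV import endpoint further down reverses it before deduplicating. The round trip in isolation:

```typescript
// Matches the replace(/\n/g, '\\n') on export and replace(/\\n/g, '\n')
// on import seen in these hunks.
function escapeNewlines(s: string): string {
  return s.replace(/\n/g, '\\n');
}
function unescapeNewlines(s: string): string {
  return s.replace(/\\n/g, '\n');
}

const answer = 'line one\nline two';
const cell = escapeNewlines(answer); // single line: "line one\nline two"
console.log(unescapeNewlines(cell) === answer); // true
```

Note the trip is lossy when the source text already contains a literal backslash-n sequence: both directions treat it as an escaped newline.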

View File

@@ -0,0 +1,36 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import axios from 'axios';
import { httpsAgent } from '@/service/utils/tools';
/**
* Read the content of a web page
*/
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
const { url } = req.body as { url: string };
if (!url) {
throw new Error('缺少 url');
}
await connectToDatabase();
const { authorization } = req.headers;
await authToken(authorization);
const data = await axios
.get(url, {
httpsAgent: httpsAgent(false)
})
.then((res) => res.data as string);
jsonRes(res, { data });
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}
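Several hunks change `httpsAgent` from a constant to a call: `httpsAgent(!userApiKey)` enables the proxy only when the platform key pays for the request, while `httpsAgent(false)` (as in this URL reader) skips it. The real factory lives in `@/service/utils/tools` and is not shown in this diff; a hedged sketch of its likely shape (`AXIOS_PROXY_HOST` is an assumption):

```typescript
import { Agent } from 'https';

// Hypothetical sketch: build an agent only when proxying is requested and a
// proxy host is configured; both the env name and the fallback are assumptions.
function httpsAgent(useProxy: boolean): Agent | undefined {
  if (!useProxy || !process.env.AXIOS_PROXY_HOST) return undefined;
  // a real implementation would return a proxy-tunneling agent here
  return new Agent({ keepAlive: true });
}

console.log(httpsAgent(false)); // undefined
```

Returning `undefined` lets axios and the openai client fall back to a direct connection.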

View File

@@ -1,7 +1,10 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, ModelData } from '@/service/mongo';
import { connectToDatabase } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { connectRedis } from '@/service/redis';
import { VecModelDataIdx } from '@/constants/redis';
import { SearchOptions } from 'redis';
export default async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
try {
@@ -32,24 +35,34 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
const userId = await authToken(authorization);
await connectToDatabase();
const redis = await connectRedis();
const data = await ModelData.find({
modelId,
userId
})
.sort({ _id: -1 }) // newest first, by creation time
.skip((pageNum - 1) * pageSize)
.limit(pageSize);
// fetch the data from redis
const searchRes = await redis.ft.search(
VecModelDataIdx,
`@modelId:{${modelId}} @userId:{${userId}}`,
{
RETURN: ['q', 'text', 'status'],
LIMIT: {
from: (pageNum - 1) * pageSize,
size: pageSize
},
SORTBY: {
BY: 'modelId',
DIRECTION: 'DESC'
}
}
);
jsonRes(res, {
data: {
pageNum,
pageSize,
data,
total: await ModelData.countDocuments({
modelId,
userId
})
data: searchRes.documents.map((item) => ({
id: item.id,
...item.value
})),
total: searchRes.total
}
});
} catch (err) {

View File

@@ -24,7 +24,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
});
jsonRes(res, {
data
data: data.map((item) => item.textList).flat().length
});
} catch (err) {
jsonRes(res, {

View File

@@ -0,0 +1,101 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, Model } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { generateVector } from '@/service/events/generateVector';
import { connectRedis } from '@/service/redis';
import { VecModelDataPrefix, ModelDataStatusEnum } from '@/constants/redis';
import { VecModelDataIdx } from '@/constants/redis';
import { customAlphabet } from 'nanoid';
const nanoid = customAlphabet('abcdefghijklmnopqrstuvwxyz1234567890', 12);
export default async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
try {
const { modelId, data } = req.body as {
modelId: string;
data: string[][];
};
const { authorization } = req.headers;
if (!authorization) {
throw new Error('无权操作');
}
if (!modelId || !Array.isArray(data)) {
throw new Error('缺少参数');
}
// credential check
const userId = await authToken(authorization);
await connectToDatabase();
const redis = await connectRedis();
// verify the model belongs to this user
const model = await Model.findOne({
_id: modelId,
userId
});
if (!model) {
throw new Error('无权操作该模型');
}
// deduplicate
const searchRes = await Promise.allSettled(
data.map(async ([q, a]) => {
try {
q = q.replace(/\\n/g, '\n');
a = a.replace(/\\n/g, '\n');
const redisSearch = await redis.ft.search(VecModelDataIdx, `@q:${q} @text:${a}`, {
RETURN: ['q', 'text']
});
if (redisSearch.total > 0) {
return Promise.reject('已经存在');
}
} catch (error) {
error;
}
return Promise.resolve({
q,
a
});
})
);
const filterData = searchRes
.filter((item) => item.status === 'fulfilled')
.map<{ q: string; a: string }>((item: any) => item.value);
// insert into redis
const insertRedisRes = await Promise.allSettled(
filterData.map((item) => {
return redis.sendCommand([
'HMSET',
`${VecModelDataPrefix}:${nanoid()}`,
'userId',
userId,
'modelId',
String(modelId),
'q',
item.q,
'text',
item.a,
'status',
ModelDataStatusEnum.waiting
]);
})
);
generateVector();
jsonRes(res, {
data: insertRedisRes.filter((item) => item.status === 'fulfilled').length
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}
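The CSV import above checks each row against the index, rejects duplicates inside `Promise.allSettled`, and keeps only the fulfilled entries. The same shape with the redis lookup stubbed out (`exists` stands in for the `FT.SEARCH` call; as in the diff, lookup errors are swallowed, so a failed check does not drop the row):

```typescript
// Dedup-then-keep pattern from the CSV import; `exists` is a hypothetical
// stand-in for the per-row redis FT.SEARCH existence check.
async function dedupeRows(
  rows: [string, string][],
  exists: (q: string, a: string) => Promise<boolean>
): Promise<{ q: string; a: string }[]> {
  const settled = await Promise.allSettled(
    rows.map(async ([q, a]) => {
      try {
        if (await exists(q, a)) return Promise.reject('已经存在');
      } catch (error) {
        // as in the diff: lookup failures are ignored and the row is kept
      }
      return { q, a };
    })
  );
  return settled
    .filter(
      (r): r is PromiseFulfilledResult<{ q: string; a: string }> => r.status === 'fulfilled'
    )
    .map((r) => r.value);
}

dedupeRows(
  [
    ['q1', 'a1'],
    ['q2', 'a2']
  ],
  async (q) => q === 'q1' // pretend q1 already exists
).then((rows) => console.log(rows)); // [ { q: 'q2', a: 'a2' } ]
```

The endpoint then reports `insertRedisRes.filter((item) => item.status === 'fulfilled').length`, i.e. the number of rows actually written.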

View File

@@ -1,9 +1,11 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, ModelData, Model } from '@/service/mongo';
import { connectToDatabase, Model } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { ModelDataSchema } from '@/types/mongoSchema';
import { generateVector } from '@/service/events/generateVector';
import { connectRedis } from '@/service/redis';
import { VecModelDataPrefix, ModelDataStatusEnum } from '@/constants/redis';
export default async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
try {
@@ -25,6 +27,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
const userId = await authToken(authorization);
await connectToDatabase();
const redis = await connectRedis();
// verify the model belongs to this user
const model = await Model.findOne({
@@ -36,19 +39,29 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
throw new Error('无权操作该模型');
}
// push data
await ModelData.insertMany(
data.map((item) => ({
...item,
modelId,
userId
}))
const insertRes = await Promise.allSettled(
data.map((item) => {
return redis.sendCommand([
'HMSET',
`${VecModelDataPrefix}:${item.q.id}`,
'userId',
userId,
'modelId',
modelId,
'q',
item.q.text,
'text',
item.text,
'status',
ModelDataStatusEnum.waiting
]);
})
);
generateVector(true);
generateVector();
jsonRes(res, {
data: model
data: insertRes.filter((item) => item.status === 'rejected').length
});
} catch (err) {
jsonRes(res, {

View File

@@ -1,57 +0,0 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, DataItem, ModelData } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { customAlphabet } from 'nanoid';
const nanoid = customAlphabet('abcdefghijklmnopqrstuvwxyz1234567890', 12);
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
let { dataIds, modelId } = req.body as { dataIds: string[]; modelId: string };
if (!dataIds) {
throw new Error('参数错误');
}
await connectToDatabase();
const { authorization } = req.headers;
const userId = await authToken(authorization);
const dataItems = (
await Promise.all(
dataIds.map((dataId) =>
DataItem.find<{ _id: string; result: { q: string }[]; text: string }>(
{
userId,
dataId
},
'result text'
)
)
)
).flat();
// push data
await ModelData.insertMany(
dataItems.map((item) => ({
modelId: modelId,
userId,
text: item.text,
q: item.result.map((item) => ({
id: nanoid(),
text: item.q
}))
}))
);
jsonRes(res, {
data: dataItems
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}

View File

@@ -1,14 +1,13 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, ModelData } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { connectRedis } from '@/service/redis';
import { ModelDataStatusEnum } from '@/constants/redis';
import { generateVector } from '@/service/events/generateVector';
export default async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
try {
let { dataId, text } = req.body as {
dataId: string;
text: string;
};
const { dataId, text, q } = req.body as { dataId: string; text: string; q?: string };
const { authorization } = req.headers;
if (!authorization) {
@@ -22,17 +21,26 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
// credential check
const userId = await authToken(authorization);
await connectToDatabase();
const redis = await connectRedis();
await ModelData.updateOne(
{
_id: dataId,
userId
},
{
text
}
);
// verify this data belongs to the user
const dataItemUserId = await redis.hGet(dataId, 'userId');
if (dataItemUserId !== userId) {
throw new Error('无权操作');
}
// update
await redis.sendCommand([
'HMSET',
dataId,
...(q ? ['q', q, 'status', ModelDataStatusEnum.waiting] : []),
'text',
text
]);
if (q) {
generateVector();
}
jsonRes(res);
} catch (err) {

View File

@@ -8,8 +8,8 @@ import { encode } from 'gpt-token-utils';
/* split data into QA pairs */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
const { text, modelId } = req.body as { text: string; modelId: string };
if (!text || !modelId) {
const { text, modelId, prompt } = req.body as { text: string; modelId: string; prompt: string };
if (!text || !modelId || !prompt) {
throw new Error('参数错误');
}
await connectToDatabase();
@@ -31,17 +31,25 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
const replaceText = text.replace(/(\\n|\n)+/g, ' ');
// split the text into chunks
let chunks = replaceText.match(/[^!?.。]+[!?.。]/g) || [];
const chunks = replaceText.match(/[^!?.。]+[!?.。]/g) || [];
const textList: string[] = [];
let splitText = '';
/* take 3k–4k tokens of content per slice */
chunks.forEach((chunk) => {
splitText += chunk;
const tokens = encode(splitText).length;
if (tokens >= 980) {
const tokens = encode(splitText + chunk).length;
if (tokens >= 4000) {
// over 4000: exclude this chunk; start a new slice with it
textList.push(splitText);
splitText = chunk;
} else if (tokens >= 3000) {
// 3000 to 4000: take the combined content
textList.push(splitText + chunk);
splitText = '';
} else {
// under 3000: keep accumulating
splitText += chunk;
}
});
@@ -54,7 +62,8 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
userId,
modelId,
rawText: text,
textList
textList,
prompt
});
generateQA();
@@ -67,3 +76,11 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
});
}
}
export const config = {
api: {
bodyParser: {
sizeLimit: '10mb'
}
}
};
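The rewritten loop above packs sentence chunks into slices of roughly 3000–4000 tokens. A runnable sketch of the same packing logic, with a word-count stand-in for `encode(...).length` from gpt-token-utils and the thresholds as parameters so the behavior is testable at small sizes (the trailing flush is an addition here; the diff's handling of the final `splitText` falls outside the shown hunk):

```typescript
// Stand-in tokenizer: word count instead of gpt-token-utils' encode().length.
const countTokens = (s: string): number => s.split(/\s+/).filter(Boolean).length;

// Pack chunks into slices of roughly min..max tokens, mirroring the diff's loop.
function packChunks(chunks: string[], min = 3000, max = 4000): string[] {
  const textList: string[] = [];
  let splitText = '';
  for (const chunk of chunks) {
    const tokens = countTokens(splitText + chunk);
    if (tokens >= max) {
      // over max: flush the accumulated slice without this chunk
      if (splitText) textList.push(splitText);
      splitText = chunk;
    } else if (tokens >= min) {
      // inside the target window: take the combined slice
      textList.push(splitText + chunk);
      splitText = '';
    } else {
      // under min: keep accumulating
      splitText += chunk;
    }
  }
  // flush whatever remains (assumption: the handler keeps the tail too)
  if (splitText) textList.push(splitText);
  return textList;
}
```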

View File

@@ -1,13 +1,13 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { Chat, Model, Training, connectToDatabase, ModelData } from '@/service/mongo';
import { authToken, getUserApiOpenai } from '@/service/utils/tools';
import { Chat, Model, Training, connectToDatabase } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { getUserApiOpenai } from '@/service/utils/openai';
import { TrainingStatusEnum } from '@/constants/model';
import { getOpenAIApi } from '@/service/utils/chat';
import { TrainingItemType } from '@/types/training';
import { httpsAgent } from '@/service/utils/tools';
import { connectRedis } from '@/service/redis';
import { VecModelDataIndex } from '@/constants/redis';
import { VecModelDataIdx } from '@/constants/redis';
/* delete model */
export default async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
@@ -26,39 +26,38 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
// credential check
const userId = await authToken(authorization);
// verify the model belongs to this user
const model = await Model.findOne({
_id: modelId,
userId
});
if (!model) {
throw new Error('无权操作该模型');
}
await connectToDatabase();
const redis = await connectRedis();
const modelDataList = await ModelData.find({
// fetch all data linked to this model from redis
const searchRes = await redis.ft.search(
VecModelDataIdx,
`@modelId:{${modelId}} @userId:{${userId}}`,
{
LIMIT: {
from: 0,
size: 10000
}
}
);
// delete the redis entries
await Promise.all(searchRes.documents.map((item) => redis.del(item.id)));
// delete the related chats
await Chat.deleteMany({
modelId
});
// delete from redis
modelDataList?.forEach((modelData) =>
modelData.q.forEach(async (item) => {
try {
await redis.json.del(`${VecModelDataIndex}:${item.id}`);
} catch (error) {
console.log(error);
}
})
);
let requestQueue: any[] = [];
// delete the related chats
requestQueue.push(
Chat.deleteMany({
modelId
})
);
// delete the dataset
requestQueue.push(
ModelData.deleteMany({
modelId
})
);
// check whether training is in progress
const training: TrainingItemType | null = await Training.findOne({
modelId,
@@ -69,30 +68,26 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
if (training) {
const { openai } = await getUserApiOpenai(userId);
// fetch the training record
const tuneRecord = await openai.retrieveFineTune(training.tuneId, { httpsAgent });
const tuneRecord = await openai.retrieveFineTune(training.tuneId, {
httpsAgent: httpsAgent(false)
});
// delete the training file
openai.deleteFile(tuneRecord.data.training_files[0].id, { httpsAgent });
openai.deleteFile(tuneRecord.data.training_files[0].id, { httpsAgent: httpsAgent(false) });
// cancel the training
openai.cancelFineTune(training.tuneId, { httpsAgent });
openai.cancelFineTune(training.tuneId, { httpsAgent: httpsAgent(false) });
}
// delete the related training records
requestQueue.push(
Training.deleteMany({
modelId
})
);
await Training.deleteMany({
modelId
});
// delete the model
requestQueue.push(
Model.deleteOne({
_id: modelId,
userId
})
);
await Promise.all(requestQueue);
await Model.deleteOne({
_id: modelId,
userId
});
jsonRes(res);
} catch (err) {

View File

@@ -1,8 +1,8 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, Model, Training } from '@/service/mongo';
import { getOpenAIApi } from '@/service/utils/chat';
import { authToken, getUserApiOpenai } from '@/service/utils/tools';
import { authToken } from '@/service/utils/tools';
import { getUserApiOpenai } from '@/service/utils/openai';
import type { ModelSchema } from '@/types/mongoSchema';
import { TrainingItemType } from '@/types/training';
import { ModelStatusEnum, TrainingStatusEnum } from '@/constants/model';
@@ -46,11 +46,13 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
const { openai } = await getUserApiOpenai(userId);
// fetch training status from openai
const { data } = await openai.retrieveFineTune(training.tuneId, { httpsAgent });
const { data } = await openai.retrieveFineTune(training.tuneId, {
httpsAgent: httpsAgent(false)
});
// console.log(data);
if (data.status === OpenAiTuneStatusEnum.succeeded) {
// delete the training file
openai.deleteFile(data.training_files[0].id, { httpsAgent });
openai.deleteFile(data.training_files[0].id, { httpsAgent: httpsAgent(false) });
// update model status and content
await Model.findByIdAndUpdate(modelId, {
@@ -75,7 +77,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
/* cancelled fine-tune */
if (data.status === OpenAiTuneStatusEnum.cancelled) {
// delete the training file
openai.deleteFile(data.training_files[0].id, { httpsAgent });
openai.deleteFile(data.training_files[0].id, { httpsAgent: httpsAgent(false) });
// update the model
await Model.findByIdAndUpdate(modelId, {

View File

@@ -3,7 +3,8 @@ import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, Model, Training } from '@/service/mongo';
import formidable from 'formidable';
import { authToken, getUserApiOpenai } from '@/service/utils/tools';
import { authToken } from '@/service/utils/tools';
import { getUserApiOpenai } from '@/service/utils/openai';
import { join } from 'path';
import fs from 'fs';
import type { ModelSchema } from '@/types/mongoSchema';
@@ -74,7 +75,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
// @ts-ignore
fs.createReadStream(file.filepath),
'fine-tune',
{ httpsAgent }
{ httpsAgent: httpsAgent(false) }
);
uploadFileId = uploadRes.data.id; // record the uploaded file ID
@@ -86,7 +87,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
suffix: model.name,
n_epochs: 4
},
{ httpsAgent }
{ httpsAgent: httpsAgent(false) }
);
trainId = trainRes.data.id; // record the training ID
@@ -116,9 +117,9 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
// @ts-ignore
if (openai) {
// @ts-ignore
uploadFileId && openai.deleteFile(uploadFileId, { httpsAgent });
uploadFileId && openai.deleteFile(uploadFileId, { httpsAgent: httpsAgent(false) });
// @ts-ignore
trainId && openai.cancelFineTune(trainId, { httpsAgent });
trainId && openai.cancelFineTune(trainId, { httpsAgent: httpsAgent(false) });
}
jsonRes(res, {

View File

@@ -8,7 +8,7 @@ import type { ModelUpdateParams } from '@/types/model';
/* update model */
export default async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
try {
const { name, service, security, systemPrompt, intro, temperature } =
const { name, search, service, security, systemPrompt, intro, temperature } =
req.body as ModelUpdateParams;
const { modelId } = req.query as { modelId: string };
const { authorization } = req.headers;
@@ -37,6 +37,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
systemPrompt,
intro,
temperature,
search,
// service,
security
}

View File

@@ -0,0 +1,250 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { connectToDatabase, Model } from '@/service/mongo';
import { getOpenAIApi } from '@/service/utils/chat';
import { authOpenApiKey } from '@/service/utils/tools';
import { httpsAgent, openaiChatFilter, systemPromptFilter } from '@/service/utils/tools';
import { ChatCompletionRequestMessage, ChatCompletionRequestMessageRoleEnum } from 'openai';
import { ChatItemType } from '@/types/chat';
import { jsonRes } from '@/service/response';
import { PassThrough } from 'stream';
import { ChatModelNameEnum, modelList, ChatModelNameMap } from '@/constants/model';
import { pushChatBill } from '@/service/events/pushBill';
import { connectRedis } from '@/service/redis';
import { VecModelDataPrefix } from '@/constants/redis';
import { vectorToBuffer } from '@/utils/tools';
import { openaiCreateEmbedding, gpt35StreamResponse } from '@/service/utils/openai';
/* send prompt */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
let step = 0; // step=1 means the stream response has started
const stream = new PassThrough();
stream.on('error', () => {
console.log('error: ', 'stream error');
stream.destroy();
});
res.on('close', () => {
stream.destroy();
});
res.on('error', () => {
console.log('error: ', 'request error');
stream.destroy();
});
try {
const {
prompt,
modelId,
isStream = true
} = req.body as {
prompt: ChatItemType;
modelId: string;
isStream: boolean;
};
if (!prompt || !modelId) {
throw new Error('缺少参数');
}
await connectToDatabase();
const redis = await connectRedis();
let startTime = Date.now();
/* credential check */
const { apiKey, userId } = await authOpenApiKey(req);
/* 查找数据库里的模型信息 */
const model = await Model.findById(modelId);
if (!model) {
throw new Error('找不到模型');
}
const modelConstantsData = modelList.find(
(item) => item.model === ChatModelNameEnum.VECTOR_GPT
);
if (!modelConstantsData) {
throw new Error('模型已下架');
}
console.log('laf gpt start');
// get the chat API
const chatAPI = getOpenAIApi(apiKey);
// call chatgpt once to break the requirement into steps
const promptResponse = await chatAPI.createChatCompletion(
{
model: ChatModelNameMap[ChatModelNameEnum.GPT35],
temperature: 0,
frequency_penalty: 0.5, // higher means less repeated content
presence_penalty: -0.5, // higher makes new content more likely
messages: [
{
role: 'system',
content: `服务端逻辑生成器.根据用户输入的需求,拆解成代码实现的步骤,并按格式返回: 1.\n2.\n3.\n ......
下面是一些例子:
实现一个手机号发生注册验证码方法.
1. 从 query 中获取 phone.
2. 校验手机号格式是否正确,不正确则返回错误响应,消息为:手机号格式错误.
3. 给 phone 发送一个短信验证码,验证码长度为6位字符串,内容为:你正在注册laf,验证码为:code.
4. 数据库添加数据,表为"codes",内容为 {phone, code}.
实现根据手机号注册账号,需要验证手机验证码.
1. 从 body 中获取 phone 和 code.
2. 校验手机号格式是否正确,不正确返回错误响应,消息为:手机号格式错误.
2. 获取数据库数据,表为"codes",查找是否有符合 phone, code 等于body参数的记录,没有的话错误响应,消息为:验证码不正确.
4. 添加数据库数据,表为"users" ,内容为{phone, code, createTime}.
5. 删除数据库数据,删除 code 记录.
更新博客记录。传入blogId,blogText,tags,还需要记录更新的时间.
1. 从 body 中获取 blogId,blogText 和 tags.
2. 校验 blogId 是否为空,为空则错误响应,消息为:博客ID不能为空.
3. 校验 blogText 是否为空,为空则错误响应,消息为:博客内容不能为空.
4. 校验 tags 是否为数组,不是则错误响应,消息为:标签必须为数组.
5. 获取当前时间,记录为 updateTime.
6. 更新数据库数据,表为"blogs",更新符合 blogId 的记录的内容为{blogText, tags, updateTime}.
7. 返回结果 {message: "更新博客记录成功"}.`
},
{
role: 'user',
content: prompt.value
}
]
},
{
timeout: 120000,
httpsAgent: httpsAgent(true)
}
);
const promptResolve = promptResponse.data.choices?.[0]?.message?.content || '';
if (!promptResolve) {
throw new Error('gpt 异常');
}
prompt.value += ` ${promptResolve}`;
console.log('prompt resolve success, time:', `${(Date.now() - startTime) / 1000}s`);
// get the prompt's embedding vector
const { vector: promptVector } = await openaiCreateEmbedding({
isPay: true,
apiKey,
userId,
text: prompt.value
});
// read the conversation content
const prompts = [prompt];
// search system prompts: fetch related q and text from redis by similarity
const redisData: any[] = await redis.sendCommand([
'FT.SEARCH',
`idx:${VecModelDataPrefix}:hash`,
`@modelId:{${String(model._id)}}=>[KNN 20 @vector $blob AS score]`,
'RETURN',
'1',
'text',
'SORTBY',
'score',
'PARAMS',
'2',
'blob',
vectorToBuffer(promptVector),
'DIALECT',
'2'
]);
// format the response value and extract the qa pairs
const formatRedisPrompt: string[] = [];
for (let i = 2; i < 42; i += 2) {
const text = redisData[i]?.[1];
if (text) {
formatRedisPrompt.push(text);
}
}
// filter textArr down to at most 3200 tokens
const systemPrompt = systemPromptFilter(formatRedisPrompt, 3200);
prompts.unshift({
obj: 'SYSTEM',
value: `${model.systemPrompt} 知识库内容是最新的,知识库内容为: "${systemPrompt}"`
});
// cap the token count to avoid overflow
const filterPrompts = openaiChatFilter(prompts, modelConstantsData.contextMaxToken);
// format the text content into chatgpt message format
const map = {
Human: ChatCompletionRequestMessageRoleEnum.User,
AI: ChatCompletionRequestMessageRoleEnum.Assistant,
SYSTEM: ChatCompletionRequestMessageRoleEnum.System
};
const formatPrompts: ChatCompletionRequestMessage[] = filterPrompts.map(
(item: ChatItemType) => ({
role: map[item.obj],
content: item.value
})
);
// console.log(formatPrompts);
// compute the temperature
const temperature = modelConstantsData.maxTemperature * (model.temperature / 10);
// send the request
const chatResponse = await chatAPI.createChatCompletion(
{
model: model.service.chatModel,
temperature,
messages: formatPrompts,
frequency_penalty: 0.5, // higher means less repeated content
presence_penalty: -0.5, // higher makes new content more likely
stream: isStream
},
{
timeout: 120000,
responseType: isStream ? 'stream' : 'json',
httpsAgent: httpsAgent(true)
}
);
console.log('code response. time:', `${(Date.now() - startTime) / 1000}s`);
step = 1;
let responseContent = '';
if (isStream) {
const streamResponse = await gpt35StreamResponse({
res,
stream,
chatResponse
});
responseContent = streamResponse.responseContent;
} else {
responseContent = chatResponse.data.choices?.[0]?.message?.content || '';
jsonRes(res, {
data: responseContent
});
}
console.log('laf gpt done. time:', `${(Date.now() - startTime) / 1000}s`);
const promptsContent = formatPrompts.map((item) => item.content).join('');
pushChatBill({
isPay: true,
modelName: model.service.modelName,
userId,
text: promptsContent + responseContent
});
} catch (err: any) {
if (step === 1) {
// end the stream directly
console.log('error结束');
stream.destroy();
} else {
res.status(500);
jsonRes(res, {
code: 500,
error: err
});
}
}
}
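The loop `for (let i = 2; i < 42; i += 2)` above walks the raw `FT.SEARCH` reply, which is a flat array: total count first, then alternating document id and field array. A pure sketch of that parsing step, testable without RediSearch (the `[1]` index assumes a single `RETURN` field, as in the command above):

```typescript
// Parse a raw FT.SEARCH reply of shape [total, id, fields, id, fields, ...]
// where each `fields` entry is a flat [name, value] pair array.
function parseFtSearchTexts(reply: any[]): string[] {
  const texts: string[] = [];
  for (let i = 2; i < reply.length; i += 2) {
    const text = reply[i]?.[1]; // value of the single returned field
    if (text) texts.push(text);
  }
  return texts;
}
```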

View File

@@ -0,0 +1,213 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { connectToDatabase, Model } from '@/service/mongo';
import {
httpsAgent,
openaiChatFilter,
systemPromptFilter,
authOpenApiKey
} from '@/service/utils/tools';
import { ChatCompletionRequestMessage, ChatCompletionRequestMessageRoleEnum } from 'openai';
import { ChatItemType } from '@/types/chat';
import { jsonRes } from '@/service/response';
import { PassThrough } from 'stream';
import { modelList } from '@/constants/model';
import { pushChatBill } from '@/service/events/pushBill';
import { connectRedis } from '@/service/redis';
import { VecModelDataPrefix } from '@/constants/redis';
import { vectorToBuffer } from '@/utils/tools';
import { openaiCreateEmbedding, gpt35StreamResponse } from '@/service/utils/openai';
import dayjs from 'dayjs';
/* send prompt */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
let step = 0; // step=1 means the stream response has started
const stream = new PassThrough();
stream.on('error', () => {
console.log('error: ', 'stream error');
stream.destroy();
});
res.on('close', () => {
stream.destroy();
});
res.on('error', () => {
console.log('error: ', 'request error');
stream.destroy();
});
try {
const {
prompts,
modelId,
isStream = true
} = req.body as {
prompts: ChatItemType[];
modelId: string;
isStream: boolean;
};
if (!prompts || !modelId) {
throw new Error('缺少参数');
}
if (!Array.isArray(prompts)) {
throw new Error('prompts is not array');
}
if (prompts.length > 30 || prompts.length === 0) {
throw new Error('prompts length range 1-30');
}
await connectToDatabase();
const redis = await connectRedis();
let startTime = Date.now();
/* credential check */
const { apiKey, userId } = await authOpenApiKey(req);
const model = await Model.findOne({
_id: modelId,
userId
});
if (!model) {
throw new Error('无权使用该模型');
}
const modelConstantsData = modelList.find((item) => item.model === model?.service?.modelName);
if (!modelConstantsData) {
throw new Error('模型初始化异常');
}
// get the prompt's embedding vector
const { vector: promptVector, chatAPI } = await openaiCreateEmbedding({
isPay: true,
apiKey,
userId,
text: prompts[prompts.length - 1].value // take the last one
});
// search system prompts: fetch related q and text from redis by similarity
const redisData: any[] = await redis.sendCommand([
'FT.SEARCH',
`idx:${VecModelDataPrefix}:hash`,
`@modelId:{${modelId}} @vector:[VECTOR_RANGE 0.22 $blob]=>{$YIELD_DISTANCE_AS: score}`,
'RETURN',
'1',
'text',
'SORTBY',
'score',
'PARAMS',
'2',
'blob',
vectorToBuffer(promptVector),
'LIMIT',
'0',
'30',
'DIALECT',
'2'
]);
const formatRedisPrompt: string[] = [];
// format the response value and extract the qa pairs
for (let i = 2; i < 61; i += 2) {
const text = redisData[i]?.[1];
if (text) {
formatRedisPrompt.push(text);
}
}
// merge the SYSTEM prompt
if (prompts[0].obj === 'SYSTEM') {
formatRedisPrompt.unshift(prompts.shift()?.value || '');
}
if (formatRedisPrompt.length > 0) {
// filter system prompts to at most 2800 tokens
const systemPrompt = systemPromptFilter(formatRedisPrompt, 2800);
prompts.unshift({
obj: 'SYSTEM',
value: `${model.systemPrompt} 用知识库内容回答,知识库内容为: "当前时间:${dayjs().format(
'YYYY/MM/DD HH:mm:ss'
)} ${systemPrompt}"`
});
} else {
return res.send('对不起,你的问题不在知识库中。');
}
// cap the token count to avoid overflow
const filterPrompts = openaiChatFilter(prompts, modelConstantsData.contextMaxToken);
// format the text content into chatgpt message format
const map = {
Human: ChatCompletionRequestMessageRoleEnum.User,
AI: ChatCompletionRequestMessageRoleEnum.Assistant,
SYSTEM: ChatCompletionRequestMessageRoleEnum.System
};
const formatPrompts: ChatCompletionRequestMessage[] = filterPrompts.map(
(item: ChatItemType) => ({
role: map[item.obj],
content: item.value
})
);
// console.log(formatPrompts);
// compute the temperature
const temperature = modelConstantsData.maxTemperature * (model.temperature / 10);
// send the request
const chatResponse = await chatAPI.createChatCompletion(
{
model: model.service.chatModel,
temperature: temperature,
messages: formatPrompts,
frequency_penalty: 0.5, // higher means less repeated content
presence_penalty: -0.5, // higher makes new content more likely
stream: isStream
},
{
timeout: 120000,
responseType: isStream ? 'stream' : 'json',
httpsAgent: httpsAgent(true)
}
);
console.log('api response time:', `${(Date.now() - startTime) / 1000}s`);
step = 1;
let responseContent = '';
if (isStream) {
const streamResponse = await gpt35StreamResponse({
res,
stream,
chatResponse
});
responseContent = streamResponse.responseContent;
} else {
responseContent = chatResponse.data.choices?.[0]?.message?.content || '';
jsonRes(res, {
data: responseContent
});
}
const promptsContent = formatPrompts.map((item) => item.content).join('');
pushChatBill({
isPay: true,
modelName: model.service.modelName,
userId,
text: promptsContent + responseContent
});
// jsonRes(res);
} catch (err: any) {
if (step === 1) {
// end the stream directly
console.log('error结束');
stream.destroy();
} else {
res.status(500);
jsonRes(res, {
code: 500,
error: err
});
}
}
}
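Both chat handlers pass `vectorToBuffer(promptVector)` as the `$blob` parameter of the vector query. That helper is imported from `@/utils/tools` and not shown in this diff; a plausible sketch, assuming RediSearch's default FLOAT32 little-endian vector encoding:

```typescript
// Pack a number[] embedding into a FLOAT32 little-endian Buffer,
// the layout RediSearch expects for a FLOAT32 vector field.
function vectorToBuffer(vector: number[]): Buffer {
  const buf = Buffer.alloc(vector.length * 4);
  vector.forEach((v, i) => buf.writeFloatLE(v, i * 4));
  return buf;
}
```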

View File

@@ -0,0 +1,33 @@
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, OpenApi } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
const { id } = req.query as { id: string };
const { authorization } = req.headers;
if (!authorization) {
throw new Error('缺少登录凭证');
}
if (!id) {
throw new Error('缺少参数');
}
const userId = await authToken(authorization);
await connectToDatabase();
await OpenApi.findOneAndRemove({ _id: id, userId });
jsonRes(res);
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}

View File

@@ -0,0 +1,43 @@
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, OpenApi } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { UserOpenApiKey } from '@/types/openapi';
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
const { authorization } = req.headers;
if (!authorization) {
throw new Error('缺少登录凭证');
}
const userId = await authToken(authorization);
await connectToDatabase();
const findResponse = await OpenApi.find({ userId }).sort({ _id: -1 });
// only return the four display fields, with the key masked
const apiKeys = findResponse.map<UserOpenApiKey>(
({ _id, apiKey, createTime, lastUsedTime }) => {
return {
id: _id,
apiKey: `${apiKey.substring(0, 2)}******${apiKey.substring(apiKey.length - 2)}`,
createTime,
lastUsedTime
};
}
);
jsonRes(res, {
data: apiKeys
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}
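The list endpoint never returns the full key: everything but the first and last two characters is replaced with asterisks. Extracted as a one-line sketch:

```typescript
// Mask an API key the way the list endpoint does: keep only the
// first and last two characters visible.
function maskApiKey(apiKey: string): string {
  return `${apiKey.substring(0, 2)}******${apiKey.substring(apiKey.length - 2)}`;
}
```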

View File

@@ -0,0 +1,43 @@
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, OpenApi } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { customAlphabet } from 'nanoid';
const nanoid = customAlphabet('abcdefghijklmnopqrstuvwxyz1234567890');
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
const { authorization } = req.headers;
if (!authorization) {
throw new Error('缺少登录凭证');
}
const userId = await authToken(authorization);
await connectToDatabase();
const count = await OpenApi.find({ userId }).countDocuments();
if (count >= 5) {
throw new Error('最多 5 组API Key');
}
const apiKey = `${userId}-${nanoid()}`;
await OpenApi.create({
userId,
apiKey
});
jsonRes(res, {
data: apiKey
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}
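Keys are minted as `${userId}-${nanoid()}` over a lowercase alphanumeric alphabet. A dependency-free sketch of the same shape (the 21-character length matches nanoid's default; the handler's actual randomness comes from `customAlphabet`):

```typescript
// Generate an API key shaped like the handler's `${userId}-${nanoid()}`,
// using the same lowercase alphanumeric alphabet.
function makeApiKey(userId: string, length = 21): string {
  const alphabet = 'abcdefghijklmnopqrstuvwxyz1234567890';
  let id = '';
  for (let i = 0; i < length; i++) {
    id += alphabet[Math.floor(Math.random() * alphabet.length)];
  }
  return `${userId}-${id}`;
}
```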

View File

@@ -1,27 +0,0 @@
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { AuthCode } from '@/service/models/authCode';
import { connectToDatabase } from '@/service/mongo';
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
if (process.env.NODE_ENV !== 'development') {
throw new Error('不是开发环境');
}
try {
await connectToDatabase();
const authCode = await AuthCode.deleteMany({
expiredTime: { $lt: Date.now() }
});
jsonRes(res, {
message: `删除了${authCode.deletedCount}条记录`
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}

View File

@@ -1,28 +0,0 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, Chat } from '@/service/mongo';
/* periodically delete inactive content */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
if (process.env.NODE_ENV !== 'development') {
throw new Error('不是开发环境');
}
try {
await connectToDatabase();
const response = await Chat.deleteMany(
{ $expr: { $lt: [{ $size: '$content' }, 5] } },
// use the $pull operator to remove array elements
{ $pull: { content: { $exists: true } } }
);
jsonRes(res, {
message: `删除了${response.deletedCount}条记录`
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}

View File

@@ -1,35 +0,0 @@
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, Bill } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import type { BillSchema } from '@/types/mongoSchema';
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
if (process.env.NODE_ENV !== 'development') {
throw new Error('不是开发环境');
}
await connectToDatabase();
const bills = await Bill.find({
tokenLen: { $exists: false }
});
await Promise.all(
bills.map((bill) =>
Bill.findByIdAndUpdate(bill._id, {
tokenLen: bill.textLen
})
)
);
jsonRes(res, {
data: {}
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}

View File

@@ -1,37 +0,0 @@
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, DataItem, Data } from '@/service/mongo';
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
if (process.env.NODE_ENV !== 'development') {
throw new Error('不是开发环境');
}
await connectToDatabase();
// await DataItem.updateMany(
// {},
// {
// type: 'QA'
// // times: 2
// }
// );
await Data.updateMany(
{},
{
type: 'QA'
}
);
jsonRes(res, {
data: {}
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}

View File

@@ -1,68 +0,0 @@
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, Bill } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import type { BillSchema } from '@/types/mongoSchema';
import { VecModelDataIndex } from '@/constants/redis';
import { connectRedis } from '@/service/redis';
import { vectorToBuffer } from '@/utils/tools';
let vectorData = [
-0.025028639, -0.010407282, 0.026523087, -0.0107438695, -0.006967359, 0.010043768, -0.012043097,
0.008724345, -0.028919589, -0.0117738275, 0.0050690062, 0.02961969
].concat(new Array(1524).fill(0));
let vectorData2 = [
0.025028639, 0.010407282, 0.026523087, 0.0107438695, -0.006967359, 0.010043768, -0.012043097,
0.008724345, 0.028919589, 0.0117738275, 0.0050690062, 0.02961969
].concat(new Array(1524).fill(0));
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
if (process.env.NODE_ENV !== 'development') {
throw new Error('不是开发环境');
}
await connectToDatabase();
const redis = await connectRedis();
await redis.sendCommand([
'HMSET',
'model:data:333',
'vector',
vectorToBuffer(vectorData2),
'modelId',
'1133',
'dataId',
'safadfa'
]);
// search
const response = await redis.sendCommand([
'FT.SEARCH',
'idx:model:data:hash',
'@modelId:{1133} @vector:[VECTOR_RANGE 0.15 $blob]=>{$YIELD_DISTANCE_AS: score}',
'RETURN',
'2',
'modelId',
'dataId',
'PARAMS',
'2',
'blob',
vectorToBuffer(vectorData2),
'SORTBY',
'score',
'DIALECT',
'2'
]);
jsonRes(res, {
data: response
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}

View File

@@ -1,80 +0,0 @@
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { connectToDatabase, Training, Model } from '@/service/mongo';
import type { TrainingItemType } from '@/types/training';
import { TrainingStatusEnum, ModelStatusEnum } from '@/constants/model';
import { getOpenAIApi } from '@/service/utils/chat';
import { getUserApiOpenai } from '@/service/utils/tools';
import { OpenAiTuneStatusEnum } from '@/service/constants/training';
import { sendTrainSucceed } from '@/service/utils/sendEmail';
import { httpsAgent } from '@/service/utils/tools';
import { ModelPopulate } from '@/types/mongoSchema';
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
if (process.env.NODE_ENV !== 'development') {
throw new Error('不是开发环境');
}
try {
await connectToDatabase();
// query training records still in progress
const trainingRecords: TrainingItemType[] = await Training.find({
status: TrainingStatusEnum.pending
});
const { openai } = await getUserApiOpenai('63f9a14228d2a688d8dc9e1b');
const response = await Promise.all(
trainingRecords.map(async (item) => {
const { data } = await openai.retrieveFineTune(item.tuneId, { httpsAgent });
if (data.status === OpenAiTuneStatusEnum.succeeded) {
// delete the training file
openai.deleteFile(data.training_files[0].id, { httpsAgent });
const model = await Model.findById<ModelPopulate>(item.modelId).populate({
path: 'userId',
options: {
strictPopulate: false
}
});
if (!model) {
throw new Error('模型不存在');
}
// update the model
await Model.findByIdAndUpdate(item.modelId, {
status: ModelStatusEnum.running,
updateTime: new Date(),
service: {
...model.service,
trainId: data.fine_tuned_model, // after training, the same model serves further training and chat
chatModel: data.fine_tuned_model
}
});
// update the training record
await Training.findByIdAndUpdate(item._id, {
status: TrainingStatusEnum.succeed
});
// send an email notification
await sendTrainSucceed(model.userId.email as string, model.name);
return 'succeed';
}
return 'pending';
})
);
jsonRes(res, {
data: `${response.length}个训练线程,${
response.filter((item) => item === 'succeed').length
}个完成`
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}

View File

@@ -1,11 +1,12 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import axios from 'axios';
import { connectToDatabase, User, Pay } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { PaySchema } from '@/types/mongoSchema';
import dayjs from 'dayjs';
import { getPayResult } from '@/service/utils/wxpay';
/* Verify the payment result */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
const { authorization } = req.headers;
@@ -25,18 +26,16 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
throw new Error('订单已结算');
}
const { data } = await axios.get(
`https://sif268.laf.dev/wechat-order-query?order_number=${payOrder.orderId}&api_key=${process.env.WXPAYCODE}`
);
const payRes = await getPayResult(payOrder.orderId);
// Check whether more than a day has passed
const orderTime = dayjs(payOrder.createTime);
const diffInHours = dayjs().diff(orderTime, 'hours');
if (data.trade_state === 'SUCCESS') {
if (payRes.trade_state === 'SUCCESS') {
// Order has been paid
try {
// Update the order status
// Update the order status. If no matching order is found, the order is a duplicate
const updateRes = await Pay.updateOne(
{
_id: payId,
@@ -61,7 +60,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
});
console.log(error);
}
} else if (data.trade_state === 'CLOSED' || diffInHours > 24) {
} else if (payRes.trade_state === 'CLOSED' || diffInHours > 24) {
// Order has been closed
await Pay.findByIdAndUpdate(payId, {
status: 'CLOSED'
@@ -70,7 +69,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
data: '订单已过期'
});
} else {
throw new Error(data.trade_state_desc);
throw new Error(payRes?.trade_state_desc || '订单无效');
}
} catch (err) {
// console.log(err);
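The expiry branch above computes an hour difference with dayjs before closing stale orders; the same check can be sketched with plain Date arithmetic (assuming `createTime` parses as a date; the function name is illustrative):

```typescript
// Return true when an order is older than 24 hours and should be closed,
// mirroring the dayjs `diff(orderTime, 'hours') > 24` check above.
function isOrderExpired(createTime: string, now: Date = new Date()): boolean {
  const created = new Date(createTime).getTime();
  const diffInHours = (now.getTime() - created) / (1000 * 60 * 60);
  return diffInHours > 24;
}
```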

View File

@@ -1,14 +1,14 @@
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import axios from 'axios';
import { authToken } from '@/service/utils/tools';
import { customAlphabet } from 'nanoid';
import { connectToDatabase, Pay } from '@/service/mongo';
import { PRICE_SCALE } from '@/constants/common';
import { nativePay } from '@/service/utils/wxpay';
const nanoid = customAlphabet('abcdefghijklmnopqrstuvwxyz1234567890', 20);
/* Get the payment QR code */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
const { authorization } = req.headers;
@@ -23,15 +23,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
const id = nanoid();
await connectToDatabase();
const response = await axios({
url: 'https://sif268.laf.dev/wechat-pay',
method: 'POST',
data: {
trade_order_number: id,
amount: amount * 100,
api_key: process.env.WXPAYCODE
}
});
const code_url = await nativePay(amount * 100, id);
// Recharge records + 1
const payOrder = await Pay.create({
@@ -43,11 +35,11 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
jsonRes(res, {
data: {
payId: payOrder._id,
codeUrl: response.data?.code_url
codeUrl: code_url
}
});
} catch (err) {
console.log(err);
console.log(err, '==');
jsonRes(res, {
code: 500,
error: err

View File

@@ -0,0 +1,18 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import axios from 'axios';
import { connectToDatabase, User, Pay } from '@/service/mongo';
import { authToken } from '@/service/utils/tools';
import { PaySchema } from '@/types/mongoSchema';
import dayjs from 'dayjs';
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
res.send('');
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}

View File

@@ -16,15 +16,6 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
await connectToDatabase();
// Rate-limit the number of registrations
if (type === EmailTypeEnum.register) {
const maxCount = process.env.MAX_USER ? +process.env.MAX_USER : Infinity;
const userCount = await User.count();
if (userCount >= maxCount) {
throw new Error('当前注册用户已满,请等待名额~');
}
}
let code = '';
for (let i = 0; i < 6; i++) {
code += Math.floor(Math.random() * 10);
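The hunk above caps registrations via `MAX_USER` and then builds a 6-digit verification code; both pieces can be sketched in isolation (function names are illustrative):

```typescript
// Registration cap: an unset MAX_USER means unlimited, mirroring the env check above.
function canRegister(userCount: number, maxUserEnv?: string): boolean {
  const maxCount = maxUserEnv ? +maxUserEnv : Infinity;
  return userCount < maxCount;
}

// 6-digit numeric verification code, matching the loop above.
function genVerifyCode(len = 6): string {
  let code = '';
  for (let i = 0; i < len; i++) {
    code += Math.floor(Math.random() * 10);
  }
  return code;
}
```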

View File

@@ -9,7 +9,7 @@ import { UserUpdateParams } from '@/types/user';
/* Update some basic info */
export default async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
try {
const { accounts } = req.body as UserUpdateParams;
const { openaiKey } = req.body as UserUpdateParams;
const { authorization } = req.headers;
if (!authorization) {
@@ -26,8 +26,7 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
_id: userId
},
{
// Restrict the updatable fields
...(accounts ? { accounts } : {})
openaiKey
}
);

View File

@@ -1,6 +1,6 @@
import React from 'react';
import { Card, Box, Mark } from '@chakra-ui/react';
import { versionIntro, chatProblem } from '@/constants/common';
import { Card, Box } from '@chakra-ui/react';
import { useMarkdown } from '@/hooks/useMarkdown';
import Markdown from '@/components/Markdown';
const Empty = ({ intro }: { intro: string }) => {
@@ -9,6 +9,9 @@ const Empty = ({ intro }: { intro: string }) => {
{children}
</Box>
);
const { data: chatProblem } = useMarkdown({ url: '/chatProblem.md' });
const { data: versionIntro } = useMarkdown({ url: '/versionIntro.md' });
return (
<Box
minH={'100%'}
@@ -30,7 +33,6 @@ const Empty = ({ intro }: { intro: string }) => {
<Markdown source={versionIntro} />
</Card>
<Card p={4}>
<Header></Header>
<Markdown source={chatProblem} />
</Card>
</Box>

View File

@@ -30,18 +30,16 @@ import { getToken } from '@/utils/user';
import MyIcon from '@/components/Icon';
import { useCopyData } from '@/utils/tools';
import Markdown from '@/components/Markdown';
import { shareHint } from '@/constants/common';
import { getChatSiteId } from '@/api/chat';
import WxConcat from '@/components/WxConcat';
import { useMarkdown } from '@/hooks/useMarkdown';
const SlideBar = ({
name,
chatId,
modelId,
resetChat,
onClose
}: {
name?: string;
chatId: string;
modelId: string;
resetChat: () => void;
@@ -55,6 +53,7 @@ const SlideBar = ({
const [hasReady, setHasReady] = useState(false);
const { isOpen: isOpenShare, onOpen: onOpenShare, onClose: onCloseShare } = useDisclosure();
const { isOpen: isOpenWx, onOpen: onOpenWx, onClose: onCloseWx } = useDisclosure();
const { data: shareHint } = useMarkdown({ url: '/chatProblem.md' });
const { isSuccess } = useQuery(['init'], getMyModels, {
cacheTime: 5 * 60 * 1000
@@ -187,14 +186,14 @@ const SlideBar = ({
}}
fontSize={'xs'}
border={'1px solid transparent'}
{...(item.name === name
{...(item._id === modelId
? {
borderColor: 'rgba(255,255,255,0.5)',
backgroundColor: 'rgba(255,255,255,0.1)'
}
: {})}
onClick={async () => {
if (item.name === name) return;
if (item._id === modelId) return;
router.replace(`/chat?chatId=${await getChatSiteId(item._id)}`);
onClose();
}}
@@ -281,7 +280,7 @@ const SlideBar = ({
mr={3}
onClick={async () => {
copyData(
`${location.origin}/chat?chatId=${await getChatSiteId(modelId, true)}`,
`${location.origin}/chat?chatId=${await getChatSiteId(modelId)}`,
'已复制分享链接'
);
onCloseShare();

View File

@@ -1,13 +1,7 @@
import React, { useCallback, useState, useRef, useMemo, useEffect } from 'react';
import { useRouter } from 'next/router';
import Image from 'next/image';
import {
getInitChatSiteInfo,
getChatSiteId,
postGPT3SendPrompt,
delChatRecordByIndex,
postSaveChat
} from '@/api/chat';
import { getInitChatSiteInfo, getChatSiteId, delChatRecordByIndex, postSaveChat } from '@/api/chat';
import type { InitChatResponse } from '@/api/response/chat';
import { ChatSiteItemType } from '@/types/chat';
import {
@@ -33,12 +27,11 @@ import { useGlobalStore } from '@/store/global';
import { useChatStore } from '@/store/chat';
import { useCopyData } from '@/utils/tools';
import { streamFetch } from '@/api/fetch';
import SlideBar from './components/SlideBar';
import Empty from './components/Empty';
import Icon from '@/components/Icon';
import { encode } from 'gpt-token-utils';
import { modelList } from '@/constants/model';
const SlideBar = dynamic(() => import('./components/SlideBar'));
const Empty = dynamic(() => import('./components/Empty'));
const Markdown = dynamic(() => import('@/components/Markdown'));
const textareaMinH = '22px';
@@ -48,10 +41,12 @@ interface ChatType extends InitChatResponse {
}
const Chat = ({ chatId }: { chatId: string }) => {
const { toast } = useToast();
const router = useRouter();
const ChatBox = useRef<HTMLDivElement>(null);
const TextareaDom = useRef<HTMLTextAreaElement>(null);
const { toast } = useToast();
const router = useRouter();
// Abort the request
const controller = useRef(new AbortController());
const [chatData, setChatData] = useState<ChatType>({
@@ -70,11 +65,11 @@ const Chat = ({ chatId }: { chatId: string }) => {
() => chatData.history[chatData.history.length - 1]?.status === 'loading',
[chatData.history]
);
const { isOpen: isOpenSlider, onClose: onCloseSlider, onOpen: onOpenSlider } = useDisclosure();
const { copyData } = useCopyData();
const { isPc, media } = useScreen();
const { setLoading } = useGlobalStore();
const { isOpen: isOpenSlider, onClose: onCloseSlider, onOpen: onOpenSlider } = useDisclosure();
const { pushChatHistory } = useChatStore();
// Scroll to the bottom
@@ -119,8 +114,7 @@ const Chat = ({ chatId }: { chatId: string }) => {
async (prompts: ChatSiteItemType) => {
const urlMap: Record<string, string> = {
[ChatModelNameEnum.GPT35]: '/api/chat/chatGpt',
[ChatModelNameEnum.VECTOR_GPT]: '/api/chat/vectorGpt',
[ChatModelNameEnum.GPT3]: '/api/chat/gpt3'
[ChatModelNameEnum.VECTOR_GPT]: '/api/chat/vectorGpt'
};
if (!urlMap[chatData.modelName]) return Promise.reject('找不到模型');
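The `urlMap` lookup above rejects unknown model names before sending a prompt; a standalone sketch of that dispatch (the map keys are illustrative, not the project's actual `ChatModelNameEnum` values):

```typescript
// Resolve the chat API route for a model name, rejecting unknown names
// the way the component above does.
const chatUrlMap: Record<string, string> = {
  chatGPT: '/api/chat/chatGpt',
  vectorGPT: '/api/chat/vectorGpt'
};

function resolveChatUrl(modelName: string): string {
  const url = chatUrlMap[modelName];
  if (!url) throw new Error('model not found');
  return url;
}
```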
@@ -191,24 +185,31 @@ const Chat = ({ chatId }: { chatId: string }) => {
* Send one message
*/
const sendPrompt = useCallback(async () => {
if (isChatting) {
toast({
title: '正在聊天中...请等待结束',
status: 'warning'
});
return;
}
const storeInput = inputVal;
// Strip blank lines
const val = inputVal
.trim()
.split('\n')
.filter((val) => val)
.join('\n');
if (!chatData?.modelId || !val || !ChatBox.current || isChatting) {
const val = inputVal.trim().replace(/\n\s*/g, '\n');
if (!chatData?.modelId || !val) {
toast({
title: '内容为空',
status: 'warning'
});
return;
}
// Length check
const tokens = encode(val).length;
const model = modelList.find((item) => item.model === chatData.modelName);
if (model && tokens >= model.maxToken) {
if (model && val.length >= model.maxToken) {
toast({
title: '单次输入超出 4000 tokens',
title: '单次输入超出 4000 字符',
status: 'warning'
});
return;
@@ -298,22 +299,13 @@ const Chat = ({ chatId }: { chatId: string }) => {
// Copy the content
const onclickCopy = useCallback(
(chatId: string) => {
const dom = document.getElementById(chatId);
const innerText = dom?.innerText;
innerText && copyData(innerText);
(value: string) => {
const val = value.replace(/\n+/g, '\n');
copyData(val);
},
[copyData]
);
useEffect(() => {
controller.current = new AbortController();
return () => {
// eslint-disable-next-line react-hooks/exhaustive-deps
controller.current?.abort();
};
}, [chatId]);
// Initialize the chat box
useQuery(
['init', chatId],
@@ -343,6 +335,7 @@ const Chat = ({ chatId }: { chatId: string }) => {
isClosable: true,
duration: 5000
});
router.push('/model/list');
},
onSettled() {
setLoading(false);
@@ -350,6 +343,14 @@ const Chat = ({ chatId }: { chatId: string }) => {
}
);
// Refresh the stream abort controller
useEffect(() => {
controller.current = new AbortController();
return () => {
// eslint-disable-next-line react-hooks/exhaustive-deps
controller.current?.abort();
};
}, [chatId]);
return (
<Flex
h={'100%'}
@@ -360,7 +361,6 @@ const Chat = ({ chatId }: { chatId: string }) => {
<Box flex={'0 0 250px'} w={0} h={'100%'}>
<SlideBar
resetChat={resetChat}
name={chatData?.name}
chatId={chatId}
modelId={chatData.modelId}
onClose={onCloseSlider}
@@ -392,7 +392,6 @@ const Chat = ({ chatId }: { chatId: string }) => {
<DrawerContent maxWidth={'250px'}>
<SlideBar
resetChat={resetChat}
name={chatData?.name}
chatId={chatId}
modelId={chatData.modelId}
onClose={onCloseSlider}
@@ -431,7 +430,7 @@ const Chat = ({ chatId }: { chatId: string }) => {
/>
</MenuButton>
<MenuList fontSize={'sm'}>
<MenuItem onClick={() => onclickCopy(`chat${index}`)}></MenuItem>
<MenuItem onClick={() => onclickCopy(item.value)}></MenuItem>
<MenuItem onClick={() => delChatRecord(index)}></MenuItem>
</MenuList>
</Menu>
@@ -452,9 +451,8 @@ const Chat = ({ chatId }: { chatId: string }) => {
</Box>
{/* Send area */}
<Box m={media('20px auto', '0 auto')} w={'100%'} maxW={media('min(750px, 100%)', 'auto')}>
<Flex
alignItems={'flex-end'}
py={5}
<Box
py={'18px'}
position={'relative'}
boxShadow={`0 0 15px rgba(0,0,0,0.1)`}
border={media('1px solid', '0')}
@@ -465,9 +463,8 @@ const Chat = ({ chatId }: { chatId: string }) => {
{/* Input box */}
<Textarea
ref={TextareaDom}
flex={1}
w={0}
py={0}
pr={['45px', '55px']}
border={'none'}
_focusVisible={{
border: 'none'
@@ -481,6 +478,8 @@ const Chat = ({ chatId }: { chatId: string }) => {
maxHeight={'150px'}
maxLength={-1}
overflowY={'auto'}
whiteSpace={'pre-wrap'}
wordBreak={'break-all'}
color={useColorModeValue('blackAlpha.700', 'white')}
onChange={(e) => {
const textarea = e.target;
@@ -500,27 +499,34 @@ const Chat = ({ chatId }: { chatId: string }) => {
}}
/>
{/* Send and waiting buttons */}
<Box px={4} onClick={sendPrompt}>
<Flex
alignItems={'center'}
justifyContent={'center'}
h={'30px'}
w={'30px'}
position={'absolute'}
right={['12px', '20px']}
bottom={'15px'}
onClick={sendPrompt}
>
{isChatting ? (
<Image
<Icon
style={{ transform: 'translateY(4px)' }}
src={'/icon/chatting.svg'}
width={30}
height={30}
alt={''}
h={'30px'}
w={'30px'}
name={'chatting'}
/>
) : (
<Box cursor={'pointer'}>
<Icon
name={'chatSend'}
width={'20px'}
height={'20px'}
fill={useColorModeValue('#718096', 'white')}
></Icon>
</Box>
<Icon
name={'chatSend'}
width={['18px', '20px']}
height={['18px', '20px']}
cursor={'pointer'}
fill={useColorModeValue('#718096', 'white')}
></Icon>
)}
</Box>
</Flex>
</Flex>
</Box>
</Box>
</Flex>
</Flex>
@@ -530,7 +536,7 @@ const Chat = ({ chatId }: { chatId: string }) => {
export default Chat;
export async function getServerSideProps(context: any) {
const chatId = context.query?.chatId || '';
const chatId = context?.query?.chatId || 'noid';
return {
props: { chatId }
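Two small string transforms appear in the hunks above: `sendPrompt` collapses blank lines in the input, and the rewritten `onclickCopy` collapses newline runs before copying. In isolation (function names are illustrative):

```typescript
// sendPrompt: trim, then drop blank lines and leading whitespace after newlines,
// matching inputVal.trim().replace(/\n\s*/g, '\n') above.
function normalizeInput(inputVal: string): string {
  return inputVal.trim().replace(/\n\s*/g, '\n');
}

// onclickCopy: collapse runs of newlines so the copied text has no blank lines.
function normalizeCopyText(value: string): string {
  return value.replace(/\n+/g, '\n');
}
```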

View File

@@ -15,7 +15,7 @@ import {
import { useTabs } from '@/hooks/useTabs';
import { useConfirm } from '@/hooks/useConfirm';
import { useSelectFile } from '@/hooks/useSelectFile';
import { readTxtContent, readPdfContent, readDocContent } from '@/utils/tools';
import { readTxtContent, readPdfContent, readDocContent } from '@/utils/file';
import { postSplitData } from '@/api/data';
import { useMutation } from '@tanstack/react-query';
import { useToast } from '@/hooks/useToast';

View File

@@ -1,12 +1,14 @@
import React from 'react';
import { Card } from '@chakra-ui/react';
import Markdown from '@/components/Markdown';
import { introPage } from '@/constants/common';
import { useMarkdown } from '@/hooks/useMarkdown';
const Home = () => {
const { data } = useMarkdown({ url: '/intro.md' });
return (
<Card p={5} lineHeight={2}>
<Markdown source={introPage} isChatting={false} />
<Markdown source={data} isChatting={false} />
</Card>
);
};

View File

@@ -71,7 +71,6 @@ const Login = () => {
order={1}
flex={`0 0 ${isPc ? '400px' : '100%'}`}
height={'100%'}
maxH={'450px'}
border="1px"
borderColor="gray.200"
py={5}

View File

@@ -1,7 +1,6 @@
import React, { useState, useCallback } from 'react';
import {
Box,
IconButton,
Flex,
Button,
Modal,
@@ -9,127 +8,158 @@ import {
ModalContent,
ModalHeader,
ModalCloseButton,
Input,
Textarea
} from '@chakra-ui/react';
import { useForm, useFieldArray } from 'react-hook-form';
import { postModelDataInput } from '@/api/model';
import { useForm } from 'react-hook-form';
import { postModelDataInput, putModelDataById } from '@/api/model';
import { useToast } from '@/hooks/useToast';
import { DeleteIcon } from '@chakra-ui/icons';
import { customAlphabet } from 'nanoid';
const nanoid = customAlphabet('abcdefghijklmnopqrstuvwxyz1234567890', 12);
type FormData = { text: string; q: { val: string }[] };
export type FormData = { dataId?: string; text: string; q: string };
const InputDataModal = ({
onClose,
onSuccess,
modelId
modelId,
defaultValues = {
text: '',
q: ''
}
}: {
onClose: () => void;
onSuccess: () => void;
modelId: string;
defaultValues?: FormData;
}) => {
const [importing, setImporting] = useState(false);
const { toast } = useToast();
const { register, handleSubmit, control } = useForm<FormData>({
defaultValues: {
text: '',
q: [{ val: '' }]
}
});
const {
fields: inputQ,
append: appendQ,
remove: removeQ
} = useFieldArray({
control,
name: 'q'
const { register, handleSubmit, reset } = useForm<FormData>({
defaultValues
});
/**
* Confirm importing the new data
*/
const sureImportData = useCallback(
async (e: FormData) => {
setImporting(true);
try {
await postModelDataInput({
const res = await postModelDataInput({
modelId: modelId,
data: [
{
text: e.text,
q: e.q.map((item) => ({
q: {
id: nanoid(),
text: item.val
}))
text: e.q
}
}
]
});
toast({
title: '导入数据成功,需要一段时间训练',
status: 'success'
title: res === 0 ? '导入数据成功,需要一段时间训练' : '数据导入异常',
status: res === 0 ? 'success' : 'warning'
});
reset({
text: '',
q: ''
});
onClose();
onSuccess();
} catch (err) {
console.log(err);
}
setImporting(false);
},
[modelId, onClose, onSuccess, toast]
[modelId, onSuccess, reset, toast]
);
const updateData = useCallback(
async (e: FormData) => {
if (!e.dataId) return;
if (e.text !== defaultValues.text || e.q !== defaultValues.q) {
await putModelDataById({
dataId: e.dataId,
text: e.text,
q: e.q === defaultValues.q ? '' : e.q
});
onSuccess();
}
toast({
title: '修改回答成功',
status: 'success'
});
onClose();
},
[defaultValues, onClose, onSuccess, toast]
);
return (
<Modal isOpen={true} onClose={onClose}>
<Modal isOpen={true} onClose={onClose} isCentered>
<ModalOverlay />
<ModalContent maxW={'min(900px, 90vw)'} maxH={'80vh'} position={'relative'}>
<ModalContent
m={0}
display={'flex'}
flexDirection={'column'}
h={'90vh'}
maxW={'90vw'}
position={'relative'}
>
<ModalHeader></ModalHeader>
<ModalCloseButton />
<Box px={6} pb={2} overflowY={'auto'}>
<Box mb={2}>:</Box>
<Textarea
mb={4}
placeholder="知识点"
rows={3}
maxH={'200px'}
{...register(`text`, {
required: '知识点'
})}
/>
{inputQ.map((item, index) => (
<Box key={item.id} mb={5}>
<Box mb={2}>{index + 1}:</Box>
<Flex>
<Input
placeholder="问法"
{...register(`q.${index}.val`, {
required: '问法不能为空'
})}
></Input>
{inputQ.length > 1 && (
<IconButton
icon={<DeleteIcon />}
aria-label={'delete'}
colorScheme={'gray'}
variant={'unstyled'}
onClick={() => removeQ(index)}
/>
)}
</Flex>
</Box>
))}
<Box
display={['block', 'flex']}
flex={'1 0 0'}
h={['100%', 0]}
overflowY={'auto'}
px={6}
pb={2}
>
<Box flex={2} mr={[0, 4]} mb={[4, 0]} h={['230px', '100%']}>
<Box h={'30px'}></Box>
<Textarea
placeholder={
'相关问题,可以输入多个问法, 最多500字。例如\n1. laf 是什么?\n2. laf 可以做什么?\n3. laf怎么用'
}
maxLength={500}
resize={'none'}
h={'calc(100% - 30px)'}
{...register(`q`, {
required: '相关问题,可以回车输入多个问法'
})}
/>
</Box>
<Box flex={3} h={['330px', '100%']}>
<Box h={'30px'}></Box>
<Textarea
placeholder={
'知识点最多1000字。请保持主语的完整性缺少主语会导致效果不佳。例如\n1. laf是一个云函数开发平台。\n2. laf 什么都能做。\n3. 下面是使用 laf 的例子: ……'
}
maxLength={1000}
resize={'none'}
h={'calc(100% - 30px)'}
{...register(`text`, {
required: '知识点'
})}
/>
</Box>
</Box>
<Flex px={6} pt={2} pb={4}>
<Button alignSelf={'flex-start'} variant={'outline'} onClick={() => appendQ({ val: '' })}>
</Button>
<Box flex={1}></Box>
<Button variant={'outline'} mr={3} onClick={onClose}>
</Button>
<Button isLoading={importing} onClick={handleSubmit(sureImportData)}>
<Button
isLoading={importing}
onClick={handleSubmit(defaultValues.dataId ? updateData : sureImportData)}
>
</Button>
</Flex>
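The `updateData` callback in the modal above only forwards `q` when it differs from the default value; a standalone sketch of that payload rule (types and names are illustrative):

```typescript
// Mirror updateData above: pass q through only when it changed, else send ''.
function buildUpdatePayload(
  e: { dataId: string; text: string; q: string },
  defaults: { text: string; q: string }
) {
  return { dataId: e.dataId, text: e.text, q: e.q === defaults.q ? '' : e.q };
}
```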

View File

@@ -1,4 +1,4 @@
import React, { useCallback } from 'react';
import React, { useCallback, useState } from 'react';
import {
Box,
TableContainer,
@@ -12,34 +12,36 @@ import {
Flex,
Button,
useDisclosure,
Textarea,
Menu,
MenuButton,
MenuList,
MenuItem
} from '@chakra-ui/react';
import type { ModelSchema } from '@/types/mongoSchema';
import { ModelDataSchema } from '@/types/mongoSchema';
import type { RedisModelDataItemType } from '@/types/redis';
import { ModelDataStatusMap } from '@/constants/model';
import { usePagination } from '@/hooks/usePagination';
import {
getModelDataList,
delOneModelData,
putModelDataById,
getModelSplitDataList
getModelSplitDataListLen,
getExportDataList
} from '@/api/model';
import { DeleteIcon, RepeatIcon } from '@chakra-ui/icons';
import { useToast } from '@/hooks/useToast';
import { DeleteIcon, RepeatIcon, EditIcon } from '@chakra-ui/icons';
import { useLoading } from '@/hooks/useLoading';
import { fileDownload } from '@/utils/file';
import dynamic from 'next/dynamic';
import { useQuery } from '@tanstack/react-query';
import { useMutation, useQuery } from '@tanstack/react-query';
import type { FormData as InputDataType } from './InputDataModal';
import Papa from 'papaparse';
const InputModel = dynamic(() => import('./InputDataModal'));
const SelectModel = dynamic(() => import('./SelectFileModal'));
const SelectFileModel = dynamic(() => import('./SelectFileModal'));
const SelectUrlModel = dynamic(() => import('./SelectUrlModal'));
const SelectCsvModal = dynamic(() => import('./SelectCsvModal'));
const ModelDataCard = ({ model }: { model: ModelSchema }) => {
const { toast } = useToast();
const { Loading } = useLoading();
const { Loading, setIsLoading } = useLoading();
const {
data: modelDataList,
@@ -48,41 +50,34 @@ const ModelDataCard = ({ model }: { model: ModelSchema }) => {
total,
getData,
pageNum
} = usePagination<ModelDataSchema>({
} = usePagination<RedisModelDataItemType>({
api: getModelDataList,
pageSize: 10,
pageSize: 8,
params: {
modelId: model._id
}
});
const updateAnswer = useCallback(
async (dataId: string, text: string) => {
await putModelDataById({
dataId,
text
});
toast({
title: '修改回答成功',
status: 'success'
});
},
[toast]
);
const [editInputData, setEditInputData] = useState<InputDataType>();
const {
isOpen: isOpenInputModal,
onOpen: onOpenInputModal,
onClose: onCloseInputModal
isOpen: isOpenSelectFileModal,
onOpen: onOpenSelectFileModal,
onClose: onCloseSelectFileModal
} = useDisclosure();
const {
isOpen: isOpenSelectModal,
onOpen: onOpenSelectModal,
onClose: onCloseSelectModal
isOpen: isOpenSelectUrlModal,
onOpen: onOpenSelectUrlModal,
onClose: onCloseSelectUrlModal
} = useDisclosure();
const {
isOpen: isOpenSelectCsvModal,
onOpen: onOpenSelectCsvModal,
onClose: onCloseSelectCsvModal
} = useDisclosure();
const { data, refetch } = useQuery(['getModelSplitDataList'], () =>
getModelSplitDataList(model._id)
const { data: splitDataLen, refetch } = useQuery(['getModelSplitDataList'], () =>
getModelSplitDataListLen(model._id)
);
const refetchData = useCallback(
@@ -93,11 +88,33 @@ const ModelDataCard = ({ model }: { model: ModelSchema }) => {
[getData, refetch]
);
// Fetch all the data and export it as csv
const { mutate: onclickExport, isLoading: isLoadingExport } = useMutation({
mutationFn: () => getExportDataList(model._id),
onSuccess(res) {
try {
setIsLoading(true);
const text = Papa.unparse({
fields: ['question', 'answer'],
data: res
});
fileDownload({
text,
type: 'text/csv',
filename: 'data.csv'
});
} catch (error) {
console.log(error);
}
setIsLoading(false);
}
});
return (
<>
<Flex>
<Box fontWeight={'bold'} fontSize={'lg'} flex={1}>
: {total}{' '}
<Box fontWeight={'bold'} fontSize={'lg'} flex={1} mr={2}>
: {total}
<Box as={'span'} fontSize={'sm'}>
</Box>
@@ -107,64 +124,89 @@ const ModelDataCard = ({ model }: { model: ModelSchema }) => {
aria-label={'refresh'}
variant={'outline'}
mr={4}
size={'sm'}
onClick={() => refetchData(pageNum)}
/>
<Button
variant={'outline'}
mr={2}
size={'sm'}
isLoading={isLoadingExport}
title={'换行数据导出时,会进行格式转换'}
onClick={() => onclickExport()}
>
</Button>
<Menu>
<MenuButton as={Button}></MenuButton>
<MenuButton as={Button} size={'sm'}>
</MenuButton>
<MenuList>
<MenuItem onClick={onOpenInputModal}></MenuItem>
<MenuItem onClick={onOpenSelectModal}></MenuItem>
<MenuItem
onClick={() =>
setEditInputData({
text: '',
q: ''
})
}
>
</MenuItem>
<MenuItem onClick={onOpenSelectFileModal}>/ QA </MenuItem>
<MenuItem onClick={onOpenSelectUrlModal}> QA </MenuItem>
<MenuItem onClick={onOpenSelectCsvModal}>csv </MenuItem>
</MenuList>
</Menu>
</Flex>
{data && data.length > 0 && <Box fontSize={'xs'}>{data.length}...</Box>}
{!!(splitDataLen && splitDataLen > 0) && (
<Box fontSize={'xs'}>{splitDataLen}...</Box>
)}
<Box mt={4}>
<TableContainer h={'600px'} overflowY={'auto'}>
<TableContainer minH={'500px'}>
<Table variant={'simple'}>
<Thead>
<Tr>
<Th>Question</Th>
<Th>Text</Th>
<Th>Status</Th>
<Th></Th>
<Th></Th>
</Tr>
</Thead>
<Tbody>
{modelDataList.map((item) => (
<Tr key={item._id}>
<Td w={'350px'}>
{item.q.map((item, i) => (
<Box
key={item.id}
fontSize={'xs'}
w={'100%'}
whiteSpace={'pre-wrap'}
_notLast={{ mb: 1 }}
>
Q{i + 1}:{' '}
<Box as={'span'} userSelect={'all'}>
{item.text}
</Box>
</Box>
))}
<Tr key={item.id}>
<Td minW={'200px'}>
<Box fontSize={'xs'} whiteSpace={'pre-wrap'}>
{item.q}
</Box>
</Td>
<Td minW={'200px'}>
<Textarea
<Box
w={'100%'}
h={'100%'}
defaultValue={item.text}
fontSize={'xs'}
resize={'both'}
onBlur={(e) => {
const oldVal = modelDataList.find((data) => item._id === data._id)?.text;
if (oldVal !== e.target.value) {
updateAnswer(item._id, e.target.value);
}
}}
></Textarea>
whiteSpace={'pre-wrap'}
maxH={'250px'}
overflowY={'auto'}
>
{item.text}
</Box>
</Td>
<Td w={'100px'}>{ModelDataStatusMap[item.status]}</Td>
<Td>{ModelDataStatusMap[item.status]}</Td>
<Td>
<IconButton
mr={5}
icon={<EditIcon />}
variant={'outline'}
aria-label={'delete'}
size={'sm'}
onClick={() =>
setEditInputData({
dataId: item.id,
q: item.q,
text: item.text
})
}
/>
<IconButton
icon={<DeleteIcon />}
variant={'outline'}
@@ -172,7 +214,7 @@ const ModelDataCard = ({ model }: { model: ModelSchema }) => {
aria-label={'delete'}
size={'sm'}
onClick={async () => {
await delOneModelData(item._id);
await delOneModelData(item.id);
refetchData(pageNum);
}}
/>
@@ -188,11 +230,34 @@ const ModelDataCard = ({ model }: { model: ModelSchema }) => {
</Box>
<Loading loading={isLoading} fixed={false} />
{isOpenInputModal && (
<InputModel modelId={model._id} onClose={onCloseInputModal} onSuccess={refetchData} />
{editInputData !== undefined && (
<InputModel
modelId={model._id}
defaultValues={editInputData}
onClose={() => setEditInputData(undefined)}
onSuccess={refetchData}
/>
)}
{isOpenSelectModal && (
<SelectModel modelId={model._id} onClose={onCloseSelectModal} onSuccess={refetchData} />
{isOpenSelectFileModal && (
<SelectFileModel
modelId={model._id}
onClose={onCloseSelectFileModal}
onSuccess={refetchData}
/>
)}
{isOpenSelectUrlModal && (
<SelectUrlModel
modelId={model._id}
onClose={onCloseSelectUrlModal}
onSuccess={refetchData}
/>
)}
{isOpenSelectCsvModal && (
<SelectCsvModal
modelId={model._id}
onClose={onCloseSelectCsvModal}
onSuccess={refetchData}
/>
)}
</>
);
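The export mutation above serializes question/answer rows with `Papa.unparse({ fields, data })`; the CSV shape it produces can be sketched by hand (simplified quoting that always wraps cells, not Papa's full escaping rules):

```typescript
// Minimal CSV serializer for the question/answer export above:
// quote every cell and double embedded quotes.
function toCsv(fields: string[], rows: string[][]): string {
  const esc = (cell: string) => `"${cell.replace(/"/g, '""')}"`;
  return [fields, ...rows].map((row) => row.map(esc).join(',')).join('\r\n');
}
```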

View File

@@ -12,12 +12,13 @@ import {
SliderThumb,
SliderMark,
Tooltip,
Button
Button,
Select
} from '@chakra-ui/react';
import { QuestionOutlineIcon } from '@chakra-ui/icons';
import type { ModelSchema } from '@/types/mongoSchema';
import { UseFormReturn } from 'react-hook-form';
import { modelList } from '@/constants/model';
import { modelList, ModelVectorSearchModeMap } from '@/constants/model';
import { formatPrice } from '@/utils/user';
import { useConfirm } from '@/hooks/useConfirm';
@@ -53,6 +54,12 @@ const ModelEditForm = ({
})}
></Input>
</Flex>
<Flex alignItems={'center'} mt={4}>
<Box flex={'0 0 80px'} w={0}>
modelId:
</Box>
<Box>{getValues('_id')}</Box>
</Flex>
</FormControl>
<Flex alignItems={'center'} mt={4}>
<Box flex={'0 0 80px'} w={0}>
@@ -83,15 +90,6 @@ const ModelEditForm = ({
</Button>
</Flex>
{/* <FormControl mt={4}>
<Box mb={1}>介绍:</Box>
<Textarea
rows={5}
maxLength={500}
{...register('intro')}
placeholder={'模型的介绍,仅做展示,不影响模型的效果'}
/>
</FormControl> */}
</Card>
<Card p={4}>
<Box fontWeight={'bold'}></Box>
@@ -108,7 +106,7 @@ const ModelEditForm = ({
<Slider
aria-label="slider-ex-1"
min={1}
min={0}
max={10}
step={1}
value={getValues('temperature')}
@@ -137,25 +135,32 @@ const ModelEditForm = ({
</Slider>
</Flex>
</FormControl>
{canTrain && (
<FormControl mt={4}>
<Flex alignItems={'center'}>
<Box flex={'0 0 70px'}></Box>
<Select {...register('search.mode', { required: '搜索模式不能为空' })}>
{Object.entries(ModelVectorSearchModeMap).map(([key, { text }]) => (
<option key={key} value={key}>
{text}
</option>
))}
</Select>
</Flex>
</FormControl>
)}
<Box mt={4}>
{canTrain ? (
<Box fontWeight={'bold'}>
prompt
使 tokens
</Box>
) : (
<>
<Box mb={1}></Box>
<Textarea
rows={6}
maxLength={-1}
{...register('systemPrompt')}
placeholder={
'模型默认的 prompt 词,通过调整该内容,可以生成一个限定范围的模型。\n\n注意该功能会影响对话的整体朝向'
}
/>
</>
)}
<Box mb={1}></Box>
<Textarea
rows={6}
maxLength={-1}
{...register('systemPrompt')}
placeholder={
canTrain
? '训练的模型会根据知识库内容,生成一部分系统提示词,因此在对话时需要消耗更多的 tokens。你可以增加提示词让效果更符合预期。例如: \n1. 请根据知识库内容回答用户问题。\n2. 知识库是电影《铃芽之旅》的内容,根据知识库内容回答。无关问题,拒绝回复!'
: '模型默认的 prompt 词,通过调整该内容,可以生成一个限定范围的模型。\n注意该功能会影响对话的整体朝向'
}
/>
</Box>
</Card>
{/* <Card p={4}>

View File

@@ -0,0 +1,153 @@
import React, { useState, useCallback } from 'react';
import {
Box,
Flex,
Button,
Modal,
ModalOverlay,
ModalContent,
ModalHeader,
ModalCloseButton,
ModalBody
} from '@chakra-ui/react';
import { useToast } from '@/hooks/useToast';
import { useSelectFile } from '@/hooks/useSelectFile';
import { useConfirm } from '@/hooks/useConfirm';
import { readCsvContent } from '@/utils/file';
import { useMutation } from '@tanstack/react-query';
import { postModelDataCsvData } from '@/api/model';
import Markdown from '@/components/Markdown';
import { useMarkdown } from '@/hooks/useMarkdown';
import { fileDownload } from '@/utils/file';
const csvTemplate = `question,answer\n"什么是 laf","laf 是一个云函数开发平台……"\n"什么是 sealos","Sealos 是以 kubernetes 为内核的云操作系统发行版,可以……"`;
const SelectJsonModal = ({
onClose,
onSuccess,
modelId
}: {
onClose: () => void;
onSuccess: () => void;
modelId: string;
}) => {
const [selecting, setSelecting] = useState(false);
const { toast } = useToast();
const { File, onOpen } = useSelectFile({ fileType: '.csv', multiple: false });
const [fileData, setFileData] = useState<string[][]>([]);
const { openConfirm, ConfirmChild } = useConfirm({
content: '确认导入该数据集?'
});
const onSelectFile = useCallback(
async (e: File[]) => {
const file = e[0];
setSelecting(true);
try {
const { header, data } = await readCsvContent(file);
if (header[0] !== 'question' || header[1] !== 'answer') {
throw new Error('csv 文件格式有误');
}
setFileData(data);
} catch (error: any) {
console.log(error);
toast({
title: error?.message || 'csv 文件格式有误',
status: 'error'
});
}
setSelecting(false);
},
[setSelecting, toast]
);
const { mutate, isLoading } = useMutation({
mutationFn: async () => {
if (!fileData) return;
const res = await postModelDataCsvData(modelId, fileData);
toast({
title: `导入数据成功,最终导入: ${res || 0} 条数据。需要一段时间训练`,
status: 'success',
duration: 4000
});
onClose();
onSuccess();
},
onError() {
toast({
title: '导入文件失败',
status: 'error'
});
}
});
const { data: intro } = useMarkdown({ url: '/csvSelect.md' });
return (
<Modal isOpen={true} onClose={onClose} isCentered>
<ModalOverlay />
<ModalContent maxW={'90vw'} position={'relative'} m={0} h={'90vh'}>
<ModalHeader>csv </ModalHeader>
<ModalCloseButton />
<ModalBody h={'100%'} display={['block', 'flex']} fontSize={'sm'} overflowY={'auto'}>
<Box flex={'2 0 0'} w={['100%', 0]} mr={[0, 4]} mb={[4, 0]}>
<Markdown source={intro} />
<Box
my={3}
cursor={'pointer'}
textDecoration={'underline'}
color={'blue.600'}
onClick={() =>
fileDownload({
text: csvTemplate,
type: 'text/csv',
filename: 'template.csv'
})
}
>
csv模板
</Box>
<Flex alignItems={'center'}>
<Button isLoading={selecting} onClick={onOpen}>
csv
</Button>
<Box ml={4}> {fileData.length} </Box>
</Flex>
</Box>
<Box flex={'3 0 0'} h={'100%'} overflow={'auto'} p={2} backgroundColor={'blackAlpha.50'}>
{fileData.map((item, index) => (
<Box key={index}>
<Box>
Q{index + 1}. {item[0]}
</Box>
<Box>
A{index + 1}. {item[1]}
</Box>
</Box>
))}
</Box>
</ModalBody>
<Flex px={6} pt={2} pb={4}>
<Box flex={1}></Box>
<Button variant={'outline'} mr={3} onClick={onClose}>
</Button>
<Button
isLoading={isLoading}
isDisabled={fileData.length === 0}
onClick={openConfirm(mutate)}
>
</Button>
</Flex>
</ModalContent>
<ConfirmChild />
<File onSelect={onSelectFile} />
</Modal>
);
};
export default SelectJsonModal;
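The import modal above rejects files whose header row is not exactly `question,answer`; as a standalone check (the function name is illustrative):

```typescript
// Validate the csv header expected by the import modal above.
function isValidCsvHeader(header: string[]): boolean {
  return header[0] === 'question' && header[1] === 'answer';
}
```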


@@ -8,18 +8,18 @@ import {
ModalContent,
ModalHeader,
ModalCloseButton,
ModalBody
ModalBody,
Input,
Textarea
} from '@chakra-ui/react';
import { useToast } from '@/hooks/useToast';
import { useSelectFile } from '@/hooks/useSelectFile';
import { customAlphabet } from 'nanoid';
import { encode } from 'gpt-token-utils';
import { useConfirm } from '@/hooks/useConfirm';
import { readTxtContent, readPdfContent, readDocContent } from '@/utils/tools';
import { readTxtContent, readPdfContent, readDocContent } from '@/utils/file';
import { useMutation } from '@tanstack/react-query';
import { postModelDataFileText } from '@/api/model';
const nanoid = customAlphabet('abcdefghijklmnopqrstuvwxyz1234567890', 12);
import { postModelDataSplitData } from '@/api/model';
import { formatPrice } from '@/utils/user';
const fileExtension = '.txt,.doc,.docx,.pdf,.md';
@@ -34,6 +34,7 @@ const SelectFileModal = ({
}) => {
const [selecting, setSelecting] = useState(false);
const { toast } = useToast();
const [prompt, setPrompt] = useState('');
const { File, onOpen } = useSelectFile({ fileType: fileExtension, multiple: true });
const [fileText, setFileText] = useState('');
const { openConfirm, ConfirmChild } = useConfirm({
@@ -64,10 +65,9 @@ const SelectFileModal = ({
})
)
)
.join('\n')
.replace(/\n+/g, '\n');
.join(' ')
.replace(/(\\n|\n)+/g, '\n');
setFileText(fileTexts);
console.log(encode(fileTexts));
} catch (error: any) {
console.log(error);
toast({
@@ -83,7 +83,11 @@ const SelectFileModal = ({
const { mutate, isLoading } = useMutation({
mutationFn: async () => {
if (!fileText) return;
await postModelDataFileText(modelId, fileText);
await postModelDataSplitData({
modelId,
text: fileText,
prompt: `下面是${prompt || '一段长文本'}`
});
toast({
title: '导入数据成功,需要一段时间拆解和训练',
status: 'success'
@@ -100,40 +104,55 @@ const SelectFileModal = ({
});
return (
<Modal isOpen={true} onClose={onClose}>
<Modal isOpen={true} onClose={onClose} isCentered>
<ModalOverlay />
<ModalContent maxW={'min(900px, 90vw)'} position={'relative'}>
<ModalContent maxW={'min(900px, 90vw)'} m={0} position={'relative'} h={'90vh'}>
<ModalHeader></ModalHeader>
<ModalCloseButton />
<ModalBody>
<Flex
flexDirection={'column'}
p={2}
h={'100%'}
alignItems={'center'}
justifyContent={'center'}
fontSize={'sm'}
>
<Button isLoading={selecting} onClick={onOpen}>
</Button>
<Box mt={2}> {fileExtension} . </Box>
<Box mt={2}>
{fileText.length} {encode(fileText).length} tokens
</Box>
<Box
h={'300px'}
w={'100%'}
overflow={'auto'}
p={2}
backgroundColor={'blackAlpha.50'}
whiteSpace={'pre'}
fontSize={'xs'}
>
{fileText}
<ModalBody
display={'flex'}
flexDirection={'column'}
p={4}
h={'100%'}
alignItems={'center'}
justifyContent={'center'}
fontSize={'sm'}
>
<Button isLoading={selecting} onClick={onOpen}>
</Button>
<Box mt={2} maxW={['100%', '70%']}>
{fileExtension} QA
tokens
</Box>
<Box mt={2}>
{encode(fileText).length} tokens {formatPrice(encode(fileText).length * 3)}
</Box>
<Flex w={'100%'} alignItems={'center'} my={4}>
<Box flex={'0 0 auto'} mr={2}>
</Box>
<Input
placeholder="提示词,例如: Laf的介绍/关于gpt4的论文/一段长文本"
value={prompt}
onChange={(e) => setPrompt(e.target.value)}
size={'sm'}
/>
</Flex>
<Textarea
flex={'1 0 0'}
h={0}
w={'100%'}
placeholder="文件内容"
maxLength={-1}
resize={'none'}
fontSize={'xs'}
whiteSpace={'pre-wrap'}
value={fileText}
onChange={(e) => setFileText(e.target.value)}
/>
</ModalBody>
<Flex px={6} pt={2} pb={4}>
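The hunk above changes how the texts of multiple selected files are merged: fragments are now joined with a space, and both escaped `\n` sequences (from pasted text) and runs of real newlines are collapsed to a single newline. A self-contained sketch of that normalization:

```typescript
// Join file fragments and collapse both literal "\n" escape sequences and
// real newline runs into a single newline, mirroring the hunk above.
function normalizeFileTexts(fragments: string[]): string {
  return fragments.join(' ').replace(/(\\n|\n)+/g, '\n');
}
```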


@@ -0,0 +1,168 @@
import React, { useState } from 'react';
import {
Box,
Flex,
Button,
Modal,
ModalOverlay,
ModalContent,
ModalHeader,
ModalCloseButton,
ModalBody,
Input,
Textarea
} from '@chakra-ui/react';
import { useToast } from '@/hooks/useToast';
import { customAlphabet } from 'nanoid';
import { encode } from 'gpt-token-utils';
import { useConfirm } from '@/hooks/useConfirm';
import { useMutation } from '@tanstack/react-query';
import { postModelDataSplitData, getWebContent } from '@/api/model';
import { formatPrice } from '@/utils/user';
const nanoid = customAlphabet('abcdefghijklmnopqrstuvwxyz1234567890', 12);
const SelectUrlModal = ({
onClose,
onSuccess,
modelId
}: {
onClose: () => void;
onSuccess: () => void;
modelId: string;
}) => {
const { toast } = useToast();
const [webUrl, setWebUrl] = useState('');
const [webText, setWebText] = useState('');
const [prompt, setPrompt] = useState(''); // 提示词
const { openConfirm, ConfirmChild } = useConfirm({
content: '确认导入该文件,需要一定时间进行拆解,该任务无法终止!如果余额不足,任务将被终止。'
});
const { mutate: onclickImport, isLoading: isImporting } = useMutation({
mutationFn: async () => {
if (!webText) return;
await postModelDataSplitData({
modelId,
text: webText,
prompt: `下面是${prompt || '一段长文本'}`
});
toast({
title: '导入数据成功,需要一段时间拆解和训练',
status: 'success'
});
onClose();
onSuccess();
},
onError(error) {
console.log(error);
toast({
title: '导入数据失败',
status: 'error'
});
}
});
const { mutate: onclickFetchingUrl, isLoading: isFetching } = useMutation({
mutationFn: async () => {
if (!webUrl) return;
const res = await getWebContent(webUrl);
const parser = new DOMParser();
const htmlDoc = parser.parseFromString(res, 'text/html');
const data = htmlDoc?.body?.innerText || '';
if (!data) {
throw new Error('获取不到数据');
}
setWebText(data.replace(/\s+/g, ' '));
},
onError(error) {
console.log(error);
toast({
status: 'error',
title: '获取网站内容失败'
});
}
});
return (
<Modal isOpen={true} onClose={onClose} isCentered>
<ModalOverlay />
<ModalContent maxW={'min(900px, 90vw)'} m={0} position={'relative'} h={'90vh'}>
<ModalHeader></ModalHeader>
<ModalCloseButton />
<ModalBody
display={'flex'}
flexDirection={'column'}
p={4}
h={'100%'}
alignItems={'center'}
justifyContent={'center'}
fontSize={'sm'}
>
<Box mt={2} maxW={['100%', '70%']}>
QA tokens
</Box>
<Box mt={2}>
{encode(webText).length} tokens {formatPrice(encode(webText).length * 3)}
</Box>
<Flex w={'100%'} alignItems={'center'} my={4}>
<Box flex={'0 0 70px'}></Box>
<Input
mx={2}
placeholder="需要获取内容的地址。例如: https://fastgpt.ahapocket.cn"
value={webUrl}
onChange={(e) => setWebUrl(e.target.value)}
size={'sm'}
/>
<Button isLoading={isFetching} onClick={() => onclickFetchingUrl()}>
</Button>
</Flex>
<Flex w={'100%'} alignItems={'center'} my={4}>
<Box flex={'0 0 70px'} mr={2}>
</Box>
<Input
placeholder="内容提示词。例如: Laf的介绍/关于gpt4的论文/一段长文本"
value={prompt}
onChange={(e) => setPrompt(e.target.value)}
size={'sm'}
/>
</Flex>
<Textarea
flex={'1 0 0'}
h={0}
w={'100%'}
placeholder="网站的内容"
maxLength={-1}
resize={'none'}
fontSize={'xs'}
whiteSpace={'pre-wrap'}
value={webText}
onChange={(e) => setWebText(e.target.value)}
/>
</ModalBody>
<Flex px={6} pt={2} pb={4}>
<Box flex={1}></Box>
<Button variant={'outline'} mr={3} onClick={onClose}>
</Button>
<Button
isLoading={isImporting}
isDisabled={webText === ''}
onClick={openConfirm(onclickImport)}
>
</Button>
</Flex>
</ModalContent>
<ConfirmChild />
</Modal>
);
};
export default SelectUrlModal;
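The modal above extracts text in the browser via `DOMParser` (`htmlDoc.body.innerText`) and then collapses whitespace. A dependency-free, server-side approximation of that step (regex tag-stripping is only a rough stand-in for `DOMParser` and will not handle entities or scripts as faithfully):

```typescript
// Rough approximation of the fetch-and-extract step: drop script/style
// blocks, strip remaining tags, then collapse whitespace exactly as the
// modal does with data.replace(/\s+/g, ' ').
function extractWebText(html: string): string {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ') // drop scripts entirely
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')   // drop styles
    .replace(/<[^>]+>/g, ' ');                   // strip remaining tags
  const data = text.replace(/\s+/g, ' ').trim();
  if (!data) {
    throw new Error('获取不到数据'); // same empty-content guard as the modal
  }
  return data;
}
```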


@@ -37,7 +37,7 @@ const ModelDetail = ({ modelId }: { modelId: string }) => {
setLoading(true);
try {
const res = await getModelById(modelId);
console.log(res);
// console.log(res);
res.security.expiredTime /= 60 * 60 * 1000;
setModel(res);
formHooks.reset(res);
@@ -143,6 +143,7 @@ const ModelDetail = ({ modelId }: { modelId: string }) => {
systemPrompt: data.systemPrompt,
intro: data.intro,
temperature: data.temperature,
search: data.search,
service: data.service,
security: data.security
});


@@ -49,10 +49,6 @@ const ModelPhoneList = ({
<Box flex={'0 0 100px'}>AI模型: </Box>
<Box color={'blackAlpha.500'}>{model.service.modelName}</Box>
</Flex>
<Flex mt={5}>
<Box flex={'0 0 100px'}>: </Box>
<Box color={'blackAlpha.500'}>{model.trainingTimes}</Box>
</Flex>
<Flex mt={5} justifyContent={'flex-end'}>
<Button
mr={3}


@@ -42,12 +42,12 @@ const ModelTable = ({
dataIndex: 'status',
render: (item: ModelSchema) => (
<Tag
colorScheme={formatModelStatus[item.status].colorTheme}
colorScheme={formatModelStatus[item.status]?.colorTheme}
variant="solid"
px={3}
size={'md'}
>
{formatModelStatus[item.status].text}
{formatModelStatus[item.status]?.text}
</Tag>
)
},
@@ -60,11 +60,6 @@ const ModelTable = ({
</Box>
)
},
{
title: '训练次数',
key: 'trainingTimes',
dataIndex: 'trainingTimes'
},
{
title: '操作',
key: 'control',


@@ -0,0 +1,57 @@
import React from 'react';
import { Card, Box, Table, Thead, Tbody, Tr, Th, Td, TableContainer } from '@chakra-ui/react';
import { BillTypeMap } from '@/constants/user';
import { getUserBills } from '@/api/user';
import type { UserBillType } from '@/types/user';
import { usePagination } from '@/hooks/usePagination';
import { useLoading } from '@/hooks/useLoading';
const BillTable = () => {
const { Loading } = useLoading();
const {
data: bills,
isLoading,
Pagination
} = usePagination<UserBillType>({
api: getUserBills
});
return (
<Card mt={6} py={4}>
<Box fontSize={'xl'} fontWeight={'bold'} px={6} mb={1}>
使
</Box>
<TableContainer position={'relative'}>
<Table>
<Thead>
<Tr>
<Th></Th>
<Th></Th>
<Th></Th>
<Th>Tokens </Th>
<Th></Th>
</Tr>
</Thead>
<Tbody fontSize={'sm'}>
{bills.map((item) => (
<Tr key={item.id}>
<Td>{item.time}</Td>
<Td>{BillTypeMap[item.type]}</Td>
<Td>{item.textLen}</Td>
<Td>{item.tokenLen}</Td>
<Td>{item.price}</Td>
</Tr>
))}
</Tbody>
</Table>
<Box mt={4} mr={4} textAlign={'end'}>
<Pagination />
</Box>
<Loading loading={isLoading} fixed={false} />
</TableContainer>
</Card>
);
};
export default BillTable;


@@ -10,21 +10,12 @@ import {
Button,
Input,
Box,
Grid,
Table,
Thead,
Tbody,
Tr,
Th,
Td,
TableContainer
Grid
} from '@chakra-ui/react';
import { getPayCode, checkPayResult } from '@/api/user';
import { useToast } from '@/hooks/useToast';
import { useQuery } from '@tanstack/react-query';
import { useRouter } from 'next/router';
import { modelList } from '@/constants/model';
import { formatPrice } from '../../../utils/user';
const PayModal = ({ onClose }: { onClose: () => void }) => {
const router = useRouter();
@@ -95,7 +86,7 @@ const PayModal = ({ onClose }: { onClose: () => void }) => {
{!payId && (
<>
{/* 价格表 */}
<TableContainer mb={4}>
{/* <TableContainer mb={4}>
<Table>
<Thead>
<Tr>
@@ -112,7 +103,7 @@ const PayModal = ({ onClose }: { onClose: () => void }) => {
))}
</Tbody>
</Table>
</TableContainer>
</TableContainer> */}
<Grid gridTemplateColumns={'repeat(4,1fr)'} gridGap={5} mb={4}>
{[5, 10, 20, 50].map((item) => (
<Button


@@ -0,0 +1,112 @@
import React, { useState, useCallback } from 'react';
import {
Card,
Box,
Flex,
Button,
Table,
Thead,
Tbody,
Tr,
Th,
Td,
TableContainer,
useDisclosure
} from '@chakra-ui/react';
import { getPayOrders, checkPayResult } from '@/api/user';
import { PaySchema } from '@/types/mongoSchema';
import dayjs from 'dayjs';
import { useQuery } from '@tanstack/react-query';
import { formatPrice } from '@/utils/user';
import WxConcat from '@/components/WxConcat';
import { useGlobalStore } from '@/store/global';
import { useToast } from '@/hooks/useToast';
const PayRecordTable = () => {
const { isOpen: isOpenWx, onOpen: onOpenWx, onClose: onCloseWx } = useDisclosure();
const [payOrders, setPayOrders] = useState<PaySchema[]>([]);
const { setLoading } = useGlobalStore();
const { toast } = useToast();
const handleRefreshPayOrder = useCallback(
async (payId: string) => {
setLoading(true);
try {
const data = await checkPayResult(payId);
toast({
title: data,
status: 'info'
});
const res = await getPayOrders();
setPayOrders(res);
} catch (error: any) {
toast({
title: error?.message,
status: 'warning'
});
console.log(error);
}
setLoading(false);
},
[setLoading, toast]
);
useQuery(['initPayOrder'], getPayOrders, {
onSuccess(res) {
setPayOrders(res);
}
});
return (
<>
<Card mt={6} py={4}>
<Flex alignItems={'flex-end'} px={6} mb={1}>
<Box fontSize={'xl'} fontWeight={'bold'}>
</Box>
<Button onClick={onOpenWx} size={'xs'} ml={4} variant={'outline'}>
wx联系
</Button>
</Flex>
<TableContainer px={6}>
<Table>
<Thead>
<Tr>
<Th></Th>
<Th></Th>
<Th></Th>
<Th></Th>
<Th></Th>
</Tr>
</Thead>
<Tbody fontSize={'sm'}>
{payOrders.map((item) => (
<Tr key={item._id}>
<Td>{item.orderId}</Td>
<Td>
{item.createTime ? dayjs(item.createTime).format('YYYY/MM/DD HH:mm:ss') : '-'}
</Td>
<Td>{formatPrice(item.price)}</Td>
<Td>{item.status}</Td>
<Td>
{item.status === 'NOTPAY' && (
<Button onClick={() => handleRefreshPayOrder(item._id)} size={'sm'}>
</Button>
)}
</Td>
</Tr>
))}
</Tbody>
</Table>
</TableContainer>
</Card>
{/* wx 联系 */}
{isOpenWx && <WxConcat onClose={onCloseWx} />}
</>
);
};
export default PayRecordTable;


@@ -1,72 +1,32 @@
import React, { useCallback, useState } from 'react';
import {
Card,
Box,
Flex,
Button,
Table,
Thead,
Tbody,
Tr,
Th,
Td,
TableContainer,
Select,
Input,
IconButton,
useDisclosure
} from '@chakra-ui/react';
import { DeleteIcon } from '@chakra-ui/icons';
import { useForm, useFieldArray } from 'react-hook-form';
import { Card, Box, Flex, Button, Input } from '@chakra-ui/react';
import { useForm } from 'react-hook-form';
import { UserUpdateParams } from '@/types/user';
import { putUserInfo, getUserBills, getPayOrders, checkPayResult } from '@/api/user';
import { putUserInfo } from '@/api/user';
import { useToast } from '@/hooks/useToast';
import { useGlobalStore } from '@/store/global';
import { useUserStore } from '@/store/user';
import { UserType } from '@/types/user';
import { usePaging } from '@/hooks/usePaging';
import type { UserBillType } from '@/types/user';
import { useQuery } from '@tanstack/react-query';
import dynamic from 'next/dynamic';
import { PaySchema } from '@/types/mongoSchema';
import dayjs from 'dayjs';
import { formatPrice } from '@/utils/user';
import WxConcat from '@/components/WxConcat';
import ScrollData from '@/components/ScrollData';
import { BillTypeMap } from '@/constants/user';
const PayRecordTable = dynamic(() => import('./components/PayRecordTable'));
const BilTable = dynamic(() => import('./components/BillTable'));
const PayModal = dynamic(() => import('./components/PayModal'));
const NumberSetting = () => {
const { userInfo, updateUserInfo, initUserInfo } = useUserStore();
const { setLoading } = useGlobalStore();
const { register, handleSubmit, control } = useForm<UserUpdateParams>({
const { register, handleSubmit } = useForm<UserUpdateParams>({
defaultValues: userInfo as UserType
});
const [showPay, setShowPay] = useState(false);
const { isOpen: isOpenWx, onOpen: onOpenWx, onClose: onCloseWx } = useDisclosure();
const { toast } = useToast();
const {
fields: accounts,
append: appendAccount,
remove: removeAccount
} = useFieldArray({
control,
name: 'accounts'
});
const {
nextPage,
isLoadAll,
requesting,
data: bills
} = usePaging<UserBillType>({
api: getUserBills,
pageSize: 30
});
const [payOrders, setPayOrders] = useState<PaySchema[]>([]);
const onclickSave = useCallback(
async (data: UserUpdateParams) => {
if (data.openaiKey === userInfo?.openaiKey) return;
setLoading(true);
try {
await putUserInfo(data);
@@ -78,54 +38,22 @@ const NumberSetting = () => {
} catch (error) {}
setLoading(false);
},
[setLoading, toast, updateUserInfo]
[setLoading, toast, updateUserInfo, userInfo?.openaiKey]
);
useQuery(['init'], initUserInfo);
useQuery(['initPayOrder'], getPayOrders, {
onSuccess(res) {
setPayOrders(res);
}
});
const handleRefreshPayOrder = useCallback(
async (payId: string) => {
setLoading(true);
try {
const data = await checkPayResult(payId);
toast({
title: data,
status: 'info'
});
const res = await getPayOrders();
setPayOrders(res);
} catch (error: any) {
toast({
title: error?.message,
status: 'warning'
});
console.log(error);
}
setLoading(false);
},
[setLoading, toast]
);
return (
<>
{/* 核心信息 */}
<Card px={6} py={4}>
<Box fontSize={'xl'} fontWeight={'bold'}>
</Box>
<Box mt={6}>
<Flex alignItems={'center'}>
<Box flex={'0 0 60px'}>:</Box>
<Box>{userInfo?.email}</Box>
</Flex>
</Box>
<Flex mt={6} alignItems={'center'}>
<Box flex={'0 0 60px'}>:</Box>
<Box>{userInfo?.email}</Box>
</Flex>
<Box mt={6}>
<Flex alignItems={'center'}>
<Box flex={'0 0 60px'}>:</Box>
@@ -140,157 +68,28 @@ const NumberSetting = () => {
openai
</Box>
</Box>
</Card>
<Card mt={6} px={6} py={4}>
<Flex mb={5} justifyContent={'space-between'}>
<Box fontSize={'xl'} fontWeight={'bold'}>
</Box>
<Box>
{accounts.length === 0 && (
<Button
mr={5}
variant="outline"
onClick={() =>
appendAccount({
type: 'openai',
value: ''
})
}
>
</Button>
)}
<Button onClick={handleSubmit(onclickSave)}></Button>
</Box>
<Flex mt={6} alignItems={'center'}>
<Box flex={'0 0 85px'}>openaiKey:</Box>
<Input
{...register(`openaiKey`)}
maxW={'300px'}
placeholder={'openai账号。回车或失去焦点保存'}
size={'sm'}
onBlur={handleSubmit(onclickSave)}
onKeyDown={(e) => {
if (e.keyCode === 13) {
handleSubmit(onclickSave)();
}
}}
></Input>
</Flex>
<TableContainer>
<Table>
<Thead>
<Tr>
<Th></Th>
<Th></Th>
<Th></Th>
</Tr>
</Thead>
<Tbody>
{accounts.map((item, i) => (
<Tr key={item.id}>
<Td minW={'200px'}>
<Select
{...register(`accounts.${i}.type`, {
required: '类型不能为空'
})}
>
<option value="openai">openai</option>
</Select>
</Td>
<Td minW={'200px'} whiteSpace="pre-wrap" wordBreak={'break-all'}>
<Input
{...register(`accounts.${i}.value`, {
required: '账号不能为空'
})}
></Input>
</Td>
<Td>
<IconButton
aria-label="删除账号"
icon={<DeleteIcon />}
colorScheme={'red'}
onClick={() => {
removeAccount(i);
handleSubmit(onclickSave)();
}}
/>
</Td>
</Tr>
))}
</Tbody>
</Table>
</TableContainer>
</Card>
<Card mt={6} py={4}>
<Flex alignItems={'flex-end'} px={6} mb={1}>
<Box fontSize={'xl'} fontWeight={'bold'}>
</Box>
<Button onClick={onOpenWx} size={'xs'} ml={4} variant={'outline'}>
wx联系
</Button>
</Flex>
<TableContainer maxH={'400px'} overflowY={'auto'} px={6}>
<Table>
<Thead>
<Tr>
<Th></Th>
<Th></Th>
<Th></Th>
<Th></Th>
<Th></Th>
</Tr>
</Thead>
<Tbody fontSize={'sm'}>
{payOrders.map((item) => (
<Tr key={item._id}>
<Td>{item.orderId}</Td>
<Td>
{item.createTime ? dayjs(item.createTime).format('YYYY/MM/DD HH:mm:ss') : '-'}
</Td>
<Td>{formatPrice(item.price)}</Td>
<Td>{item.status}</Td>
<Td>
{item.status === 'NOTPAY' && (
<Button onClick={() => handleRefreshPayOrder(item._id)} size={'sm'}>
</Button>
)}
</Td>
</Tr>
))}
</Tbody>
</Table>
</TableContainer>
</Card>
<Card mt={6} py={4}>
<Box fontSize={'xl'} fontWeight={'bold'} px={6} mb={1}>
使
</Box>
<ScrollData
maxH={'400px'}
px={6}
isLoadAll={isLoadAll}
requesting={requesting}
nextPage={nextPage}
>
<TableContainer>
<Table>
<Thead>
<Tr>
<Th></Th>
<Th></Th>
<Th></Th>
<Th>Tokens </Th>
<Th></Th>
</Tr>
</Thead>
<Tbody fontSize={'sm'}>
{bills.map((item) => (
<Tr key={item.id}>
<Td>{item.time}</Td>
<Td>{BillTypeMap[item.type]}</Td>
<Td>{item.textLen}</Td>
<Td>{item.tokenLen}</Td>
<Td>{item.price}</Td>
</Tr>
))}
</Tbody>
</Table>
</TableContainer>
</ScrollData>
</Card>
{/* 充值记录 */}
<PayRecordTable />
{/* 账单表 */}
<BilTable />
{showPay && <PayModal onClose={() => setShowPay(false)} />}
{/* wx 联系 */}
{isOpenWx && <WxConcat onClose={onCloseWx} />}
</>
);
};

src/pages/openapi/index.tsx Normal file

@@ -0,0 +1,139 @@
import React, { useState } from 'react';
import Link from 'next/link';
import {
Card,
Box,
Button,
Table,
Thead,
Tbody,
Tr,
Th,
Td,
TableContainer,
IconButton,
Modal,
ModalOverlay,
ModalContent,
ModalHeader,
ModalCloseButton,
ModalBody
} from '@chakra-ui/react';
import { getOpenApiKeys, createAOpenApiKey, delOpenApiById } from '@/api/openapi';
import { useQuery, useMutation } from '@tanstack/react-query';
import { useLoading } from '@/hooks/useLoading';
import dayjs from 'dayjs';
import { DeleteIcon } from '@chakra-ui/icons';
import { useCopyData } from '@/utils/tools';
const OpenApi = () => {
const { Loading } = useLoading();
const {
data: apiKeys = [],
isLoading: isGetting,
refetch
} = useQuery(['getOpenApiKeys'], getOpenApiKeys);
const [apiKey, setApiKey] = useState('');
const { copyData } = useCopyData();
const { mutate: onclickCreateApiKey, isLoading: isCreating } = useMutation({
mutationFn: () => createAOpenApiKey(),
onSuccess(res) {
setApiKey(res);
refetch();
}
});
const { mutate: onclickRemove, isLoading: isDeleting } = useMutation({
mutationFn: async (id: string) => delOpenApiById(id),
onSuccess() {
refetch();
}
});
return (
<>
<Card px={6} py={4} position={'relative'}>
<Box fontSize={'xl'} fontWeight={'bold'}>
FastGpt Api
</Box>
<Box fontSize={'sm'} mt={2}>
FastGpt Api Fast Gpt api
Api
Key
</Box>
<Box
my={1}
as="a"
href="https://kjqvjse66l.feishu.cn/docx/DmLedTWtUoNGX8xui9ocdUEjnNh"
color={'blue.800'}
textDecoration={'underline'}
target={'_blank'}
>
</Box>
<TableContainer mt={2} position={'relative'}>
<Table>
<Thead>
<Tr>
<Th>Api Key</Th>
<Th></Th>
<Th>使</Th>
<Th />
</Tr>
</Thead>
<Tbody fontSize={'sm'}>
{apiKeys.map(({ id, apiKey, createTime, lastUsedTime }) => (
<Tr key={id}>
<Td>{apiKey}</Td>
<Td>{dayjs(createTime).format('YYYY/MM/DD HH:mm:ss')}</Td>
<Td>
{lastUsedTime
? dayjs(lastUsedTime).format('YYYY/MM/DD HH:mm:ss')
: '没有使用过'}
</Td>
<Td>
<IconButton
icon={<DeleteIcon />}
size={'xs'}
aria-label={'delete'}
variant={'outline'}
colorScheme={'gray'}
onClick={() => onclickRemove(id)}
/>
</Td>
</Tr>
))}
</Tbody>
</Table>
</TableContainer>
<Button
maxW={'200px'}
mt={5}
isLoading={isCreating}
isDisabled={apiKeys.length >= 5}
title={apiKeys.length >= 5 ? '最多五组 Api Key' : ''}
onClick={() => onclickCreateApiKey()}
>
Api Key
</Button>
<Loading loading={isGetting || isDeleting} fixed={false} />
</Card>
<Modal isOpen={!!apiKey} onClose={() => setApiKey('')}>
<ModalOverlay />
<ModalContent>
<ModalHeader>Api Key</ModalHeader>
<ModalCloseButton />
<ModalBody mb={5}>
Api Key
<Box userSelect={'all'} onClick={() => copyData(apiKey)}>
{apiKey}
</Box>
</ModalBody>
</ModalContent>
</Modal>
</>
);
};
export default OpenApi;


@@ -6,7 +6,38 @@ export const openaiError: Record<string, string> = {
'Too Many Requests': '请求次数太多了,请慢点~',
'Bad Gateway': '网关异常,请重试'
};
export const openaiError2: Record<string, string> = {
insufficient_quota: 'API 余额不足',
invalid_request_error: 'openai 接口异常'
};
export const proxyError: Record<string, boolean> = {
ECONNABORTED: true,
ECONNRESET: true
};
export enum ERROR_ENUM {
unAuthorization = 'unAuthorization',
insufficientQuota = 'insufficientQuota'
}
export const ERROR_RESPONSE: Record<
any,
{
code: number;
statusText: string;
message: string;
data?: any;
}
> = {
[ERROR_ENUM.unAuthorization]: {
code: 403,
statusText: ERROR_ENUM.unAuthorization,
message: '凭证错误',
data: null
},
[ERROR_ENUM.insufficientQuota]: {
code: 403,
statusText: ERROR_ENUM.insufficientQuota,
message: '账号余额不足',
data: null
}
};
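The `ERROR_RESPONSE` table above maps an error identifier to a fixed HTTP payload. A sketch of how a route's error handler might consume it (the table is copied from the hunk; `sendErrorResponse` and the minimal `res` shape are illustrative assumptions, not names from the repo):

```typescript
enum ERROR_ENUM {
  unAuthorization = 'unAuthorization',
  insufficientQuota = 'insufficientQuota'
}

// Copied from the hunk above.
const ERROR_RESPONSE: Record<string, { code: number; statusText: string; message: string; data?: any }> = {
  [ERROR_ENUM.unAuthorization]: { code: 403, statusText: ERROR_ENUM.unAuthorization, message: '凭证错误', data: null },
  [ERROR_ENUM.insufficientQuota]: { code: 403, statusText: ERROR_ENUM.insufficientQuota, message: '账号余额不足', data: null }
};

type MinimalRes = { statusCode: number; body: unknown };

// Look the thrown error's message up in the table; fall back to a 500.
function sendErrorResponse(res: MinimalRes, error: any): MinimalRes {
  const mapped = ERROR_RESPONSE[error?.message];
  if (mapped) {
    res.statusCode = mapped.code;
    res.body = mapped;
  } else {
    res.statusCode = 500;
    res.body = { code: 500, statusText: '', message: error?.message || '服务器异常', data: null };
  }
  return res;
}
```

Throwing `new Error(ERROR_ENUM.unAuthorization)` anywhere in a route then yields the mapped 403 payload.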


@@ -1,14 +1,13 @@
import { DataItem } from '@/service/mongo';
import { getOpenAIApi } from '@/service/utils/chat';
import { httpsAgent, getOpenApiKey } from '@/service/utils/tools';
import { httpsAgent } from '@/service/utils/tools';
import { getOpenApiKey } from '../utils/openai';
import type { ChatCompletionRequestMessage } from 'openai';
import { DataItemSchema } from '@/types/mongoSchema';
import { ChatModelNameEnum } from '@/constants/model';
import { pushSplitDataBill } from '@/service/events/pushBill';
export async function generateAbstract(next = false): Promise<any> {
if (process.env.NODE_ENV === 'development') return;
if (global.generatingAbstract && !next) return;
global.generatingAbstract = true;
@@ -40,7 +39,7 @@ export async function generateAbstract(next = false): Promise<any> {
// 获取 openapi Key
let userApiKey, systemKey;
try {
const key = await getOpenApiKey(dataItem.userId, true);
const key = await getOpenApiKey(dataItem.userId);
userApiKey = key.userApiKey;
systemKey = key.systemKey;
} catch (error: any) {
@@ -77,7 +76,7 @@ export async function generateAbstract(next = false): Promise<any> {
},
{
timeout: 120000,
httpsAgent
httpsAgent: httpsAgent(!userApiKey)
}
);
@@ -85,36 +84,6 @@ export async function generateAbstract(next = false): Promise<any> {
const rawContent: string = abstractResponse?.data.choices[0].message?.content || '';
// 从 content 中提取摘要内容
const splitContents = splitText(rawContent);
// console.log(rawContent);
// 生成词向量
// const vectorResponse = await Promise.allSettled(
// splitContents.map((item) =>
// chatAPI.createEmbedding(
// {
// model: 'text-embedding-ada-002',
// input: item.abstract
// },
// {
// timeout: 120000,
// httpsAgent
// }
// )
// )
// );
// 筛选成功的向量请求
// const vectorSuccessResponse = vectorResponse
// .map((item: any, i) => {
// if (item.status !== 'fulfilled') {
// // 没有词向量的【摘要】不要
// console.log('获取词向量错误: ', item);
// return '';
// }
// return {
// abstract: splitContents[i].abstract,
// abstractVector: item?.value?.data?.data?.[0]?.embedding
// };
// })
// .filter((item) => item);
// 插入数据库,并修改状态
await DataItem.findByIdAndUpdate(dataItem._id, {
@@ -136,7 +105,8 @@ export async function generateAbstract(next = false): Promise<any> {
isPay: !userApiKey && splitContents.length > 0,
userId: dataItem.userId,
type: 'abstract',
text: systemPrompt.content + dataItem.text + rawContent
text: systemPrompt.content + dataItem.text + rawContent,
tokenLen: 0
});
} catch (error: any) {
console.log('error: 生成摘要错误', dataItem?._id);


@@ -1,27 +1,32 @@
import { SplitData, ModelData } from '@/service/mongo';
import { SplitData } from '@/service/mongo';
import { getOpenAIApi } from '@/service/utils/chat';
import { httpsAgent, getOpenApiKey } from '@/service/utils/tools';
import { httpsAgent } from '@/service/utils/tools';
import { getOpenApiKey } from '../utils/openai';
import type { ChatCompletionRequestMessage } from 'openai';
import { ChatModelNameEnum } from '@/constants/model';
import { pushSplitDataBill } from '@/service/events/pushBill';
import { generateVector } from './generateVector';
import { connectRedis } from '../redis';
import { VecModelDataPrefix } from '@/constants/redis';
import { customAlphabet } from 'nanoid';
import { ModelSplitDataSchema } from '@/types/mongoSchema';
const nanoid = customAlphabet('abcdefghijklmnopqrstuvwxyz1234567890', 12);
export async function generateQA(next = false): Promise<any> {
if (global.generatingQA && !next) return;
if (global.generatingQA === true && !next) return;
global.generatingQA = true;
const systemPrompt: ChatCompletionRequestMessage = {
role: 'system',
content: `总结助手。我会向你发送一段长文本,请从中总结出5至15个问题和答案,答案请尽量详细,并按以下格式返回: Q1:\nA1:\nQ2:\nA2:\n`
};
let dataId = null;
try {
const redis = await connectRedis();
// 找出一个需要生成的 dataItem
const dataItem = await SplitData.findOne({
textList: { $exists: true, $ne: [] }
});
const data = await SplitData.aggregate([
{ $match: { textList: { $exists: true, $ne: [] } } },
{ $sample: { size: 1 } }
]);
const dataItem: ModelSplitDataSchema = data[0];
if (!dataItem) {
console.log('没有需要生成 QA 的数据');
@@ -29,15 +34,16 @@ export async function generateQA(next = false): Promise<any> {
return;
}
const text = dataItem.textList[dataItem.textList.length - 1];
if (!text) {
throw new Error('无文本');
}
dataId = dataItem._id;
// 获取 5 个源文本
const textList: string[] = dataItem.textList.slice(-5);
// 获取 openapi Key
let userApiKey, systemKey;
let userApiKey = '',
systemKey = '';
try {
const key = await getOpenApiKey(dataItem.userId, true);
const key = await getOpenApiKey(dataItem.userId);
userApiKey = key.userApiKey;
systemKey = key.systemKey;
} catch (error: any) {
@@ -47,84 +53,133 @@ export async function generateQA(next = false): Promise<any> {
textList: [],
errorText: error.message
});
throw new Error(error?.message);
}
throw new Error('获取 openai key 失败');
}
console.log('正在生成一组QA, ID:', dataItem._id);
console.log(`正在生成一组QA, 包含 ${textList.length} 组文本。ID: ${dataItem._id}`);
const startTime = Date.now();
// 获取 openai 请求实例
const chatAPI = getOpenAIApi(userApiKey || systemKey);
const systemPrompt: ChatCompletionRequestMessage = {
role: 'system',
content: `${
dataItem.prompt || '下面是一段长文本'
},请从中提取出5至30个问题和答案,并按以下格式返回: Q1:\nA1:\nQ2:\nA2:\n`
};
// 请求 chatgpt 获取回答
const response = await chatAPI
.createChatCompletion(
{
model: ChatModelNameEnum.GPT35,
temperature: 0.2,
n: 1,
messages: [
systemPrompt,
const response = await Promise.allSettled(
textList.map((text) =>
chatAPI
.createChatCompletion(
{
role: 'user',
content: text
model: ChatModelNameEnum.GPT35,
temperature: 0.8,
n: 1,
messages: [
systemPrompt,
{
role: 'user',
content: text
}
]
},
{
timeout: 180000,
httpsAgent: httpsAgent(!userApiKey)
}
]
},
{
timeout: 120000,
httpsAgent
}
)
.then((res) => {
const rawContent = res?.data.choices[0].message?.content || ''; // chatgpt 原本的回复
const result = splitText(res?.data.choices[0].message?.content || ''); // 格式化后的QA对
// 计费
pushSplitDataBill({
isPay: !userApiKey && result.length > 0,
userId: dataItem.userId,
type: 'QA',
text: systemPrompt.content + text + rawContent,
tokenLen: res.data.usage?.total_tokens || 0
});
return {
rawContent,
result
};
})
)
.then((res) => ({
rawContent: res?.data.choices[0].message?.content || '',
result: splitText(res?.data.choices[0].message?.content || '')
})); // 从 content 中提取 QA
);
// 获取成功的回答
const successResponse: {
rawContent: string;
result: {
q: string;
a: string;
}[];
}[] = response.filter((item) => item.status === 'fulfilled').map((item: any) => item.value);
const resultList = successResponse.map((item) => item.result).flat();
await Promise.allSettled([
SplitData.findByIdAndUpdate(dataItem._id, { $pop: { textList: 1 } }),
ModelData.insertMany(
response.result.map((item) => ({
modelId: dataItem.modelId,
userId: dataItem.userId,
text: item.a,
q: [
{
id: nanoid(),
text: item.q
}
],
status: 1
}))
)
SplitData.findByIdAndUpdate(dataItem._id, {
textList: dataItem.textList.slice(0, -5)
}), // 删掉后5个数据
...resultList.map((item) => {
// 插入 redis
return redis.sendCommand([
'HMSET',
`${VecModelDataPrefix}:${nanoid()}`,
'userId',
String(dataItem.userId),
'modelId',
String(dataItem.modelId),
'q',
item.q,
'text',
item.a,
'status',
'waiting'
]);
})
]);
console.log(
'生成QA成功time:',
`${(Date.now() - startTime) / 1000}s`,
'QA数量',
response.result.length
resultList.length
);
// 计费
pushSplitDataBill({
isPay: !userApiKey && response.result.length > 0,
userId: dataItem.userId,
type: 'QA',
text: systemPrompt.content + text + response.rawContent
});
generateQA(true);
generateVector(true);
generateVector();
} catch (error: any) {
console.log(error);
console.log('生成QA错误:', error?.response);
// log
if (error?.response) {
console.log('openai error: 生成QA错误');
console.log(error.response?.status, error.response?.statusText, error.response?.data);
} else {
console.log('生成QA错误:', error);
}
if (dataId && error?.response?.data?.error?.type === 'insufficient_quota') {
console.log('api 余额不足');
await SplitData.findByIdAndUpdate(dataId, {
textList: [],
errorText: 'api 余额不足'
});
generateQA(true);
return;
}
setTimeout(() => {
generateQA(true);
}, 5000);
}, 4000);
}
}
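The `generateQA` rewrite above fires up to five completions with `Promise.allSettled`, keeps only the fulfilled results, and flattens the extracted QA pairs. The batching pattern in isolation (the tasks here stand in for the `chatAPI.createChatCompletion` calls):

```typescript
// Run all tasks, tolerate individual failures, and flatten the successful
// result arrays — the same fulfilled-filtering pattern generateQA uses.
async function settleAndFlatten<T>(tasks: Promise<T[]>[]): Promise<T[]> {
  const settled = await Promise.allSettled(tasks);
  return settled
    .filter((item): item is PromiseFulfilledResult<T[]> => item.status === 'fulfilled')
    .map((item) => item.value)
    .flat();
}
```

One failed request only loses that request's QA pairs instead of aborting the whole batch, which is why the billing call sits inside each task in the hunk above.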
@@ -132,7 +187,7 @@ export async function generateQA(next = false): Promise<any> {
* 检查文本是否按格式返回
*/
function splitText(text: string) {
const regex = /Q\d+:(\s*)(.*)(\s*)A\d+:(\s*)(.*)(\s*)/g; // 匹配Q和A的正则表达式
const regex = /Q\d+:(\s*)(.*)(\s*)A\d+:(\s*)([\s\S]*?)(?=Q|$)/g; // 匹配Q和A的正则表达式
const matches = text.matchAll(regex); // 获取所有匹配到的结果
const result = []; // 存储最终的结果
@@ -140,7 +195,11 @@ function splitText(text: string) {
const q = match[2];
const a = match[5];
if (q && a) {
result.push({ q, a }); // 如果Q和A都存在就将其添加到结果中
// 如果Q和A都存在就将其添加到结果中
result.push({
q,
a: a.trim().replace(/\n\s*/g, '\n')
});
}
}
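The regex change above (`(?=Q|$)` lookahead instead of a single-line `(.*)`) is what lets an answer span multiple lines. A self-contained version of the improved `splitText`, with the same answer trimming as the hunk:

```typescript
// Extract Q/A pairs from "Q1:\nA1:\nQ2:\nA2:" formatted text. The lazy
// [\s\S]*? plus the (?=Q|$) lookahead captures multi-line answers; trailing
// whitespace is trimmed and internal newline indentation collapsed.
function splitText(text: string): { q: string; a: string }[] {
  const regex = /Q\d+:(\s*)(.*)(\s*)A\d+:(\s*)([\s\S]*?)(?=Q|$)/g;
  const result: { q: string; a: string }[] = [];
  let match: RegExpExecArray | null;
  while ((match = regex.exec(text)) !== null) {
    const q = match[2];
    const a = match[5];
    if (q && a) {
      result.push({ q, a: a.trim().replace(/\n\s*/g, '\n') });
    }
  }
  return result;
}
```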


@@ -1,93 +1,106 @@
import { getOpenAIApi } from '@/service/utils/chat';
import { httpsAgent } from '@/service/utils/tools';
import { ModelData } from '../models/modelData';
import { connectRedis } from '../redis';
import { VecModelDataIndex } from '@/constants/redis';
import { VecModelDataIdx } from '@/constants/redis';
import { vectorToBuffer } from '@/utils/tools';
import { ModelDataStatusEnum } from '@/constants/redis';
import { openaiCreateEmbedding, getOpenApiKey } from '../utils/openai';
export async function generateVector(next = false): Promise<any> {
if (global.generatingVector && !next) return;
global.generatingVector = true;
let dataId = null;
try {
const redis = await connectRedis();
// 找出一个需要生成的 dataItem
const dataItem = await ModelData.findOne({
status: { $ne: 0 }
});
// 找出一个 status = waiting 的数据
const searchRes = await redis.ft.search(
VecModelDataIdx,
`@status:{${ModelDataStatusEnum.waiting}}`,
{
RETURN: ['q', 'userId'],
LIMIT: {
from: 0,
size: 1
}
}
);
if (!dataItem) {
if (searchRes.total === 0) {
console.log('no data pending vector generation');
global.generatingVector = false;
return;
}
const dataItem: { id: string; q: string; userId: string } = {
id: searchRes.documents[0].id,
q: String(searchRes.documents[0]?.value?.q || ''),
userId: String(searchRes.documents[0]?.value?.userId || '')
};
dataId = dataItem.id;
// get the openapi key
const openAiKey = process.env.OPENAIKEY as string;
let userApiKey, systemKey;
try {
const res = await getOpenApiKey(dataItem.userId);
userApiKey = res.userApiKey;
systemKey = res.systemKey;
} catch (error: any) {
if (error?.code === 501) {
await redis.del(dataItem.id);
generateVector(true);
return;
}
// get an openai request instance
const chatAPI = getOpenAIApi(openAiKey);
const dataId = String(dataItem._id);
throw new Error('failed to get openai key');
}
// generate embeddings
const response = await Promise.allSettled(
dataItem.q.map((item, i) =>
chatAPI
.createEmbedding(
{
model: 'text-embedding-ada-002',
input: item.text
},
{
timeout: 120000,
httpsAgent
}
)
.then((res) => res?.data?.data?.[0]?.embedding || [])
.then((vector) =>
redis.sendCommand([
'HMSET',
`${VecModelDataIndex}:${item.id}`,
'vector',
vectorToBuffer(vector),
'modelId',
String(dataItem.modelId),
'dataId',
String(dataId)
])
)
)
);
if (response.filter((item) => item.status === 'fulfilled').length === 0) {
throw new Error(JSON.stringify(response));
}
// update the record status
await ModelData.findByIdAndUpdate(dataItem._id, {
status: 0
const { vector } = await openaiCreateEmbedding({
text: dataItem.q,
userId: dataItem.userId,
isPay: !userApiKey,
apiKey: userApiKey || systemKey
});
console.log(`vector generated: ${dataItem._id}`);
// update vector and status data in redis
await redis.sendCommand([
'HMSET',
dataItem.id,
'vector',
vectorToBuffer(vector),
'status',
ModelDataStatusEnum.ready
]);
setTimeout(() => {
generateVector(true);
}, 3000);
console.log(`vector generated: ${dataItem.id}`);
generateVector(true);
} catch (error: any) {
console.log(error);
console.log('error: vector generation failed', error?.response?.data);
// log
if (error?.response) {
console.log('openai error: vector generation failed');
console.log(error.response?.status, error.response?.statusText, error.response?.data);
} else {
console.log('vector generation failed:', error);
}
if (dataId && error?.response?.data?.error?.type === 'insufficient_quota') {
console.log('insufficient api balance, deleting redis model data');
const redis = await connectRedis();
redis.del(dataId);
generateVector(true);
return;
}
if (error?.response?.statusText === 'Too Many Requests') {
console.log('rate limited, retrying in 1 minute');
console.log('vector generation rate limited, retrying in 1 minute');
// rate limited; retry after 1 minute
setTimeout(() => {
generateVector(true);
}, 60000);
return;
}
setTimeout(() => {
generateVector(true);
}, 3000);
}, 2000);
}
}
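The HMSET calls above write the embedding through `vectorToBuffer` before storing it. The helper's implementation is not shown in this diff; a hypothetical sketch of what it likely does, given that RediSearch vector fields expect raw little-endian float32 bytes:

```typescript
// Hypothetical sketch of a vectorToBuffer helper (the real one is imported
// from @/utils/tools and not shown here): pack a number[] embedding into
// raw little-endian float32 bytes for a RediSearch vector field.
function vectorToBuffer(vector: number[]): Buffer {
  const buf = Buffer.alloc(vector.length * 4); // 4 bytes per float32
  vector.forEach((v, i) => buf.writeFloatLE(v, i * 4));
  return buf;
}

const packed = vectorToBuffer([0.5, -1.25, 2]);
```

Storing the vector as a binary blob in the hash is what lets RediSearch run KNN queries over it directly, without re-parsing JSON arrays.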


@@ -1,7 +1,7 @@
import { connectToDatabase, Bill, User } from '../mongo';
import { modelList, ChatModelNameEnum } from '@/constants/model';
import { encode } from 'gpt-token-utils';
import { formatPrice } from '@/utils/user';
import { BillTypeEnum } from '@/constants/user';
import type { DataType } from '@/types/data';
export const pushChatBill = async ({
@@ -14,17 +14,16 @@ export const pushChatBill = async ({
isPay: boolean;
modelName: string;
userId: string;
chatId: string;
chatId?: string;
text: string;
}) => {
let billId;
try {
// count tokens
const tokens = encode(text);
const tokens = Math.floor(encode(text).length * 0.7);
console.log('text len: ', text.length);
console.log('token len:', tokens.length);
console.log(`chat generate success. text len: ${text.length}. token len: ${tokens}`);
if (isPay) {
await connectToDatabase();
@@ -33,8 +32,7 @@ export const pushChatBill = async ({
const modelItem = modelList.find((item) => item.model === modelName);
// compute price
const unitPrice = modelItem?.price || 5;
const price = unitPrice * tokens.length;
console.log(`chat bill, unit price: ${unitPrice}, price: ${formatPrice(price)}`);
const price = unitPrice * tokens;
try {
// insert Bill record
@@ -44,7 +42,7 @@ export const pushChatBill = async ({
modelName,
chatId,
textLen: text.length,
tokenLen: tokens.length,
tokenLen: tokens,
price
});
billId = res._id;
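The hunk above changes the token count from the raw `encode(text).length` to 70% of it, and multiplies by the model's per-token unit price. A small illustration of that billing math (the helper names are hypothetical, not part of the codebase):

```typescript
// Illustration of the chat-billing math above: the token count is
// approximated as 70% of the gpt-token-utils encode length (floored),
// and the price is the per-token unit price times that count.
function estimateTokens(encodedLen: number): number {
  return Math.floor(encodedLen * 0.7);
}

function chatPrice(unitPrice: number, tokens: number): number {
  return unitPrice * tokens;
}

const tokens = estimateTokens(100); // 70
const price = chatPrice(5, tokens); // default unit price 5 from the diff
```

Scaling the encoded length down by a fixed 0.7 is a rough discount rather than an exact token count; the exact rationale is not stated in the diff.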
@@ -66,11 +64,13 @@ export const pushChatBill = async ({
export const pushSplitDataBill = async ({
isPay,
userId,
tokenLen,
text,
type
}: {
isPay: boolean;
userId: string;
tokenLen: number;
text: string;
type: DataType;
}) => {
@@ -79,21 +79,15 @@ export const pushSplitDataBill = async ({
let billId;
try {
// count tokens
const tokens = encode(text);
console.log('text len: ', text.length);
console.log('token len:', tokens.length);
console.log(`splitData generate success. text len: ${text.length}. token len: ${tokenLen}`);
if (isPay) {
try {
// get the model unit price (splitting always uses gpt35)
const modelItem = modelList.find((item) => item.model === ChatModelNameEnum.GPT35);
const unitPrice = modelItem?.price || 5;
const unitPrice = modelItem?.price || 3;
// compute price
const price = unitPrice * tokens.length;
console.log(`splitData bill, price: ${formatPrice(price)}`);
const price = unitPrice * tokenLen;
// insert Bill record
const res = await Bill.create({
@@ -101,7 +95,57 @@ export const pushSplitDataBill = async ({
type,
modelName: ChatModelNameEnum.GPT35,
textLen: text.length,
tokenLen: tokens.length,
tokenLen,
price
});
billId = res._id;
// deduct from account balance
await User.findByIdAndUpdate(userId, {
$inc: { balance: -price }
});
} catch (error) {
console.log('failed to create bill:', error);
billId && Bill.findByIdAndDelete(billId);
}
}
} catch (error) {
console.log(error);
}
};
export const pushGenerateVectorBill = async ({
isPay,
userId,
text,
tokenLen
}: {
isPay: boolean;
userId: string;
text: string;
tokenLen: number;
}) => {
await connectToDatabase();
let billId;
try {
console.log(`vector generate success. text len: ${text.length}. token len: ${tokenLen}`);
if (isPay) {
try {
const unitPrice = 0.4;
// compute price; minimum of 1
let price = unitPrice * tokenLen;
price = price > 1 ? price : 1;
// insert Bill record
const res = await Bill.create({
userId,
type: BillTypeEnum.vector,
modelName: ChatModelNameEnum.VECTOR,
textLen: text.length,
tokenLen,
price
});
billId = res._id;
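The vector bill above uses a flat unit price of 0.4 per token with a floor of 1, so even tiny embedding requests are billed at least one unit. A minimal sketch of that rule (the function name is hypothetical):

```typescript
// Sketch of the minimum-price rule in pushGenerateVectorBill above:
// unit price 0.4 per token, floored at 1 (the smallest billable unit).
function vectorPrice(tokenLen: number, unitPrice = 0.4): number {
  const price = unitPrice * tokenLen;
  return price > 1 ? price : 1;
}
```

Without the floor, short texts (one or two tokens) would round to fractional amounts smaller than the system's billing granularity.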
