Compare commits

..

37 Commits

Author SHA1 Message Date
archer f56a339ad1 perf: index 2023-06-25 16:05:43 +08:00
archer 68eca25df4 fix: ssr close 2023-06-25 14:16:54 +08:00
archer 9eed321471 perf: docker-compose 2023-06-25 13:41:44 +08:00
archer 426176db47 fix: apikey 2023-06-25 13:20:00 +08:00
archer cfb31afbd9 fix: select ui;perf: max link and compose 2023-06-25 10:52:58 +08:00
archer 5be57da407 fix: v1 api 2023-06-24 21:39:34 +08:00
archer 057c3411b9 perf: fetch error 2023-06-24 21:21:53 +08:00
archer 83d755ad0e feat: limit prompt 2023-06-24 18:55:46 +08:00
JustSong ec9852fc63 docs: update README (#103) 2023-06-24 00:38:08 +08:00
archer 4e6f8aefe8 docs 2023-06-23 23:40:07 +08:00
archer 11352b754a fix: model 2023-06-23 23:21:59 +08:00
archer 965ad34283 docs 2023-06-23 23:16:49 +08:00
archer 986206b691 perf: sse response 2023-06-23 23:11:22 +08:00
archer 6787f19d78 feat: price 2023-06-23 18:05:53 +08:00
archer 64c35eaa3a docs 2023-06-23 17:43:14 +08:00
archer 41ada6ecda perf: keys 2023-06-23 17:12:52 +08:00
archer ae1f7a888e perf: token count;feat: chunk size 2023-06-23 15:08:30 +08:00
archer 9aace871ff fix: ssr 2023-06-21 18:04:36 +08:00
moonrailgun 39739f9305 chore: fix admin build problem (#101) 2023-06-21 15:40:51 +08:00
archer ce757d918b fix: ssr 2023-06-21 15:22:07 +08:00
archer d592d4e99a markdown 2023-06-21 15:22:06 +08:00
moonrailgun 11ce10cd80 feat: add zh translation and change title (#100) 2023-06-21 15:21:24 +08:00
archer 6fb312ccfd link text 2023-06-20 10:41:17 +08:00
archer 3166376173 fix: template 2023-06-20 10:40:49 +08:00
archer a02a528737 perf: my models 2023-06-19 21:08:32 +08:00
archer dd4ca27dc7 perf: deploy 2023-06-19 20:00:54 +08:00
archer f2d37c30a5 feat: baidu statistic 2023-06-19 17:28:25 +08:00
archer 1d236f87ae perf: markdown redraw 2023-06-19 16:50:14 +08:00
archer 3b515c3c2d fix: choices empty 2023-06-19 11:30:26 +08:00
archer e95f83ec8e docs 2023-06-18 23:52:40 +08:00
archer 03793c66da README 2023-06-18 23:25:16 +08:00
archer 84daf85393 fix: base url 2023-06-18 22:38:55 +08:00
archer 6c62d80a4c fix: refresh page 2023-06-18 22:19:49 +08:00
archer ff2043c0fb feat: maxToken setting 2023-06-18 21:23:36 +08:00
archer ee9afa310a feat: openapi v2 chat 2023-06-18 21:06:07 +08:00
archer 2b93ae2d00 fix: time conf 2023-06-17 21:53:04 +08:00
archer 00c93a63cd perf: queue link 2023-06-17 21:27:44 +08:00
99 changed files with 4165 additions and 3170 deletions


@@ -1,6 +1,6 @@
# Fast GPT
Fast GPT lets you use your own OpenAI API key to quickly call the OpenAI API. It currently integrates GPT-3.5, GPT-4, and embeddings, and lets you build your own knowledge base.
Fast GPT lets you use your own OpenAI API key to quickly call the OpenAI API. It currently integrates GPT-3.5, GPT-4, and embeddings, and lets you build your own knowledge base. In addition, the OpenAPI Chat endpoint is compatible with the OpenAI API, so you only need to change the BaseUrl and Authorization to plug FastGpt into an existing project.
## 🛸 Online demo
@@ -44,9 +44,10 @@ Fast GPT lets you use your own OpenAI API key to quickly call the OpenAI API
## Powered by
- [TuShan, build an admin dashboard in 5 minutes](https://github.com/msgbyte/tushan)
- [Laf, connect third-party apps in 3 minutes](https://github.com/labring/laf)
- [Sealos, quickly deploy cluster applications](https://github.com/labring/sealos)
- [TuShan: build an admin dashboard in 5 minutes](https://github.com/msgbyte/tushan)
- [Laf: connect third-party apps in 3 minutes](https://github.com/labring/laf)
- [Sealos: quickly deploy cluster applications](https://github.com/labring/sealos)
- [One API: token management & redistribution, with Azure support](https://github.com/songquanpeng/one-api)
## 🌟 Star History
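The OpenAI-compatible OpenAPI Chat interface described in the README change above means an existing OpenAI integration can be retargeted by changing only the base URL and the Authorization credential. A minimal sketch of that idea — the host and `/api/openapi` prefix below are placeholders, and the `/v1/chat/completions` path is assumed from the OpenAI API shape rather than taken from this repo:

```typescript
// Sketch only: the FastGpt host and path prefix are hypothetical.
interface ChatRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

function buildChatRequest(baseUrl: string, apiKey: string, prompt: string): ChatRequest {
  return {
    // Strip a trailing slash so the joined URL stays well-formed.
    url: `${baseUrl.replace(/\/$/, "")}/v1/chat/completions`,
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ messages: [{ role: "user", content: prompt }] }),
  };
}

// Only these two arguments differ between OpenAI and a FastGpt deployment.
const req = buildChatRequest("https://fastgpt.example.com/api/openapi/", "my-key", "hello");
```

The request body and path stay identical to a plain OpenAI call, which is the whole point of the compatibility claim.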

admin/.gitignore (vendored, 2 changes)

@@ -1 +1 @@
node_modules/
node_modules/


@@ -2,13 +2,14 @@
## How it works
Uses the tushan project as the frontend, plus an API that talks to MongoDB as the backend; it supports creating, updating, and deleting users
Uses the [Tushan](https://tushan.msgbyte.com/) project as the frontend, plus an API that talks to MongoDB as the backend; it supports creating, updating, and deleting users
## Development
1. Copy the .env.template file and fill in the environment variables
2. pnpm i
3. pnpm dev
1. `cp .env.template .env.local`: copy the .env.template file and fill in the environment variables
2. `pnpm i`
3. `pnpm dev`
4. Open `http://localhost:5173/` to view the frontend
## Deployment
@@ -25,7 +26,8 @@ MONGODB_NAME=fastgpt
ADMIN_USER=username
ADMIN_PASS=password
ADMIN_SECRET=any
VITE_PUBLIC_SERVER_URL=http://localhost:3001 # must match server.js
PARENT_URL=http://localhost:3000
PARENT_ROOT_KEY=rootkey
```
## Deploying on sealos
@@ -33,7 +35,7 @@ VITE_PUBLIC_SERVER_URL=http://localhost:3001 # must match server.js
1. Go to the sealos website: https://cloud.sealos.io/
2. Open the App Launchpad tool
3. Create a new app
1. Image name: registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt-admin:latest
1. Image name: `registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt-admin:latest`
2. Container port: 3001
3. Environment variables: see above
4. Enable public (external) network access


@@ -4,7 +4,7 @@
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/logo.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Tushan</title>
<title>Fast GPT</title>
</head>
<body>
<div id="root"></div>


@@ -25,7 +25,7 @@
"react-admin": "^4.11.0",
"react-dom": "^18.2.0",
"react-i18next": "^12.3.1",
"tushan": "^0.2.23"
"tushan": "^0.2.30"
},
"devDependencies": {
"@types/jsonexport": "^3.0.2",

admin/pnpm-lock.yaml (generated, 3621 changes)

File diff suppressed because it is too large.


@@ -86,18 +86,6 @@ const modelSchema = new mongoose.Schema({
});
const SystemSchema = new mongoose.Schema({
openAIKeys: {
type: String,
default: ''
},
openAITrainingKeys: {
type: String,
default: ''
},
gpt4Key: {
type: String,
default: ''
},
vectorMaxProcess: {
type: Number,
default: 10


@@ -4,16 +4,19 @@ import {
ListTable,
Resource,
Tushan,
fetchJSON
fetchJSON,
TushanContextProps,
HTTPClient
} from 'tushan';
import { authProvider } from './auth';
import { userFields, payFields, kbFields, ModelFields, SystemFields } from './fields';
import { Dashboard } from './Dashboard';
import { IconUser, IconApps, IconBook, IconStamp } from 'tushan/icon';
import { i18nZhTranslation } from 'tushan/client/i18n/resources/zh';
const authStorageKey = 'tushan:auth';
const httpClient: typeof fetchJSON = (url, options = {}) => {
const httpClient: HTTPClient = (url, options = {}) => {
try {
if (!options.headers) {
options.headers = new Headers({ Accept: 'application/json' });
@@ -29,11 +32,22 @@ const httpClient: typeof fetchJSON = (url, options = {}) => {
const dataProvider = jsonServerProvider(import.meta.env.VITE_PUBLIC_SERVER_URL, httpClient);
const i18n: TushanContextProps['i18n'] = {
languages: [
{
key: 'zh',
label: '简体中文',
translation: i18nZhTranslation
}
]
};
function App() {
return (
<Tushan
basename="/"
header={'FastGpt-Admin'}
i18n={i18n}
dataProvider={dataProvider}
authProvider={authProvider}
dashboard={<Dashboard />}
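The `httpClient` change above retypes the client as `HTTPClient` and injects a default `Accept` header before delegating. A standalone sketch of that wrapping pattern — `FetchLike` is a local stand-in, not tushan's real type, and `getToken` is an assumed hook for wherever the auth credential is stored:

```typescript
// Local stand-in for a fetch-style client; not tushan's actual HTTPClient type.
type FetchLike = (url: string, options: { headers?: Headers }) => unknown;

function withDefaultHeaders(inner: FetchLike, getToken: () => string | null): FetchLike {
  return (url, options = {}) => {
    if (!options.headers) {
      // Mirror the diff above: default to accepting JSON when no headers are set.
      options.headers = new Headers({ Accept: "application/json" });
    }
    const token = getToken();
    if (token) {
      options.headers.set("Authorization", token);
    }
    return inner(url, options);
  };
}

// Every request made through `client` now carries the default headers.
const seen: string[] = [];
const client = withDefaultHeaders((_url, options) => {
  seen.push(options.headers?.get("Accept") ?? "");
  seen.push(options.headers?.get("Authorization") ?? "");
  return null;
}, () => "token-123");
client("/api/users", {});
```

Wrapping the data provider's transport this way keeps auth concerns out of every individual resource call.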


@@ -45,9 +45,6 @@ export const ModelFields = [
];
export const SystemFields = [
createTextField('openAIKeys', { label: 'openAIKeys逗号隔开' }),
createTextField('openAITrainingKeys', { label: 'openAITrainingKeys' }),
createTextField('gpt4Key', { label: 'gpt4Key' }),
createTextField('vectorMaxProcess', { label: '向量最大进程' }),
createTextField('qaMaxProcess', { label: 'qa最大进程' }),
createTextField('pgIvfflatProbe', { label: 'pg 探针数量' }),


@@ -1,5 +1,7 @@
# Run port. If you are not running on port 3000, change this. Note: changing this variable does not itself switch the port; you set it because the app is already running on another port.
PORT=3000
# database max connections
DB_MAX_LINK=5
# Proxy
# AXIOS_PROXY_HOST=127.0.0.1
# AXIOS_PROXY_PORT=7890
@@ -15,12 +17,12 @@ aliTemplateCode=xxxx
TOKEN_KEY=dfdasfdas
# root key, highest privilege
ROOT_KEY=fdafasd
# openai
# OPENAI_BASE_URL=http://ai.openai.com/v1
# OPENAI_BASE_URL_AUTH=optional security credential, placed in header.auth
OPENAIKEY=sk-xxx
OPENAI_TRAINING_KEY=sk-xxx
GPT4KEY=sk-xxx
# Use oneapi
# ONEAPI_URL=https://xxxx.cloud.sealos.io/v1
# ONEAPI_KEY=sk-xxxx
# Base URL for openai (outside mainland China this can be ignored; it defaults to api.openai.com). If you are not using oneapi, the 2 parameters below are required; user-supplied keys also go through them.
OPENAI_BASE_URL=https://xxxx.cloud.sealos.io/openai/v1
OPENAIKEY=sk-xxxx
# db
MONGODB_URI=mongodb://username:password@0.0.0.0:27017/?authSource=admin
MONGODB_NAME=fastgpt
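The comments in the env template above imply a fallback order for the OpenAI base address: a configured oneapi endpoint, otherwise OPENAI_BASE_URL, otherwise the official api.openai.com. A hedged sketch of that selection — the precedence is read from the comments, not from this repo's actual code:

```typescript
// Assumed precedence; not confirmed project logic.
interface OpenAIEnv {
  ONEAPI_URL?: string;
  OPENAI_BASE_URL?: string;
}

function resolveBaseUrl(env: OpenAIEnv): string {
  if (env.ONEAPI_URL) return env.ONEAPI_URL; // oneapi proxy takes priority
  if (env.OPENAI_BASE_URL) return env.OPENAI_BASE_URL; // self-hosted relay
  return "https://api.openai.com/v1"; // official default
}

const viaOneApi = resolveBaseUrl({ ONEAPI_URL: "https://xxxx.cloud.sealos.io/v1" });
const viaDefault = resolveBaseUrl({});
```

Centralizing this choice in one function means user-supplied keys and server keys route through the same endpoint, as the template comment suggests.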


@@ -26,32 +26,31 @@
"crypto": "^1.0.1",
"date-fns": "^2.30.0",
"dayjs": "^1.11.7",
"eventsource-parser": "^0.1.0",
"formidable": "^2.1.1",
"framer-motion": "^9.0.6",
"graphemer": "^1.4.0",
"hyperdown": "^2.4.29",
"immer": "^9.0.19",
"jsonwebtoken": "^9.0.0",
"lodash": "^4.17.21",
"mammoth": "^1.5.1",
"mermaid": "^8.13.5",
"mermaid": "^10.2.3",
"mongoose": "^6.10.0",
"nanoid": "^4.0.1",
"next": "13.1.6",
"nextjs-cors": "^2.1.2",
"nodemailer": "^6.9.1",
"nprogress": "^0.2.0",
"openai": "^3.2.1",
"openai": "^3.3.0",
"papaparse": "^5.4.1",
"pg": "^8.10.0",
"react": "18.2.0",
"react-day-picker": "^8.7.1",
"react-dom": "18.2.0",
"react-hook-form": "^7.43.1",
"react-markdown": "^8.0.5",
"react-markdown": "^8.0.7",
"react-syntax-highlighter": "^15.5.0",
"rehype-katex": "^6.0.2",
"remark-breaks": "^3.0.3",
"remark-gfm": "^3.0.1",
"remark-math": "^5.1.1",
"request-ip": "^3.3.0",

client/pnpm-lock.yaml (generated, 542 changes)

@@ -56,18 +56,12 @@ dependencies:
dayjs:
specifier: ^1.11.7
version: registry.npmmirror.com/dayjs@1.11.7
eventsource-parser:
specifier: ^0.1.0
version: registry.npmmirror.com/eventsource-parser@0.1.0
formidable:
specifier: ^2.1.1
version: registry.npmmirror.com/formidable@2.1.1
framer-motion:
specifier: ^9.0.6
version: registry.npmmirror.com/framer-motion@9.0.6(react-dom@18.2.0)(react@18.2.0)
graphemer:
specifier: ^1.4.0
version: registry.npmmirror.com/graphemer@1.4.0
hyperdown:
specifier: ^2.4.29
version: registry.npmmirror.com/hyperdown@2.4.29
@@ -84,8 +78,8 @@ dependencies:
specifier: ^1.5.1
version: registry.npmmirror.com/mammoth@1.5.1
mermaid:
specifier: ^8.13.5
version: registry.npmmirror.com/mermaid@8.13.5
specifier: ^10.2.3
version: registry.npmmirror.com/mermaid@10.2.3
mongoose:
specifier: ^6.10.0
version: registry.npmmirror.com/mongoose@6.10.0
@@ -105,8 +99,8 @@ dependencies:
specifier: ^0.2.0
version: registry.npmmirror.com/nprogress@0.2.0
openai:
specifier: ^3.2.1
version: registry.npmmirror.com/openai@3.2.1
specifier: ^3.3.0
version: registry.npmmirror.com/openai@3.3.0
papaparse:
specifier: ^5.4.1
version: registry.npmmirror.com/papaparse@5.4.1
@@ -126,14 +120,17 @@ dependencies:
specifier: ^7.43.1
version: registry.npmmirror.com/react-hook-form@7.43.1(react@18.2.0)
react-markdown:
specifier: ^8.0.5
version: registry.npmmirror.com/react-markdown@8.0.5(@types/react@18.0.28)(react@18.2.0)
specifier: ^8.0.7
version: registry.npmmirror.com/react-markdown@8.0.7(@types/react@18.0.28)(react@18.2.0)
react-syntax-highlighter:
specifier: ^15.5.0
version: registry.npmmirror.com/react-syntax-highlighter@15.5.0(react@18.2.0)
rehype-katex:
specifier: ^6.0.2
version: registry.npmmirror.com/rehype-katex@6.0.2
remark-breaks:
specifier: ^3.0.3
version: registry.npmmirror.com/remark-breaks@3.0.3
remark-gfm:
specifier: ^3.0.1
version: registry.npmmirror.com/remark-gfm@3.0.1
@@ -2923,11 +2920,10 @@ packages:
'@babel/helper-validator-identifier': registry.npmmirror.com/@babel/helper-validator-identifier@7.22.5
to-fast-properties: registry.npmmirror.com/to-fast-properties@2.0.0
registry.npmmirror.com/@braintree/sanitize-url@3.1.0:
resolution: {integrity: sha512-GcIY79elgB+azP74j8vqkiXz8xLFfIzbQJdlwOPisgbKT00tviJQuEghOXSMVxJ00HoYJbGswr4kcllUc4xCcg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/@braintree/sanitize-url/-/sanitize-url-3.1.0.tgz}
registry.npmmirror.com/@braintree/sanitize-url@6.0.2:
resolution: {integrity: sha512-Tbsj02wXCbqGmzdnXNk0SOF19ChhRU70BsroIi4Pm6Ehp56in6vch94mfbdQ17DozxkL3BAVjbZ4Qc1a0HFRAg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/@braintree/sanitize-url/-/sanitize-url-6.0.2.tgz}
name: '@braintree/sanitize-url'
version: 3.1.0
deprecated: Potential XSS vulnerability patched in v6.0.0.
version: 6.0.2
dev: false
registry.npmmirror.com/@chakra-ui/accordion@2.2.0(@chakra-ui/system@2.5.8)(framer-motion@9.0.6)(react@18.2.0):
@@ -6109,12 +6105,6 @@ packages:
version: 2.0.3
dev: false
registry.npmmirror.com/commander@2.20.3:
resolution: {integrity: sha512-GpVkmM8vF2vQUkj2LvZmD35JxeJOLCwJ9cUkugyk2nuhbv3+mJvpLYYt+0+USMxE+oj+ey/lJEnhZw75x/OMcQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/commander/-/commander-2.20.3.tgz}
name: commander
version: 2.20.3
dev: false
registry.npmmirror.com/commander@7.2.0:
resolution: {integrity: sha512-QrWXB+ZQSVPmIWIhtEO9H+gwHaMGYiF5ChvoJ+K9ZGHG/sVsa6yiesAD1GC/x46sET00Xlwo1u49RVVVzvcSkw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/commander/-/commander-7.2.0.tgz}
name: commander
@@ -6197,6 +6187,22 @@ packages:
vary: registry.npmmirror.com/vary@1.1.2
dev: false
registry.npmmirror.com/cose-base@1.0.3:
resolution: {integrity: sha512-s9whTXInMSgAp/NVXVNuVxVKzGH2qck3aQlVHxDCdAEPgtMKwc4Wq6/QKhgdEdgbLSi9rBTAcPoRa6JpiG4ksg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/cose-base/-/cose-base-1.0.3.tgz}
name: cose-base
version: 1.0.3
dependencies:
layout-base: registry.npmmirror.com/layout-base@1.0.2
dev: false
registry.npmmirror.com/cose-base@2.2.0:
resolution: {integrity: sha512-AzlgcsCbUMymkADOJtQm3wO9S3ltPfYOFD5033keQn9NJzIbtnZj+UdBJe7DYml/8TdbtHJW3j58SOnKhWY/5g==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/cose-base/-/cose-base-2.2.0.tgz}
name: cose-base
version: 2.2.0
dependencies:
layout-base: registry.npmmirror.com/layout-base@2.0.1
dev: false
registry.npmmirror.com/cosmiconfig@7.1.0:
resolution: {integrity: sha512-AdmX6xUzdNASswsFtmwSt7Vj8po9IuqXm0UXz7QKPuEUmPB4XyjGfaAr2PSuELMwkRMVH1EpIkX5bTZGRB3eCA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/cosmiconfig/-/cosmiconfig-7.1.0.tgz}
name: cosmiconfig
@@ -6278,10 +6284,38 @@ packages:
name: csstype
version: 3.1.2
registry.npmmirror.com/d3-array@1.2.4:
resolution: {integrity: sha512-KHW6M86R+FUPYGb3R5XiYjXPq7VzwxZ22buHhAEVG5ztoEcZZMLov530mmccaqA1GghZArjQV46fuc8kUqhhHw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-array/-/d3-array-1.2.4.tgz}
name: d3-array
version: 1.2.4
registry.npmmirror.com/cytoscape-cose-bilkent@4.1.0(cytoscape@3.25.0):
resolution: {integrity: sha512-wgQlVIUJF13Quxiv5e1gstZ08rnZj2XaLHGoFMYXz7SkNfCDOOteKBE6SYRfA9WxxI/iBc3ajfDoc6hb/MRAHQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/cytoscape-cose-bilkent/-/cytoscape-cose-bilkent-4.1.0.tgz}
id: registry.npmmirror.com/cytoscape-cose-bilkent/4.1.0
name: cytoscape-cose-bilkent
version: 4.1.0
peerDependencies:
cytoscape: ^3.2.0
dependencies:
cose-base: registry.npmmirror.com/cose-base@1.0.3
cytoscape: registry.npmmirror.com/cytoscape@3.25.0
dev: false
registry.npmmirror.com/cytoscape-fcose@2.2.0(cytoscape@3.25.0):
resolution: {integrity: sha512-ki1/VuRIHFCzxWNrsshHYPs6L7TvLu3DL+TyIGEsRcvVERmxokbf5Gdk7mFxZnTdiGtnA4cfSmjZJMviqSuZrQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/cytoscape-fcose/-/cytoscape-fcose-2.2.0.tgz}
id: registry.npmmirror.com/cytoscape-fcose/2.2.0
name: cytoscape-fcose
version: 2.2.0
peerDependencies:
cytoscape: ^3.2.0
dependencies:
cose-base: registry.npmmirror.com/cose-base@2.2.0
cytoscape: registry.npmmirror.com/cytoscape@3.25.0
dev: false
registry.npmmirror.com/cytoscape@3.25.0:
resolution: {integrity: sha512-7MW3Iz57mCUo6JQCho6CmPBCbTlJr7LzyEtIkutG255HLVd4XuBg2I9BkTZLI/e4HoaOB/BiAzXuQybQ95+r9Q==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/cytoscape/-/cytoscape-3.25.0.tgz}
name: cytoscape
version: 3.25.0
engines: {node: '>=0.10'}
dependencies:
heap: registry.npmmirror.com/heap@0.2.7
lodash: registry.npmmirror.com/lodash@4.17.21
dev: false
registry.npmmirror.com/d3-array@3.2.4:
@@ -6293,12 +6327,6 @@ packages:
internmap: registry.npmmirror.com/internmap@2.0.3
dev: false
registry.npmmirror.com/d3-axis@1.0.12:
resolution: {integrity: sha512-ejINPfPSNdGFKEOAtnBtdkpr24c4d4jsei6Lg98mxf424ivoDP2956/5HDpIAtmHo85lqT4pruy+zEgvRUBqaQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-axis/-/d3-axis-1.0.12.tgz}
name: d3-axis
version: 1.0.12
dev: false
registry.npmmirror.com/d3-axis@3.0.0:
resolution: {integrity: sha512-IH5tgjV4jE/GhHkRV0HiVYPDtvfjHQlQfJHs0usq7M30XcSBvOotpmH1IgkcXsO/5gEQZD43B//fc7SRT5S+xw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-axis/-/d3-axis-3.0.0.tgz}
name: d3-axis
@@ -6306,18 +6334,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-brush@1.1.6:
resolution: {integrity: sha512-7RW+w7HfMCPyZLifTz/UnJmI5kdkXtpCbombUSs8xniAyo0vIbrDzDwUJB6eJOgl9u5DQOt2TQlYumxzD1SvYA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-brush/-/d3-brush-1.1.6.tgz}
name: d3-brush
version: 1.1.6
dependencies:
d3-dispatch: registry.npmmirror.com/d3-dispatch@1.0.6
d3-drag: registry.npmmirror.com/d3-drag@1.2.5
d3-interpolate: registry.npmmirror.com/d3-interpolate@1.4.0
d3-selection: registry.npmmirror.com/d3-selection@1.4.2
d3-transition: registry.npmmirror.com/d3-transition@1.3.2
dev: false
registry.npmmirror.com/d3-brush@3.0.0:
resolution: {integrity: sha512-ALnjWlVYkXsVIGlOsuWH1+3udkYFI48Ljihfnh8FZPF2QS9o+PzGLBslO0PjzVoHLZ2KCVgAM8NVkXPJB2aNnQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-brush/-/d3-brush-3.0.0.tgz}
name: d3-brush
@@ -6331,15 +6347,6 @@ packages:
d3-transition: registry.npmmirror.com/d3-transition@3.0.1(d3-selection@3.0.0)
dev: false
registry.npmmirror.com/d3-chord@1.0.6:
resolution: {integrity: sha512-JXA2Dro1Fxw9rJe33Uv+Ckr5IrAa74TlfDEhE/jfLOaXegMQFQTAgAw9WnZL8+HxVBRXaRGCkrNU7pJeylRIuA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-chord/-/d3-chord-1.0.6.tgz}
name: d3-chord
version: 1.0.6
dependencies:
d3-array: registry.npmmirror.com/d3-array@1.2.4
d3-path: registry.npmmirror.com/d3-path@1.0.9
dev: false
registry.npmmirror.com/d3-chord@3.0.1:
resolution: {integrity: sha512-VE5S6TNa+j8msksl7HwjxMHDM2yNK3XCkusIlpX5kwauBfXuyLAtNg9jCp/iHH61tgI4sb6R/EIMWCqEIdjT/g==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-chord/-/d3-chord-3.0.1.tgz}
name: d3-chord
@@ -6349,18 +6356,6 @@ packages:
d3-path: registry.npmmirror.com/d3-path@3.1.0
dev: false
registry.npmmirror.com/d3-collection@1.0.7:
resolution: {integrity: sha512-ii0/r5f4sjKNTfh84Di+DpztYwqKhEyUlKoPrzUFfeSkWxjW49xU2QzO9qrPrNkpdI0XJkfzvmTu8V2Zylln6A==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-collection/-/d3-collection-1.0.7.tgz}
name: d3-collection
version: 1.0.7
dev: false
registry.npmmirror.com/d3-color@1.4.1:
resolution: {integrity: sha512-p2sTHSLCJI2QKunbGb7ocOh7DgTAn8IrLx21QRc/BSnodXM4sv6aLQlnfpvehFMLZEfBc6g9pH9SWQccFYfJ9Q==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-color/-/d3-color-1.4.1.tgz}
name: d3-color
version: 1.4.1
dev: false
registry.npmmirror.com/d3-color@3.1.0:
resolution: {integrity: sha512-zg/chbXyeBtMQ1LbD/WSoW2DpC3I0mpmPdW+ynRTj/x2DAWYrIY7qeZIHidozwV24m4iavr15lNwIwLxRmOxhA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-color/-/d3-color-3.1.0.tgz}
name: d3-color
@@ -6368,14 +6363,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-contour@1.3.2:
resolution: {integrity: sha512-hoPp4K/rJCu0ladiH6zmJUEz6+u3lgR+GSm/QdM2BBvDraU39Vr7YdDCicJcxP1z8i9B/2dJLgDC1NcvlF8WCg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-contour/-/d3-contour-1.3.2.tgz}
name: d3-contour
version: 1.3.2
dependencies:
d3-array: registry.npmmirror.com/d3-array@1.2.4
dev: false
registry.npmmirror.com/d3-contour@4.0.2:
resolution: {integrity: sha512-4EzFTRIikzs47RGmdxbeUvLWtGedDUNkTcmzoeyg4sP/dvCexO47AaQL7VKy/gul85TOxw+IBgA8US2xwbToNA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-contour/-/d3-contour-4.0.2.tgz}
name: d3-contour
@@ -6394,12 +6381,6 @@ packages:
delaunator: registry.npmmirror.com/delaunator@5.0.0
dev: false
registry.npmmirror.com/d3-dispatch@1.0.6:
resolution: {integrity: sha512-fVjoElzjhCEy+Hbn8KygnmMS7Or0a9sI2UzGwoB7cCtvI1XpVN9GpoYlnb3xt2YV66oXYb1fLJ8GMvP4hdU1RA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-dispatch/-/d3-dispatch-1.0.6.tgz}
name: d3-dispatch
version: 1.0.6
dev: false
registry.npmmirror.com/d3-dispatch@3.0.1:
resolution: {integrity: sha512-rzUyPU/S7rwUflMyLc1ETDeBj0NRuHKKAcvukozwhshr6g6c5d8zh4c2gQjY2bZ0dXeGLWc1PF174P2tVvKhfg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-dispatch/-/d3-dispatch-3.0.1.tgz}
name: d3-dispatch
@@ -6407,15 +6388,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-drag@1.2.5:
resolution: {integrity: sha512-rD1ohlkKQwMZYkQlYVCrSFxsWPzI97+W+PaEIBNTMxRuxz9RF0Hi5nJWHGVJ3Om9d2fRTe1yOBINJyy/ahV95w==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-drag/-/d3-drag-1.2.5.tgz}
name: d3-drag
version: 1.2.5
dependencies:
d3-dispatch: registry.npmmirror.com/d3-dispatch@1.0.6
d3-selection: registry.npmmirror.com/d3-selection@1.4.2
dev: false
registry.npmmirror.com/d3-drag@3.0.0:
resolution: {integrity: sha512-pWbUJLdETVA8lQNJecMxoXfH6x+mO2UQo8rSmZ+QqxcbyA3hfeprFgIT//HW2nlHChWeIIMwS2Fq+gEARkhTkg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-drag/-/d3-drag-3.0.0.tgz}
name: d3-drag
@@ -6426,17 +6398,6 @@ packages:
d3-selection: registry.npmmirror.com/d3-selection@3.0.0
dev: false
registry.npmmirror.com/d3-dsv@1.2.0:
resolution: {integrity: sha512-9yVlqvZcSOMhCYzniHE7EVUws7Fa1zgw+/EAV2BxJoG3ME19V6BQFBwI855XQDsxyOuG7NibqRMTtiF/Qup46g==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-dsv/-/d3-dsv-1.2.0.tgz}
name: d3-dsv
version: 1.2.0
hasBin: true
dependencies:
commander: registry.npmmirror.com/commander@2.20.3
iconv-lite: registry.npmmirror.com/iconv-lite@0.4.24
rw: registry.npmmirror.com/rw@1.3.3
dev: false
registry.npmmirror.com/d3-dsv@3.0.1:
resolution: {integrity: sha512-UG6OvdI5afDIFP9w4G0mNq50dSOsXHJaRE8arAS5o9ApWnIElp8GZw1Dun8vP8OyHOZ/QJUKUJwxiiCCnUwm+Q==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-dsv/-/d3-dsv-3.0.1.tgz}
name: d3-dsv
@@ -6449,12 +6410,6 @@ packages:
rw: registry.npmmirror.com/rw@1.3.3
dev: false
registry.npmmirror.com/d3-ease@1.0.7:
resolution: {integrity: sha512-lx14ZPYkhNx0s/2HX5sLFUI3mbasHjSSpwO/KaaNACweVwxUruKyWVcb293wMv1RqTPZyZ8kSZ2NogUZNcLOFQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-ease/-/d3-ease-1.0.7.tgz}
name: d3-ease
version: 1.0.7
dev: false
registry.npmmirror.com/d3-ease@3.0.1:
resolution: {integrity: sha512-wR/XK3D3XcLIZwpbvQwQ5fK+8Ykds1ip7A2Txe0yxncXSdq1L9skcG7blcedkOX+ZcgxGAmLX1FrRGbADwzi0w==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-ease/-/d3-ease-3.0.1.tgz}
name: d3-ease
@@ -6462,14 +6417,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-fetch@1.2.0:
resolution: {integrity: sha512-yC78NBVcd2zFAyR/HnUiBS7Lf6inSCoWcSxFfw8FYL7ydiqe80SazNwoffcqOfs95XaLo7yebsmQqDKSsXUtvA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-fetch/-/d3-fetch-1.2.0.tgz}
name: d3-fetch
version: 1.2.0
dependencies:
d3-dsv: registry.npmmirror.com/d3-dsv@1.2.0
dev: false
registry.npmmirror.com/d3-fetch@3.0.1:
resolution: {integrity: sha512-kpkQIM20n3oLVBKGg6oHrUchHM3xODkTzjMoj7aWQFq5QEM+R6E4WkzT5+tojDY7yjez8KgCBRoj4aEr99Fdqw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-fetch/-/d3-fetch-3.0.1.tgz}
name: d3-fetch
@@ -6479,17 +6426,6 @@ packages:
d3-dsv: registry.npmmirror.com/d3-dsv@3.0.1
dev: false
registry.npmmirror.com/d3-force@1.2.1:
resolution: {integrity: sha512-HHvehyaiUlVo5CxBJ0yF/xny4xoaxFxDnBXNvNcfW9adORGZfyNF1dj6DGLKyk4Yh3brP/1h3rnDzdIAwL08zg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-force/-/d3-force-1.2.1.tgz}
name: d3-force
version: 1.2.1
dependencies:
d3-collection: registry.npmmirror.com/d3-collection@1.0.7
d3-dispatch: registry.npmmirror.com/d3-dispatch@1.0.6
d3-quadtree: registry.npmmirror.com/d3-quadtree@1.0.7
d3-timer: registry.npmmirror.com/d3-timer@1.0.10
dev: false
registry.npmmirror.com/d3-force@3.0.0:
resolution: {integrity: sha512-zxV/SsA+U4yte8051P4ECydjD/S+qeYtnaIyAs9tgHCqfguma/aAQDjo85A9Z6EKhBirHRJHXIgJUlffT4wdLg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-force/-/d3-force-3.0.0.tgz}
name: d3-force
@@ -6501,12 +6437,6 @@ packages:
d3-timer: registry.npmmirror.com/d3-timer@3.0.1
dev: false
registry.npmmirror.com/d3-format@1.4.5:
resolution: {integrity: sha512-J0piedu6Z8iB6TbIGfZgDzfXxUFN3qQRMofy2oPdXzQibYGqPB/9iMcxr/TGalU+2RsyDO+U4f33id8tbnSRMQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-format/-/d3-format-1.4.5.tgz}
name: d3-format
version: 1.4.5
dev: false
registry.npmmirror.com/d3-format@3.1.0:
resolution: {integrity: sha512-YyUI6AEuY/Wpt8KWLgZHsIU86atmikuoOmCfommt0LYHiQSPjvX2AcFc38PX0CBpr2RCyZhjex+NS/LPOv6YqA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-format/-/d3-format-3.1.0.tgz}
name: d3-format
@@ -6514,14 +6444,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-geo@1.12.1:
resolution: {integrity: sha512-XG4d1c/UJSEX9NfU02KwBL6BYPj8YKHxgBEw5om2ZnTRSbIcego6dhHwcxuSR3clxh0EpE38os1DVPOmnYtTPg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-geo/-/d3-geo-1.12.1.tgz}
name: d3-geo
version: 1.12.1
dependencies:
d3-array: registry.npmmirror.com/d3-array@1.2.4
dev: false
registry.npmmirror.com/d3-geo@3.1.0:
resolution: {integrity: sha512-JEo5HxXDdDYXCaWdwLRt79y7giK8SbhZJbFWXqbRTolCHFI5jRqteLzCsq51NKbUoX0PjBVSohxrx+NoOUujYA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-geo/-/d3-geo-3.1.0.tgz}
name: d3-geo
@@ -6531,12 +6453,6 @@ packages:
d3-array: registry.npmmirror.com/d3-array@3.2.4
dev: false
registry.npmmirror.com/d3-hierarchy@1.1.9:
resolution: {integrity: sha512-j8tPxlqh1srJHAtxfvOUwKNYJkQuBFdM1+JAUfq6xqH5eAqf93L7oG1NVqDa4CpFZNvnNKtCYEUC8KY9yEn9lQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-hierarchy/-/d3-hierarchy-1.1.9.tgz}
name: d3-hierarchy
version: 1.1.9
dev: false
registry.npmmirror.com/d3-hierarchy@3.1.2:
resolution: {integrity: sha512-FX/9frcub54beBdugHjDCdikxThEqjnR93Qt7PvQTOHxyiNCAlvMrHhclk3cD5VeAaq9fxmfRp+CnWw9rEMBuA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-hierarchy/-/d3-hierarchy-3.1.2.tgz}
name: d3-hierarchy
@@ -6544,14 +6460,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-interpolate@1.4.0:
resolution: {integrity: sha512-V9znK0zc3jOPV4VD2zZn0sDhZU3WAE2bmlxdIwwQPPzPjvyLkd8B3JUVdS1IDUFDkWZ72c9qnv1GK2ZagTZ8EA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-interpolate/-/d3-interpolate-1.4.0.tgz}
name: d3-interpolate
version: 1.4.0
dependencies:
d3-color: registry.npmmirror.com/d3-color@1.4.1
dev: false
registry.npmmirror.com/d3-interpolate@3.0.1:
resolution: {integrity: sha512-3bYs1rOD33uo8aqJfKP3JWPAibgw8Zm2+L9vBKEHJ2Rg+viTR7o5Mmv5mZcieN+FRYaAOWX5SJATX6k1PWz72g==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-interpolate/-/d3-interpolate-3.0.1.tgz}
name: d3-interpolate
@@ -6561,12 +6469,6 @@ packages:
d3-color: registry.npmmirror.com/d3-color@3.1.0
dev: false
registry.npmmirror.com/d3-path@1.0.9:
resolution: {integrity: sha512-VLaYcn81dtHVTjEHd8B+pbe9yHWpXKZUC87PzoFmsFrJqgFwDe/qxfp5MlfsfM1V5E/iVt0MmEbWQ7FVIXh/bg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-path/-/d3-path-1.0.9.tgz}
name: d3-path
version: 1.0.9
dev: false
registry.npmmirror.com/d3-path@3.1.0:
resolution: {integrity: sha512-p3KP5HCf/bvjBSSKuXid6Zqijx7wIfNW+J/maPs+iwR35at5JCbLUT0LzF1cnjbCHWhqzQTIN2Jpe8pRebIEFQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-path/-/d3-path-3.1.0.tgz}
name: d3-path
@@ -6574,12 +6476,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-polygon@1.0.6:
resolution: {integrity: sha512-k+RF7WvI08PC8reEoXa/w2nSg5AUMTi+peBD9cmFc+0ixHfbs4QmxxkarVal1IkVkgxVuk9JSHhJURHiyHKAuQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-polygon/-/d3-polygon-1.0.6.tgz}
name: d3-polygon
version: 1.0.6
dev: false
registry.npmmirror.com/d3-polygon@3.0.1:
resolution: {integrity: sha512-3vbA7vXYwfe1SYhED++fPUQlWSYTTGmFmQiany/gdbiWgU/iEyQzyymwL9SkJjFFuCS4902BSzewVGsHHmHtXg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-polygon/-/d3-polygon-3.0.1.tgz}
name: d3-polygon
@@ -6587,12 +6483,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-quadtree@1.0.7:
resolution: {integrity: sha512-RKPAeXnkC59IDGD0Wu5mANy0Q2V28L+fNe65pOCXVdVuTJS3WPKaJlFHer32Rbh9gIo9qMuJXio8ra4+YmIymA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-quadtree/-/d3-quadtree-1.0.7.tgz}
name: d3-quadtree
version: 1.0.7
dev: false
registry.npmmirror.com/d3-quadtree@3.0.1:
resolution: {integrity: sha512-04xDrxQTDTCFwP5H6hRhsRcb9xxv2RzkcsygFzmkSIOJy3PeRJP7sNk3VRIbKXcog561P9oU0/rVH6vDROAgUw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-quadtree/-/d3-quadtree-3.0.1.tgz}
name: d3-quadtree
@@ -6600,12 +6490,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-random@1.1.2:
resolution: {integrity: sha512-6AK5BNpIFqP+cx/sreKzNjWbwZQCSUatxq+pPRmFIQaWuoD+NrbVWw7YWpHiXpCQ/NanKdtGDuB+VQcZDaEmYQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-random/-/d3-random-1.1.2.tgz}
name: d3-random
version: 1.1.2
dev: false
registry.npmmirror.com/d3-random@3.0.1:
resolution: {integrity: sha512-FXMe9GfxTxqd5D6jFsQ+DJ8BJS4E/fT5mqqdjovykEB2oFbTMDVdg1MGFxfQW+FBOGoB++k8swBrgwSHT1cUXQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-random/-/d3-random-3.0.1.tgz}
name: d3-random
@@ -6613,15 +6497,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-scale-chromatic@1.5.0:
resolution: {integrity: sha512-ACcL46DYImpRFMBcpk9HhtIyC7bTBR4fNOPxwVSl0LfulDAwyiHyPOTqcDG1+t5d4P9W7t/2NAuWu59aKko/cg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-scale-chromatic/-/d3-scale-chromatic-1.5.0.tgz}
name: d3-scale-chromatic
version: 1.5.0
dependencies:
d3-color: registry.npmmirror.com/d3-color@1.4.1
d3-interpolate: registry.npmmirror.com/d3-interpolate@1.4.0
dev: false
registry.npmmirror.com/d3-scale-chromatic@3.0.0:
resolution: {integrity: sha512-Lx9thtxAKrO2Pq6OO2Ua474opeziKr279P/TKZsMAhYyNDD3EnCffdbgeSYN5O7m2ByQsxtuP2CSDczNUIZ22g==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-scale-chromatic/-/d3-scale-chromatic-3.0.0.tgz}
name: d3-scale-chromatic
@@ -6632,19 +6507,6 @@ packages:
d3-interpolate: registry.npmmirror.com/d3-interpolate@3.0.1
dev: false
registry.npmmirror.com/d3-scale@2.2.2:
resolution: {integrity: sha512-LbeEvGgIb8UMcAa0EATLNX0lelKWGYDQiPdHj+gLblGVhGLyNbaCn3EvrJf0A3Y/uOOU5aD6MTh5ZFCdEwGiCw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-scale/-/d3-scale-2.2.2.tgz}
name: d3-scale
version: 2.2.2
dependencies:
d3-array: registry.npmmirror.com/d3-array@1.2.4
d3-collection: registry.npmmirror.com/d3-collection@1.0.7
d3-format: registry.npmmirror.com/d3-format@1.4.5
d3-interpolate: registry.npmmirror.com/d3-interpolate@1.4.0
d3-time: registry.npmmirror.com/d3-time@1.1.0
d3-time-format: registry.npmmirror.com/d3-time-format@2.3.0
dev: false
registry.npmmirror.com/d3-scale@4.0.2:
resolution: {integrity: sha512-GZW464g1SH7ag3Y7hXjf8RoUuAFIqklOAq3MRl4OaWabTFJY9PN/E1YklhXLh+OQ3fM9yS2nOkCoS+WLZ6kvxQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-scale/-/d3-scale-4.0.2.tgz}
name: d3-scale
@@ -6658,12 +6520,6 @@ packages:
d3-time-format: registry.npmmirror.com/d3-time-format@4.1.0
dev: false
registry.npmmirror.com/d3-selection@1.4.2:
resolution: {integrity: sha512-SJ0BqYihzOjDnnlfyeHT0e30k0K1+5sR3d5fNueCNeuhZTnGw4M4o8mqJchSwgKMXCNFo+e2VTChiSJ0vYtXkg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-selection/-/d3-selection-1.4.2.tgz}
name: d3-selection
version: 1.4.2
dev: false
registry.npmmirror.com/d3-selection@3.0.0:
resolution: {integrity: sha512-fmTRWbNMmsmWq6xJV8D19U/gw/bwrHfNXxrIN+HfZgnzqTHp9jOmKMhsTUjXOJnZOdZY9Q28y4yebKzqDKlxlQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-selection/-/d3-selection-3.0.0.tgz}
name: d3-selection
@@ -6671,14 +6527,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-shape@1.3.7:
resolution: {integrity: sha512-EUkvKjqPFUAZyOlhY5gzCxCeI0Aep04LwIRpsZ/mLFelJiUfnK56jo5JMDSE7yyP2kLSb6LtF+S5chMk7uqPqw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-shape/-/d3-shape-1.3.7.tgz}
name: d3-shape
version: 1.3.7
dependencies:
d3-path: registry.npmmirror.com/d3-path@1.0.9
dev: false
registry.npmmirror.com/d3-shape@3.2.0:
resolution: {integrity: sha512-SaLBuwGm3MOViRq2ABk3eLoxwZELpH6zhl3FbAoJ7Vm1gofKx6El1Ib5z23NUEhF9AsGl7y+dzLe5Cw2AArGTA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-shape/-/d3-shape-3.2.0.tgz}
name: d3-shape
@@ -6688,14 +6536,6 @@ packages:
d3-path: registry.npmmirror.com/d3-path@3.1.0
dev: false
registry.npmmirror.com/d3-time-format@2.3.0:
resolution: {integrity: sha512-guv6b2H37s2Uq/GefleCDtbe0XZAuy7Wa49VGkPVPMfLL9qObgBST3lEHJBMUp8S7NdLQAGIvr2KXk8Hc98iKQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-time-format/-/d3-time-format-2.3.0.tgz}
name: d3-time-format
version: 2.3.0
dependencies:
d3-time: registry.npmmirror.com/d3-time@1.1.0
dev: false
registry.npmmirror.com/d3-time-format@4.1.0:
resolution: {integrity: sha512-dJxPBlzC7NugB2PDLwo9Q8JiTR3M3e4/XANkreKSUxF8vvXKqm1Yfq4Q5dl8budlunRVlUUaDUgFt7eA8D6NLg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-time-format/-/d3-time-format-4.1.0.tgz}
name: d3-time-format
@@ -6705,12 +6545,6 @@ packages:
d3-time: registry.npmmirror.com/d3-time@3.1.0
dev: false
registry.npmmirror.com/d3-time@1.1.0:
resolution: {integrity: sha512-Xh0isrZ5rPYYdqhAVk8VLnMEidhz5aP7htAADH6MfzgmmicPkTo8LhkLxci61/lCB7n7UmE3bN0leRt+qvkLxA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-time/-/d3-time-1.1.0.tgz}
name: d3-time
version: 1.1.0
dev: false
registry.npmmirror.com/d3-time@3.1.0:
resolution: {integrity: sha512-VqKjzBLejbSMT4IgbmVgDjpkYrNWUYJnbCGo874u7MMKIWsILRX+OpX/gTk8MqjpT1A/c6HY2dCA77ZN0lkQ2Q==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-time/-/d3-time-3.1.0.tgz}
name: d3-time
@@ -6720,12 +6554,6 @@ packages:
d3-array: registry.npmmirror.com/d3-array@3.2.4
dev: false
registry.npmmirror.com/d3-timer@1.0.10:
resolution: {integrity: sha512-B1JDm0XDaQC+uvo4DT79H0XmBskgS3l6Ve+1SBCfxgmtIb1AVrPIoqd+nPSv+loMX8szQ0sVUhGngL7D5QPiXw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-timer/-/d3-timer-1.0.10.tgz}
name: d3-timer
version: 1.0.10
dev: false
registry.npmmirror.com/d3-timer@3.0.1:
resolution: {integrity: sha512-ndfJ/JxxMd3nw31uyKoY2naivF+r29V+Lc0svZxe1JvvIRmi8hUsrMvdOwgS1o6uBHmiz91geQ0ylPP0aj1VUA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-timer/-/d3-timer-3.0.1.tgz}
name: d3-timer
@@ -6733,19 +6561,6 @@ packages:
engines: {node: '>=12'}
dev: false
registry.npmmirror.com/d3-transition@1.3.2:
resolution: {integrity: sha512-sc0gRU4PFqZ47lPVHloMn9tlPcv8jxgOQg+0zjhfZXMQuvppjG6YuwdMBE0TuqCZjeJkLecku/l9R0JPcRhaDA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-transition/-/d3-transition-1.3.2.tgz}
name: d3-transition
version: 1.3.2
dependencies:
d3-color: registry.npmmirror.com/d3-color@1.4.1
d3-dispatch: registry.npmmirror.com/d3-dispatch@1.0.6
d3-ease: registry.npmmirror.com/d3-ease@1.0.7
d3-interpolate: registry.npmmirror.com/d3-interpolate@1.4.0
d3-selection: registry.npmmirror.com/d3-selection@1.4.2
d3-timer: registry.npmmirror.com/d3-timer@1.0.10
dev: false
registry.npmmirror.com/d3-transition@3.0.1(d3-selection@3.0.0):
resolution: {integrity: sha512-ApKvfjsSR6tg06xrL434C0WydLr7JewBB3V+/39RMHsaXTOG0zmt/OAXeng5M5LBm0ojmxJrpomQVZ1aPvBL4w==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-transition/-/d3-transition-3.0.1.tgz}
id: registry.npmmirror.com/d3-transition/3.0.1
@@ -6763,24 +6578,6 @@ packages:
d3-timer: registry.npmmirror.com/d3-timer@3.0.1
dev: false
registry.npmmirror.com/d3-voronoi@1.1.4:
resolution: {integrity: sha512-dArJ32hchFsrQ8uMiTBLq256MpnZjeuBtdHpaDlYuQyjU0CVzCJl/BVW+SkszaAeH95D/8gxqAhgx0ouAWAfRg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-voronoi/-/d3-voronoi-1.1.4.tgz}
name: d3-voronoi
version: 1.1.4
dev: false
registry.npmmirror.com/d3-zoom@1.8.3:
resolution: {integrity: sha512-VoLXTK4wvy1a0JpH2Il+F2CiOhVu7VRXWF5M/LroMIh3/zBAC3WAt7QoIvPibOavVo20hN6/37vwAsdBejLyKQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-zoom/-/d3-zoom-1.8.3.tgz}
name: d3-zoom
version: 1.8.3
dependencies:
d3-dispatch: registry.npmmirror.com/d3-dispatch@1.0.6
d3-drag: registry.npmmirror.com/d3-drag@1.2.5
d3-interpolate: registry.npmmirror.com/d3-interpolate@1.4.0
d3-selection: registry.npmmirror.com/d3-selection@1.4.2
d3-transition: registry.npmmirror.com/d3-transition@1.3.2
dev: false
registry.npmmirror.com/d3-zoom@3.0.0:
resolution: {integrity: sha512-b8AmV3kfQaqWAuacbPuNbL6vahnOJflOhexLzMMNLga62+/nh0JzvJ0aO/5a5MVgUFGS7Hu1P9P03o3fJkDCyw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3-zoom/-/d3-zoom-3.0.0.tgz}
name: d3-zoom
@@ -6794,44 +6591,6 @@ packages:
d3-transition: registry.npmmirror.com/d3-transition@3.0.1(d3-selection@3.0.0)
dev: false
registry.npmmirror.com/d3@5.16.0:
resolution: {integrity: sha512-4PL5hHaHwX4m7Zr1UapXW23apo6pexCgdetdJ5kTmADpG/7T9Gkxw0M0tf/pjoB63ezCCm0u5UaFYy2aMt0Mcw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3/-/d3-5.16.0.tgz}
name: d3
version: 5.16.0
dependencies:
d3-array: registry.npmmirror.com/d3-array@1.2.4
d3-axis: registry.npmmirror.com/d3-axis@1.0.12
d3-brush: registry.npmmirror.com/d3-brush@1.1.6
d3-chord: registry.npmmirror.com/d3-chord@1.0.6
d3-collection: registry.npmmirror.com/d3-collection@1.0.7
d3-color: registry.npmmirror.com/d3-color@1.4.1
d3-contour: registry.npmmirror.com/d3-contour@1.3.2
d3-dispatch: registry.npmmirror.com/d3-dispatch@1.0.6
d3-drag: registry.npmmirror.com/d3-drag@1.2.5
d3-dsv: registry.npmmirror.com/d3-dsv@1.2.0
d3-ease: registry.npmmirror.com/d3-ease@1.0.7
d3-fetch: registry.npmmirror.com/d3-fetch@1.2.0
d3-force: registry.npmmirror.com/d3-force@1.2.1
d3-format: registry.npmmirror.com/d3-format@1.4.5
d3-geo: registry.npmmirror.com/d3-geo@1.12.1
d3-hierarchy: registry.npmmirror.com/d3-hierarchy@1.1.9
d3-interpolate: registry.npmmirror.com/d3-interpolate@1.4.0
d3-path: registry.npmmirror.com/d3-path@1.0.9
d3-polygon: registry.npmmirror.com/d3-polygon@1.0.6
d3-quadtree: registry.npmmirror.com/d3-quadtree@1.0.7
d3-random: registry.npmmirror.com/d3-random@1.1.2
d3-scale: registry.npmmirror.com/d3-scale@2.2.2
d3-scale-chromatic: registry.npmmirror.com/d3-scale-chromatic@1.5.0
d3-selection: registry.npmmirror.com/d3-selection@1.4.2
d3-shape: registry.npmmirror.com/d3-shape@1.3.7
d3-time: registry.npmmirror.com/d3-time@1.1.0
d3-time-format: registry.npmmirror.com/d3-time-format@2.3.0
d3-timer: registry.npmmirror.com/d3-timer@1.0.10
d3-transition: registry.npmmirror.com/d3-transition@1.3.2
d3-voronoi: registry.npmmirror.com/d3-voronoi@1.1.4
d3-zoom: registry.npmmirror.com/d3-zoom@1.8.3
dev: false
registry.npmmirror.com/d3@7.8.5:
resolution: {integrity: sha512-JgoahDG51ncUfJu6wX/1vWQEqOflgXyl4MaHqlcSruTez7yhaRKR9i8VjjcQGeS2en/jnFivXuaIMnseMMt0XA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/d3/-/d3-7.8.5.tgz}
name: d3
@@ -6870,24 +6629,13 @@ packages:
d3-zoom: registry.npmmirror.com/d3-zoom@3.0.0
dev: false
registry.npmmirror.com/dagre-d3@0.6.4:
resolution: {integrity: sha512-e/6jXeCP7/ptlAM48clmX4xTZc5Ek6T6kagS7Oz2HrYSdqcLZFLqpAfh7ldbZRFfxCZVyh61NEPR08UQRVxJzQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/dagre-d3/-/dagre-d3-0.6.4.tgz}
name: dagre-d3
version: 0.6.4
registry.npmmirror.com/dagre-d3-es@7.0.10:
resolution: {integrity: sha512-qTCQmEhcynucuaZgY5/+ti3X/rnszKZhEQH/ZdWdtP1tA/y3VoHJzcVrO9pjjJCNpigfscAtoUB5ONcd2wNn0A==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/dagre-d3-es/-/dagre-d3-es-7.0.10.tgz}
name: dagre-d3-es
version: 7.0.10
dependencies:
d3: registry.npmmirror.com/d3@5.16.0
dagre: registry.npmmirror.com/dagre@0.8.5
graphlib: registry.npmmirror.com/graphlib@2.1.8
lodash: registry.npmmirror.com/lodash@4.17.21
dev: false
registry.npmmirror.com/dagre@0.8.5:
resolution: {integrity: sha512-/aTqmnRta7x7MCCpExk7HQL2O4owCT2h8NT//9I1OQ9vt29Pa0BzSAkR5lwFUcQ7491yVi/3CXU9jQ5o0Mn2Sw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/dagre/-/dagre-0.8.5.tgz}
name: dagre
version: 0.8.5
dependencies:
graphlib: registry.npmmirror.com/graphlib@2.1.8
lodash: registry.npmmirror.com/lodash@4.17.21
d3: registry.npmmirror.com/d3@7.8.5
lodash-es: registry.npmmirror.com/lodash-es@4.17.21
dev: false
registry.npmmirror.com/damerau-levenshtein@1.0.8:
@@ -7185,10 +6933,10 @@ packages:
domelementtype: registry.npmmirror.com/domelementtype@2.3.0
dev: true
registry.npmmirror.com/dompurify@2.3.3:
resolution: {integrity: sha512-dqnqRkPMAjOZE0FogZ+ceJNM2dZ3V/yNOuFB7+39qpO93hHhfRpHw3heYQC7DPK9FqbQTfBKUJhiSfz4MvXYwg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/dompurify/-/dompurify-2.3.3.tgz}
registry.npmmirror.com/dompurify@3.0.3:
resolution: {integrity: sha512-axQ9zieHLnAnHh0sfAamKYiqXMJAVwu+LM/alQ7WDagoWessyWvMSFyW65CqF3owufNu8HBcE4cM2Vflu7YWcQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/dompurify/-/dompurify-3.0.3.tgz}
name: dompurify
version: 2.3.3
version: 3.0.3
dev: false
registry.npmmirror.com/domutils@2.8.0:
@@ -7228,6 +6976,12 @@ packages:
name: electron-to-chromium
version: 1.4.425
registry.npmmirror.com/elkjs@0.8.2:
resolution: {integrity: sha512-L6uRgvZTH+4OF5NE/MBbzQx/WYpru1xCBE9respNj6qznEewGUIfhzmm7horWWxbNO2M0WckQypGctR8lH79xQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/elkjs/-/elkjs-0.8.2.tgz}
name: elkjs
version: 0.8.2
dev: false
registry.npmmirror.com/emoji-regex@9.2.2:
resolution: {integrity: sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/emoji-regex/-/emoji-regex-9.2.2.tgz}
name: emoji-regex
@@ -7753,13 +7507,6 @@ packages:
version: 2.0.3
engines: {node: '>=0.10.0'}
registry.npmmirror.com/eventsource-parser@0.1.0:
resolution: {integrity: sha512-M9QjFtEIkwytUarnx113HGmgtk52LSn3jNAtnWKi3V+b9rqSfQeVdLsaD5AG/O4IrGQwmAAHBIsqbmURPTd2rA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/eventsource-parser/-/eventsource-parser-0.1.0.tgz}
name: eventsource-parser
version: 0.1.0
engines: {node: '>=14.18'}
dev: false
registry.npmmirror.com/execa@5.1.1:
resolution: {integrity: sha512-8uSpZZocAZRBAPIEINJj3Lo9HyGitllczc27Eh5YYojjMFMn8yHMDMaUHE2Jqfq05D/wucwI4JGURyXt1vchyg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/execa/-/execa-5.1.1.tgz}
name: execa
@@ -8253,20 +8000,6 @@ packages:
version: 1.0.4
dev: true
registry.npmmirror.com/graphemer@1.4.0:
resolution: {integrity: sha512-EtKwoO6kxCL9WO5xipiHTZlSzBm7WLT627TqC/uVRd0HKmq8NXyebnNYxDoBi7wt8eTWrUrKXCOVaFq9x1kgag==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/graphemer/-/graphemer-1.4.0.tgz}
name: graphemer
version: 1.4.0
dev: false
registry.npmmirror.com/graphlib@2.1.8:
resolution: {integrity: sha512-jcLLfkpoVGmH7/InMC/1hIvOPSUh38oJtGhvrOFGzioE1DZ+0YW16RgmOJhHiuWTvGiJQ9Z1Ik43JvkRPRvE+A==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/graphlib/-/graphlib-2.1.8.tgz}
name: graphlib
version: 2.1.8
dependencies:
lodash: registry.npmmirror.com/lodash@4.17.21
dev: false
registry.npmmirror.com/has-bigints@1.0.2:
resolution: {integrity: sha512-tSvCKtBr9lkF0Ex0aQiP9N+OpV4zi2r/Nee5VkRDbaqv35RLYMzbwQfFSZZH0kR+Rd6302UJZ2p/bJCEoR3VoQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/has-bigints/-/has-bigints-1.0.2.tgz}
name: has-bigints
@@ -8401,6 +8134,12 @@ packages:
space-separated-tokens: registry.npmmirror.com/space-separated-tokens@2.0.2
dev: false
registry.npmmirror.com/heap@0.2.7:
resolution: {integrity: sha512-2bsegYkkHO+h/9MGbn6KWcE45cHZgPANo5LXF7EvWdT0yT2EguSVO1nDgU5c8+ZOPwp2vMNa7YFsJhVcDR9Sdg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/heap/-/heap-0.2.7.tgz}
name: heap
version: 0.2.7
dev: false
registry.npmmirror.com/hexoid@1.0.0:
resolution: {integrity: sha512-QFLV0taWQOZtvIRIAdBChesmogZrtuXvVWsFHZTk2SU+anspqZ2vMnoLg7IE1+Uk16N19APic1BuF8bC8c2m5g==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/hexoid/-/hexoid-1.0.0.tgz}
name: hexoid
@@ -9122,10 +8861,10 @@ packages:
commander: registry.npmmirror.com/commander@8.3.0
dev: false
registry.npmmirror.com/khroma@1.4.1:
resolution: {integrity: sha512-+GmxKvmiRuCcUYDgR7g5Ngo0JEDeOsGdNONdU2zsiBQaK4z19Y2NvXqfEDE0ZiIrg45GTZyAnPLVsLZZACYm3Q==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/khroma/-/khroma-1.4.1.tgz}
registry.npmmirror.com/khroma@2.0.0:
resolution: {integrity: sha512-2J8rDNlQWbtiNYThZRvmMv5yt44ZakX+Tz5ZIp/mN1pt4snn+m030Va5Z4v8xA0cQFDXBwO/8i42xL4QPsVk3g==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/khroma/-/khroma-2.0.0.tgz}
name: khroma
version: 1.4.1
version: 2.0.0
dev: false
registry.npmmirror.com/kitx@2.1.0:
@@ -9157,6 +8896,18 @@ packages:
language-subtag-registry: registry.npmmirror.com/language-subtag-registry@0.3.22
dev: true
registry.npmmirror.com/layout-base@1.0.2:
resolution: {integrity: sha512-8h2oVEZNktL4BH2JCOI90iD1yXwL6iNW7KcCKT2QZgQJR2vbqDsldCTPRU9NifTCqHZci57XvQQ15YTu+sTYPg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/layout-base/-/layout-base-1.0.2.tgz}
name: layout-base
version: 1.0.2
dev: false
registry.npmmirror.com/layout-base@2.0.1:
resolution: {integrity: sha512-dp3s92+uNI1hWIpPGH3jK2kxE2lMjdXdr+DH8ynZHpd6PUlH6x6cbuXnoMmiNumznqaNO31xu9e79F0uuZ0JFg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/layout-base/-/layout-base-2.0.1.tgz}
name: layout-base
version: 2.0.1
dev: false
registry.npmmirror.com/levn@0.3.0:
resolution: {integrity: sha512-0OO4y2iOHix2W6ujICbKIaEQXvFQHue65vUG3pb5EUomzPI90z9hsA1VsO/dbIIpC53J8gxM9Q4Oho0jrCM/yA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/levn/-/levn-0.3.0.tgz}
name: levn
@@ -9199,6 +8950,12 @@ packages:
p-locate: registry.npmmirror.com/p-locate@5.0.0
dev: true
registry.npmmirror.com/lodash-es@4.17.21:
resolution: {integrity: sha512-mKnC+QJ9pWVzv+C4/U3rRsHapFfHvQFoFB92e52xeyGMcX6/OlIl78je1u8vePzYZSkkogMPJ2yjxxsb89cxyw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/lodash-es/-/lodash-es-4.17.21.tgz}
name: lodash-es
version: 4.17.21
dev: false
registry.npmmirror.com/lodash.debounce@4.0.8:
resolution: {integrity: sha512-FT1yDzDYEoYWhnSGnpE/4Kj1fLZkDFyqRb7fNt6FdYOSxlUWAtp42Eh6Wb0rGIv/m9Bgo7x4GhQbm5Ys4SG5ow==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/lodash.debounce/-/lodash.debounce-4.0.8.tgz}
name: lodash.debounce
@@ -9414,6 +9171,15 @@ packages:
mdast-util-to-markdown: registry.npmmirror.com/mdast-util-to-markdown@1.5.0
dev: false
registry.npmmirror.com/mdast-util-newline-to-break@1.0.0:
resolution: {integrity: sha512-491LcYv3gbGhhCrLoeALncQmega2xPh+m3gbsIhVsOX4sw85+ShLFPvPyibxc1Swx/6GtzxgVodq+cGa/47ULg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/mdast-util-newline-to-break/-/mdast-util-newline-to-break-1.0.0.tgz}
name: mdast-util-newline-to-break
version: 1.0.0
dependencies:
'@types/mdast': registry.npmmirror.com/@types/mdast@3.0.11
mdast-util-find-and-replace: registry.npmmirror.com/mdast-util-find-and-replace@2.2.2
dev: false
registry.npmmirror.com/mdast-util-phrasing@3.0.1:
resolution: {integrity: sha512-WmI1gTXUBJo4/ZmSk79Wcb2HcjPJBzM1nlI/OUWA8yk2X9ik3ffNbBGsU+09BFmXaL1IBb9fiuvq6/KMiNycSg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/mdast-util-phrasing/-/mdast-util-phrasing-3.0.1.tgz}
name: mdast-util-phrasing
@@ -9487,20 +9253,30 @@ packages:
engines: {node: '>= 8'}
dev: true
registry.npmmirror.com/mermaid@8.13.5:
resolution: {integrity: sha512-xLINkCQqZZfqDaLpQVy9BOsws8jT6sLBE2ympDEg4G2uvUu1n61j/h3OFDaA2N4dpZyN7q2pAYkDQ4yywruivA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/mermaid/-/mermaid-8.13.5.tgz}
registry.npmmirror.com/mermaid@10.2.3:
resolution: {integrity: sha512-cMVE5s9PlQvOwfORkyVpr5beMsLdInrycAosdr+tpZ0WFjG4RJ/bUHST7aTgHNJbujHkdBRAm+N50P3puQOfPw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/mermaid/-/mermaid-10.2.3.tgz}
name: mermaid
version: 8.13.5
version: 10.2.3
dependencies:
'@braintree/sanitize-url': registry.npmmirror.com/@braintree/sanitize-url@3.1.0
'@braintree/sanitize-url': registry.npmmirror.com/@braintree/sanitize-url@6.0.2
cytoscape: registry.npmmirror.com/cytoscape@3.25.0
cytoscape-cose-bilkent: registry.npmmirror.com/cytoscape-cose-bilkent@4.1.0(cytoscape@3.25.0)
cytoscape-fcose: registry.npmmirror.com/cytoscape-fcose@2.2.0(cytoscape@3.25.0)
d3: registry.npmmirror.com/d3@7.8.5
dagre: registry.npmmirror.com/dagre@0.8.5
dagre-d3: registry.npmmirror.com/dagre-d3@0.6.4
dompurify: registry.npmmirror.com/dompurify@2.3.3
graphlib: registry.npmmirror.com/graphlib@2.1.8
khroma: registry.npmmirror.com/khroma@1.4.1
moment-mini: registry.npmmirror.com/moment-mini@2.29.4
dagre-d3-es: registry.npmmirror.com/dagre-d3-es@7.0.10
dayjs: registry.npmmirror.com/dayjs@1.11.7
dompurify: registry.npmmirror.com/dompurify@3.0.3
elkjs: registry.npmmirror.com/elkjs@0.8.2
khroma: registry.npmmirror.com/khroma@2.0.0
lodash-es: registry.npmmirror.com/lodash-es@4.17.21
mdast-util-from-markdown: registry.npmmirror.com/mdast-util-from-markdown@1.3.1
non-layered-tidy-tree-layout: registry.npmmirror.com/non-layered-tidy-tree-layout@2.0.2
stylis: registry.npmmirror.com/stylis@4.2.0
ts-dedent: registry.npmmirror.com/ts-dedent@2.2.0
uuid: registry.npmmirror.com/uuid@9.0.0
web-worker: registry.npmmirror.com/web-worker@1.2.0
transitivePeerDependencies:
- supports-color
dev: false
registry.npmmirror.com/micromark-core-commonmark@1.1.0:
@@ -9890,12 +9666,6 @@ packages:
minimist: registry.npmmirror.com/minimist@1.2.8
dev: false
registry.npmmirror.com/moment-mini@2.29.4:
resolution: {integrity: sha512-uhXpYwHFeiTbY9KSgPPRoo1nt8OxNVdMVoTBYHfSEKeRkIkwGpO+gERmhuhBtzfaeOyTkykSrm2+noJBgqt3Hg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/moment-mini/-/moment-mini-2.29.4.tgz}
name: moment-mini
version: 2.29.4
dev: false
registry.npmmirror.com/mongodb-connection-string-url@2.6.0:
resolution: {integrity: sha512-WvTZlI9ab0QYtTYnuMLgobULWhokRjtC7db9LtcVfJ+Hsnyr5eo6ZtNAt3Ly24XZScGMelOcGtm7lSn0332tPQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/mongodb-connection-string-url/-/mongodb-connection-string-url-2.6.0.tgz}
name: mongodb-connection-string-url
@@ -10091,6 +9861,12 @@ packages:
engines: {node: '>=6.0.0'}
dev: false
registry.npmmirror.com/non-layered-tidy-tree-layout@2.0.2:
resolution: {integrity: sha512-gkXMxRzUH+PB0ax9dUN0yYF0S25BqeAYqhgMaLUFmpXLEk7Fcu8f4emJuOAY0V8kjDICxROIKsTAKsV/v355xw==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/non-layered-tidy-tree-layout/-/non-layered-tidy-tree-layout-2.0.2.tgz}
name: non-layered-tidy-tree-layout
version: 2.0.2
dev: false
registry.npmmirror.com/normalize-path@3.0.0:
resolution: {integrity: sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/normalize-path/-/normalize-path-3.0.0.tgz}
name: normalize-path
@@ -10249,10 +10025,10 @@ packages:
is-wsl: registry.npmmirror.com/is-wsl@2.2.0
dev: true
registry.npmmirror.com/openai@3.2.1:
resolution: {integrity: sha512-762C9BNlJPbjjlWZi4WYK9iM2tAVAv0uUp1UmI34vb0CN5T2mjB/qM6RYBmNKMh/dN9fC+bxqPwWJZUTWW052A==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/openai/-/openai-3.2.1.tgz}
registry.npmmirror.com/openai@3.3.0:
resolution: {integrity: sha512-uqxI/Au+aPRnsaQRe8CojU0eCR7I0mBiKjD3sNMzY6DaC1ZVrc85u98mtJW6voDug8fgGN+DIZmTDxTthxb7dQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/openai/-/openai-3.3.0.tgz}
name: openai
version: 3.2.1
version: 3.3.0
dependencies:
axios: registry.npmmirror.com/axios@0.26.1
form-data: registry.npmmirror.com/form-data@4.0.0
@@ -10808,11 +10584,11 @@ packages:
version: 18.2.0
dev: false
registry.npmmirror.com/react-markdown@8.0.5(@types/react@18.0.28)(react@18.2.0):
resolution: {integrity: sha512-jGJolWWmOWAvzf+xMdB9zwStViODyyFQhNB/bwCerbBKmrTmgmA599CGiOlP58OId1IMoIRsA8UdI1Lod4zb5A==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/react-markdown/-/react-markdown-8.0.5.tgz}
id: registry.npmmirror.com/react-markdown/8.0.5
registry.npmmirror.com/react-markdown@8.0.7(@types/react@18.0.28)(react@18.2.0):
resolution: {integrity: sha512-bvWbzG4MtOU62XqBx3Xx+zB2raaFFsq4mYiAzfjXJMEz2sixgeAfraA3tvzULF02ZdOMUOKTBFFaZJDDrq+BJQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/react-markdown/-/react-markdown-8.0.7.tgz}
id: registry.npmmirror.com/react-markdown/8.0.7
name: react-markdown
version: 8.0.5
version: 8.0.7
peerDependencies:
'@types/react': '>=16'
react: '>=16'
@@ -11069,6 +10845,16 @@ packages:
unified: registry.npmmirror.com/unified@10.1.2
dev: false
registry.npmmirror.com/remark-breaks@3.0.3:
resolution: {integrity: sha512-C7VkvcUp1TPUc2eAYzsPdaUh8Xj4FSbQnYA5A9f80diApLZscTDeG7efiWP65W8hV2sEy3JuGVU0i6qr5D8Hug==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/remark-breaks/-/remark-breaks-3.0.3.tgz}
name: remark-breaks
version: 3.0.3
dependencies:
'@types/mdast': registry.npmmirror.com/@types/mdast@3.0.11
mdast-util-newline-to-break: registry.npmmirror.com/mdast-util-newline-to-break@1.0.0
unified: registry.npmmirror.com/unified@10.1.2
dev: false
registry.npmmirror.com/remark-gfm@3.0.1:
resolution: {integrity: sha512-lEFDoi2PICJyNrACFOfDD3JlLkuSbOa5Wd8EPt06HUdptv8Gn0bxYTdbU/XXQ3swAPkEaGxxPN9cbnMHvVu1Ig==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/remark-gfm/-/remark-gfm-3.0.1.tgz}
name: remark-gfm
@@ -11777,6 +11563,13 @@ packages:
version: 2.1.0
dev: false
registry.npmmirror.com/ts-dedent@2.2.0:
resolution: {integrity: sha512-q5W7tVM71e2xjHZTlgfTDoPF/SmqKG5hddq9SzR49CH2hayqRKJtQ4mtRlSxKaJlR/+9rEM+mnBHf7I2/BQcpQ==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/ts-dedent/-/ts-dedent-2.2.0.tgz}
name: ts-dedent
version: 2.2.0
engines: {node: '>=6.10'}
dev: false
registry.npmmirror.com/tsconfig-paths@3.14.2:
resolution: {integrity: sha512-o/9iXgCYc5L/JxCHPe3Hvh8Q/2xm5Z+p18PESBU6Ff33695QnCHBEjcytY2q19ua7Mbl/DavtBOLq+oG0RCL+g==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/tsconfig-paths/-/tsconfig-paths-3.14.2.tgz}
name: tsconfig-paths
@@ -12156,6 +11949,13 @@ packages:
dev: false
optional: true
registry.npmmirror.com/uuid@9.0.0:
resolution: {integrity: sha512-MXcSTerfPa4uqyzStbRoTgt5XIe3x5+42+q1sDuy3R5MDk66URdLMOZe5aPX/SQd+kuYAh0FdP/pO28IkQyTeg==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/uuid/-/uuid-9.0.0.tgz}
name: uuid
version: 9.0.0
hasBin: true
dev: false
registry.npmmirror.com/uvu@0.5.6:
resolution: {integrity: sha512-+g8ENReyr8YsOc6fv/NVJs2vFdHBnBNdfE49rshrTzDWOlUx4Gq7KOS2GD8eqhy2j+Ejq29+SbKH8yjkAqXqoA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/uvu/-/uvu-0.5.6.tgz}
name: uvu
@@ -12222,6 +12022,12 @@ packages:
version: 2.0.1
dev: false
registry.npmmirror.com/web-worker@1.2.0:
resolution: {integrity: sha512-PgF341avzqyx60neE9DD+XS26MMNMoUQRz9NOZwW32nPQrF6p77f1htcnjBSEV8BGMKZ16choqUG4hyI0Hx7mA==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/web-worker/-/web-worker-1.2.0.tgz}
name: web-worker
version: 1.2.0
dev: false
registry.npmmirror.com/webidl-conversions@7.0.0:
resolution: {integrity: sha512-VwddBukDzu71offAQR975unBIGqfKZpM+8ZX6ySk8nYhVoo5CYaZyzt3YBvYtRtO+aoGlqxPg/B87NGVZ/fu6g==, registry: https://registry.npm.taobao.org/, tarball: https://registry.npmmirror.com/webidl-conversions/-/webidl-conversions-7.0.0.tgz}
name: webidl-conversions

View File

@@ -7,10 +7,10 @@
| Billing item | Price: CNY / 1K tokens (context included) |
| --- | --- |
| Knowledge base - indexing | 0.001 |
| chatgpt - chat | 0.022 |
| chatgpt16K - chat | 0.025 |
| gpt4 - chat | 0.5 |
| File splitting | 0.025 |
| chatgpt - chat | 0.015 |
| chatgpt16K - chat | 0.015 |
| gpt4 - chat | 0.1 |
| File splitting | 0.015 |
**Other questions**
| Community group | Assistant |

View File

@@ -19,10 +19,10 @@ The FastGpt project is fully open source and can be freely self-hosted, removing platform-risk concerns
| Billing item | Price: CNY / 1K tokens (context included) |
| --- | --- |
| Knowledge base - indexing | 0.001 |
| chatgpt - chat | 0.022 |
| chatgpt16K - chat | 0.025 |
| gpt4 - chat | 0.5 |
| File splitting | 0.025 |
| chatgpt - chat | 0.015 |
| chatgpt16K - chat | 0.015 |
| gpt4 - chat | 0.1 |
| File splitting | 0.015 |
### Community group / Feedback
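The table prices are linear per 1K tokens over the full context (prompt plus completion), so the implied cost calculation can be sketched as below. The helper name and rounding behavior are illustrative assumptions, not FastGpt's actual billing code:

```typescript
// Estimate chat cost from the price table: price is CNY per 1K tokens,
// counted over the whole context (prompt + completion).
function estimateCostCNY(totalTokens: number, pricePer1kCNY: number): number {
  return (totalTokens / 1000) * pricePer1kCNY;
}

// e.g. a chatgpt conversation consuming 2,400 tokens at the discounted 0.015 CNY/1K:
const cost = estimateCostCNY(2400, 0.015); // ≈ 0.036 CNY
```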

View File

@@ -1,5 +1,7 @@
### Fast GPT V3.8.4
### Fast GPT V3.9
1. Added - mermaid diagram support; try the 'mermaid 导图' (mermaid diagram) app in the app marketplace
2. Improved - parts of the UI and the account page
2. Improved - knowledge base search speed
1. Limited-time promotion with lower-priced tokens
2. Added - direct chunked training, with adjustable chunk size
3. Improved - token counting performance.
4. Improved - key pool management: integrates with the one-api project for more convenient key pool management; see [Deploy FastGpt with Docker](https://github.com/c121914yu/FastGPT/blob/main/docs/deploy/docker.md)
5. Added - V2 OpenAPI: FastGpt apps can be used directly in any third-party ChatGPT wrapper project. Note: directly, without changing any code. See ["Using FastGpt in third-party apps" in the API docs](https://kjqvjse66l.feishu.cn/docx/DmLedTWtUoNGX8xui9ocdUEjnNh)
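For item 5, a third-party client only needs to point an OpenAI-style chat request at FastGpt. A hedged sketch of assembling such a request follows; the endpoint path matches the client code in this change, but the accepted payload fields and the Bearer-token header are illustrative assumptions, not the official API contract:

```typescript
// Build an OpenAI-compatible chat request targeting FastGpt's completions endpoint.
// Field names beyond `messages`/`stream` are assumptions for illustration.
function buildChatRequest(apiKey: string, messages: { role: string; content: string }[]) {
  return {
    url: '/api/openapi/v1/chat/completions',
    init: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${apiKey}` // hypothetical auth scheme
      },
      body: JSON.stringify({ messages, stream: true })
    }
  };
}

const req = buildChatRequest('sk-xxx', [{ role: 'user', content: 'hi' }]);
```

An existing ChatGPT wrapper would pass `req.url` and `req.init` straight to `fetch`, which is what "no code changes" amounts to in practice.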

View File

@@ -0,0 +1,8 @@
var _hmt = _hmt || [];
(function () {
const hm = document.createElement('script');
hm.src = 'https://hm.baidu.com/hm.js?a5357e9dab086658bac0b6faf148882e';
const s = document.getElementsByTagName('script')[0];
s.parentNode.insertBefore(hm, s);
})();

View File

@@ -1,67 +1,103 @@
import { GUIDE_PROMPT_HEADER, NEW_CHATID_HEADER, QUOTE_LEN_HEADER } from '@/constants/chat';
import { Props, ChatResponseType } from '@/pages/api/openapi/v1/chat/completions';
import { sseResponseEventEnum } from '@/constants/chat';
import { getErrText } from '@/utils/tools';
import { parseStreamChunk } from '@/utils/adapt';
interface StreamFetchProps {
url: string;
data: any;
data: Props;
onMessage: (text: string) => void;
abortSignal: AbortController;
}
export const streamFetch = ({ url, data, onMessage, abortSignal }: StreamFetchProps) =>
new Promise<{
responseText: string;
newChatId: string;
systemPrompt: string;
quoteLen: number;
}>(async (resolve, reject) => {
try {
const res = await fetch(url, {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify(data),
signal: abortSignal.signal
});
const reader = res.body?.getReader();
if (!reader) return;
export const streamFetch = ({ data, onMessage, abortSignal }: StreamFetchProps) =>
new Promise<ChatResponseType & { responseText: string; errMsg: string }>(
async (resolve, reject) => {
try {
const response = await window.fetch('/api/openapi/v1/chat/completions', {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
signal: abortSignal.signal,
body: JSON.stringify({
...data,
stream: true
})
});
const decoder = new TextDecoder();
const newChatId = decodeURIComponent(res.headers.get(NEW_CHATID_HEADER) || '');
const systemPrompt = decodeURIComponent(res.headers.get(GUIDE_PROMPT_HEADER) || '').trim();
const quoteLen = res.headers.get(QUOTE_LEN_HEADER)
? Number(res.headers.get(QUOTE_LEN_HEADER))
: 0;
let responseText = '';
const read = async () => {
try {
const { done, value } = await reader?.read();
if (done) {
if (res.status === 200) {
resolve({ responseText, newChatId, quoteLen, systemPrompt });
} else {
const parseError = JSON.parse(responseText);
reject(parseError?.message || '请求异常');
}
return;
}
const text = decoder.decode(value);
responseText += text;
onMessage(text);
read();
} catch (err: any) {
if (err?.message === 'The user aborted a request.') {
return resolve({ responseText, newChatId, quoteLen, systemPrompt });
}
reject(typeof err === 'string' ? err : err?.message || '请求异常');
if (response.status !== 200) {
const err = await response.json();
return reject(err);
}
};
read();
} catch (err: any) {
console.log(err, '====');
reject(typeof err === 'string' ? err : err?.message || '请求异常');
if (!response?.body) {
throw new Error('Request Error');
}
const reader = response.body?.getReader();
// response data
let responseText = '';
let newChatId = '';
let quoteLen = 0;
let errMsg = '';
const read = async () => {
try {
const { done, value } = await reader.read();
if (done) {
if (response.status === 200) {
return resolve({
responseText,
newChatId,
quoteLen,
errMsg
});
} else {
return reject('响应过程出现异常~');
}
}
const chunkResponse = parseStreamChunk(value);
chunkResponse.forEach((item) => {
// parse json data
const data = (() => {
try {
return JSON.parse(item.data);
} catch (error) {
return item.data;
}
})();
if (item.event === sseResponseEventEnum.answer && data !== '[DONE]') {
const answer: string = data?.choices?.[0].delta.content || '';
onMessage(answer);
responseText += answer;
} else if (item.event === sseResponseEventEnum.chatResponse) {
const chatResponse = data as ChatResponseType;
newChatId = chatResponse.newChatId;
quoteLen = chatResponse.quoteLen || 0;
} else if (item.event === sseResponseEventEnum.error) {
errMsg = getErrText(data, '流响应错误');
}
});
read();
} catch (err: any) {
if (err?.message === 'The user aborted a request.') {
return resolve({
responseText,
newChatId,
quoteLen,
errMsg
});
}
reject(getErrText(err, '请求异常'));
}
};
read();
} catch (err: any) {
console.log(err);
reject(getErrText(err, '请求异常'));
}
}
});
);
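The rewritten streamFetch above delegates SSE framing to `parseStreamChunk` from `@/utils/adapt`. A minimal self-contained sketch of what such a parser might do is shown below; the field handling is simplified (one `event:` and one `data:` line per blank-line-separated record) and the real implementation may differ:

```typescript
type SseItem = { event: string; data: string };

// Split one decoded network chunk into SSE { event, data } records.
// Assumes records are separated by a blank line -- a simplification
// that ignores records split across chunk boundaries.
function parseStreamChunk(value: Uint8Array): SseItem[] {
  const text = new TextDecoder().decode(value);
  return text
    .split('\n\n')
    .filter((block) => block.trim().length > 0)
    .map((block) => {
      let event = '';
      let data = '';
      for (const line of block.split('\n')) {
        if (line.startsWith('event:')) event = line.slice('event:'.length).trim();
        else if (line.startsWith('data:')) data = line.slice('data:'.length).trim();
      }
      return { event, data };
    });
}
```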

View File

@@ -4,6 +4,8 @@ import type { ChatItemType } from '@/types/chat';
export interface InitChatResponse {
chatId: string;
modelId: string;
systemPrompt?: string;
limitPrompt?: string;
model: {
name: string;
avatar: string;

View File

@@ -287,8 +287,7 @@ const CodeLight = ({
children,
className,
inline,
match,
...props
match
}: {
children: React.ReactNode & React.ReactNode[];
className?: string;
@@ -315,18 +314,14 @@ const CodeLight = ({
<Box ml={1}></Box>
</Flex>
</Flex>
<SyntaxHighlighter style={codeLight as any} language={match?.[1]} PreTag="pre" {...props}>
<SyntaxHighlighter style={codeLight as any} language={match?.[1]} PreTag="pre">
{String(children)}
</SyntaxHighlighter>
</Box>
);
}
return (
<code className={className} {...props}>
{children}
</code>
);
return <code className={className}>{children}</code>;
};
export default React.memo(CodeLight);

View File

@@ -0,0 +1,18 @@
import React from 'react';
import { Box } from '@chakra-ui/react';
const regex = /((http|https|ftp):\/\/[^\s\u4e00-\u9fa5\u3000-\u303f\uff00-\uffef]+)/gi;
const Link = (props: { href?: string; children?: React.ReactNode[] }) => {
const decText = decodeURIComponent(props.href || '');
const replaceText = decText.replace(regex, (match, p1) => {
const text = decText === props.children?.[0] ? p1 : props.children?.[0];
const isInternal = /^\/#/i.test(p1);
const target = isInternal ? '_self' : '_blank';
return `<a href="${p1}" target="${target}">${text}</a>`;
});
return <Box as={'span'} dangerouslySetInnerHTML={{ __html: replaceText }} />;
};
export default React.memo(Link);
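The URL-to-anchor replacement in this component can be sketched as a plain function, assuming the same URL regex (the function name here is illustrative):

```typescript
// Matches http/https/ftp URLs, stopping at whitespace or CJK punctuation,
// mirroring the regex used by the Link component above.
const urlRegex = /((http|https|ftp):\/\/[^\s\u4e00-\u9fa5\u3000-\u303f\uff00-\uffef]+)/gi;

// Turn bare URLs in a text fragment into anchor tags. Internal links
// (starting with "/#") open in the same tab, external ones in a new tab.
function linkify(text: string): string {
  return text.replace(urlRegex, (p1) => {
    const isInternal = /^\/#/i.test(p1);
    const target = isInternal ? '_self' : '_blank';
    return `<a href="${p1}" target="${target}">${p1}</a>`;
  });
}
```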

View File

@@ -1,7 +1,7 @@
import React, { useState } from 'react';
import { Image, Skeleton } from '@chakra-ui/react';
const MdImage = ({ src }: { src: string }) => {
const MdImage = ({ src }: { src?: string }) => {
const [isLoading, setIsLoading] = useState(true);
const [succeed, setSucceed] = useState(false);
return (

View File

@@ -1,15 +1,16 @@
import React, { useEffect, useRef, memo, useCallback, useState } from 'react';
import React, { useEffect, useRef, memo, useCallback, useState, useMemo } from 'react';
import { Box } from '@chakra-ui/react';
// @ts-ignore
import mermaid from 'mermaid';
import MyIcon from '../Icon';
import styles from './index.module.scss';
import MyIcon from '../../Icon';
const mermaidAPI = mermaid.mermaidAPI;
mermaidAPI.initialize({
startOnLoad: false,
startOnLoad: true,
theme: 'base',
flowchart: {
useMaxWidth: false
},
themeVariables: {
fontSize: '14px',
primaryColor: '#d6e8ff',
@@ -21,52 +22,53 @@ mermaidAPI.initialize({
}
});
const punctuationMap: Record<string, string> = {
'，': ',',
'；': ';',
'。': '.',
'：': ':',
'！': '!',
'？': '?',
'“': '"',
'”': '"',
'‘': "'",
'’': "'",
'【': '[',
'】': ']',
'（': '(',
'）': ')',
'《': '<',
'》': '>',
'、': ','
};
const MermaidBlock = ({ code }: { code: string }) => {
const dom = useRef<HTMLDivElement>(null);
const ref = useRef<HTMLDivElement>(null);
const [svg, setSvg] = useState('');
const [errorSvgCode, setErrorSvgCode] = useState('');
useEffect(() => {
(async () => {
const punctuationMap: Record<string, string> = {
'，': ',',
'；': ';',
'。': '.',
'：': ':',
'！': '!',
'？': '?',
'“': '"',
'”': '"',
'‘': "'",
'’': "'",
'【': '[',
'】': ']',
'（': '(',
'）': ')',
'《': '<',
'》': '>',
'、': ','
};
const formatCode = code.replace(
/([，；。：！？“”‘’【】（）《》、])/g,
(match) => punctuationMap[match]
);
if (!code) return;
try {
const svgCode = await mermaidAPI.render(`mermaid-${Date.now()}`, formatCode);
setSvg(svgCode);
} catch (error) {
setErrorSvgCode(formatCode);
console.log(error);
const formatCode = code.replace(
new RegExp(`[${Object.keys(punctuationMap).join('')}]`, 'g'),
(match) => punctuationMap[match]
);
const { svg } = await mermaid.render(`mermaid-${Date.now()}`, formatCode);
setSvg(svg);
} catch (e: any) {
console.log('[Mermaid] ', e?.message);
}
})();
}, [code]);
const onclickExport = useCallback(() => {
const svg = dom.current?.children[0];
const svg = ref.current?.children[0];
if (!svg) return;
const w = svg.clientWidth * 4;
const h = svg.clientHeight * 4;
const rate = svg.clientHeight / svg.clientWidth;
const w = 3000;
const h = rate * w;
const canvas = document.createElement('canvas');
canvas.width = w;
@@ -78,7 +80,7 @@ const MermaidBlock = ({ code }: { code: string }) => {
ctx.fillRect(0, 0, w, h);
const img = new Image();
img.src = `data:image/svg+xml;charset=utf-8,${encodeURIComponent(dom.current.innerHTML)}`;
img.src = `data:image/svg+xml;charset=utf-8,${encodeURIComponent(ref.current.innerHTML)}`;
img.onload = () => {
ctx.drawImage(img, 0, 0, w, h);
@@ -97,17 +99,25 @@ const MermaidBlock = ({ code }: { code: string }) => {
}, []);
return (
<Box position={'relative'}>
<Box
position={'relative'}
_hover={{
'& > .export': {
display: 'block'
}
}}
>
<Box
ref={dom}
as={'p'}
className={styles.mermaid}
overflowX={'auto'}
ref={ref}
minW={'100px'}
minH={'50px'}
py={4}
dangerouslySetInnerHTML={{ __html: svg }}
/>
<MyIcon
className="export"
display={'none'}
name={'export'}
w={'20px'}
position={'absolute'}
@@ -124,4 +134,4 @@ const MermaidBlock = ({ code }: { code: string }) => {
);
};
export default memo(MermaidBlock);
export default MermaidBlock;
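The full-width punctuation normalization this component applies before rendering can be isolated into a small helper; the map below mirrors the component's, and the function name is illustrative:

```typescript
// Full-width CJK punctuation mapped to the ASCII characters mermaid expects.
const punctuationMap: Record<string, string> = {
  '，': ',', '；': ';', '。': '.', '：': ':', '！': '!', '？': '?',
  '“': '"', '”': '"', '‘': "'", '’': "'", '【': '[', '】': ']',
  '（': '(', '）': ')', '《': '<', '》': '>', '、': ','
};

// Build one character class from the map keys and replace each
// full-width character with its ASCII counterpart, so mermaid can
// parse diagrams typed with a Chinese input method.
function normalizePunctuation(code: string): string {
  return code.replace(
    new RegExp(`[${Object.keys(punctuationMap).join('')}]`, 'g'),
    (match) => punctuationMap[match]
  );
}
```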

View File

@@ -319,7 +319,6 @@
border: medium none;
margin: 0;
padding: 0;
white-space: pre;
}
.markdown .highlight pre,
.markdown pre {
@@ -345,10 +344,6 @@
word-break: break-all;
}
p {
white-space: pre-line;
}
pre {
display: block;
width: 100%;
@@ -419,9 +414,4 @@
.mermaid {
overflow-x: auto;
svg {
height: auto !important;
width: auto;
}
}

View File

@@ -1,65 +1,54 @@
import React, { memo, useMemo } from 'react';
import React from 'react';
import ReactMarkdown from 'react-markdown';
import { formatLinkText } from '@/utils/tools';
import remarkGfm from 'remark-gfm';
import remarkMath from 'remark-math';
import rehypeKatex from 'rehype-katex';
import RemarkGfm from 'remark-gfm';
import RemarkMath from 'remark-math';
import RehypeKatex from 'rehype-katex';
import RemarkBreaks from 'remark-breaks';
import 'katex/dist/katex.min.css';
import styles from './index.module.scss';
import CodeLight from './codeLight';
import Loading from './Loading';
import MermaidCodeBlock from './MermaidCodeBlock';
import MdImage from './Image';
const Markdown = ({
source,
isChatting = false,
formatLink
}: {
source: string;
formatLink?: boolean;
isChatting?: boolean;
}) => {
const formatSource = useMemo(() => {
return formatLink ? formatLinkText(source) : source;
}, [source, formatLink]);
import Link from './Link';
import CodeLight from './CodeLight';
import MermaidCodeBlock from './img/MermaidCodeBlock';
import MdImage from './img/Image';
function Code({ inline, className, children }: any) {
const match = /language-(\w+)/.exec(className || '');
if (match?.[1] === 'mermaid') {
return <MermaidCodeBlock code={String(children)} />;
}
return (
<CodeLight className={className} inline={inline} match={match}>
{children}
</CodeLight>
);
}
function Image({ src }: { src?: string }) {
return <MdImage src={src} />;
}
const Markdown = ({ source, isChatting = false }: { source: string; isChatting?: boolean }) => {
return (
<ReactMarkdown
className={`markdown ${styles.markdown}
${isChatting ? (source === '' ? styles.waitingAnimation : styles.animation) : ''}
`}
remarkPlugins={[remarkGfm, remarkMath]}
rehypePlugins={[rehypeKatex]}
${isChatting ? (source === '' ? styles.waitingAnimation : styles.animation) : ''}
`}
remarkPlugins={[RemarkGfm, RemarkMath, RemarkBreaks]}
rehypePlugins={[RehypeKatex]}
components={{
a: Link,
img: Image,
pre: 'div',
img({ src = '' }) {
return isChatting ? <Loading text="图片加载中..." /> : <MdImage src={src} />;
},
code({ node, inline, className, children, ...props }) {
const match = /language-(\w+)/.exec(className || '');
if (match?.[1] === 'mermaid') {
return isChatting ? (
<Loading text="导图加载中..." />
) : (
<MermaidCodeBlock code={String(children)} />
);
}
return (
<CodeLight className={className} inline={inline} match={match} {...props}>
{children}
</CodeLight>
);
}
code: Code
}}
linkTarget="_blank"
>
{formatSource}
{source}
</ReactMarkdown>
);
};
export default memo(Markdown);
export default Markdown;

View File

@@ -1,76 +0,0 @@
import React, { useRef, useEffect, useMemo } from 'react';
import type { BoxProps } from '@chakra-ui/react';
import { Box } from '@chakra-ui/react';
import { throttle } from 'lodash';
import { useLoading } from '@/hooks/useLoading';
interface Props extends BoxProps {
nextPage: () => void;
isLoadAll: boolean;
requesting: boolean;
children: React.ReactNode;
initRequesting?: boolean;
}
const ScrollData = ({
children,
nextPage,
isLoadAll,
requesting,
initRequesting,
...props
}: Props) => {
const { Loading } = useLoading({ defaultLoading: true });
const elementRef = useRef<HTMLDivElement>(null);
const loadText = useMemo(() => {
if (requesting) return '请求中……';
if (isLoadAll) return '已加载全部';
return '点击加载更多';
}, [isLoadAll, requesting]);
useEffect(() => {
if (!elementRef.current) return;
const scrolling = throttle((e: Event) => {
const element = e.target as HTMLDivElement;
if (!element) return;
// current scroll position
const scrollTop = element.scrollTop;
// visible height
const clientHeight = element.clientHeight;
// total content height
const scrollHeight = element.scrollHeight;
// check whether we have scrolled to the bottom
if (scrollTop + clientHeight + 100 >= scrollHeight) {
nextPage();
}
}, 100);
elementRef.current.addEventListener('scroll', scrolling);
return () => {
// eslint-disable-next-line react-hooks/exhaustive-deps
elementRef.current?.removeEventListener('scroll', scrolling);
};
}, [elementRef, nextPage]);
return (
<Box {...props} ref={elementRef} overflowY={'auto'} position={'relative'}>
{children}
<Box
mt={2}
fontSize={'xs'}
color={'blackAlpha.500'}
textAlign={'center'}
cursor={loadText === '点击加载更多' ? 'pointer' : 'default'}
onClick={() => {
if (loadText !== '点击加载更多') return;
nextPage();
}}
>
{loadText}
</Box>
{initRequesting && <Loading fixed={false} />}
</Box>
);
};
export default ScrollData;

View File

@@ -1,4 +1,4 @@
import React from 'react';
import React, { useRef } from 'react';
import { Menu, MenuButton, MenuList, MenuItem, Button, useDisclosure } from '@chakra-ui/react';
import type { ButtonProps } from '@chakra-ui/react';
import { ChevronDownIcon } from '@chakra-ui/icons';
@@ -13,6 +13,7 @@ interface Props extends ButtonProps {
}
const MySelect = ({ placeholder, value, width = 'auto', list, onchange, ...props }: Props) => {
const ref = useRef<HTMLDivElement>(null);
const menuItemStyles = {
borderRadius: 'sm',
py: 2,
@@ -26,8 +27,9 @@ const MySelect = ({ placeholder, value, width = 'auto', list, onchange, ...props
return (
<Menu autoSelect={false} onOpen={onOpen} onClose={onClose}>
<MenuButton as={'span'}>
<MenuButton style={{ width: '100%', position: 'relative' }} as={'span'}>
<Button
ref={ref}
width={width}
px={3}
variant={'base'}
@@ -47,9 +49,15 @@ const MySelect = ({ placeholder, value, width = 'auto', list, onchange, ...props
</Button>
</MenuButton>
<MenuList
minW={
Array.isArray(width) ? width.map((item) => `${item} !important`) : `${width} !important`
}
minW={(() => {
const w = ref.current?.clientWidth;
if (w) {
return `${w}px !important`;
}
return Array.isArray(width)
? width.map((item) => `${item} !important`)
: `${width} !important`;
})()}
p={'6px'}
border={'1px solid #fff'}
boxShadow={'0px 2px 4px rgba(161, 167, 179, 0.25), 0px 0px 1px rgba(121, 141, 159, 0.25);'}
@@ -78,4 +86,4 @@ const MySelect = ({ placeholder, value, width = 'auto', list, onchange, ...props
);
};
export default MySelect;
export default React.memo(MySelect);

View File

@@ -1,6 +1,8 @@
export const NEW_CHATID_HEADER = 'response-new-chat-id';
export const QUOTE_LEN_HEADER = 'response-quote-len';
export const GUIDE_PROMPT_HEADER = 'response-guide-prompt';
export enum sseResponseEventEnum {
error = 'error',
answer = 'answer',
chatResponse = 'chatResponse'
}
export enum ChatRoleEnum {
System = 'System',

View File

@@ -12,11 +12,8 @@ export enum OpenAiChatEnum {
'GPT4' = 'gpt-4',
'GPT432k' = 'gpt-4-32k'
}
export enum ClaudeEnum {
'Claude' = 'Claude'
}
export type ChatModelType = `${OpenAiChatEnum}` | `${ClaudeEnum}`;
export type ChatModelType = `${OpenAiChatEnum}`;
export type ChatModelItemType = {
chatModel: ChatModelType;
@@ -34,7 +31,7 @@ export const ChatModelMap = {
contextMaxToken: 4000,
systemMaxToken: 2400,
maxTemperature: 1.2,
price: 2.2
price: 1.5
},
[OpenAiChatEnum.GPT3516k]: {
chatModel: OpenAiChatEnum.GPT3516k,
@@ -42,7 +39,7 @@ export const ChatModelMap = {
contextMaxToken: 16000,
systemMaxToken: 8000,
maxTemperature: 1.2,
price: 2.5
price: 1.5
},
[OpenAiChatEnum.GPT4]: {
chatModel: OpenAiChatEnum.GPT4,
@@ -50,7 +47,7 @@ export const ChatModelMap = {
contextMaxToken: 8000,
systemMaxToken: 4000,
maxTemperature: 1.2,
price: 50
price: 10
},
[OpenAiChatEnum.GPT432k]: {
chatModel: OpenAiChatEnum.GPT432k,
@@ -59,14 +56,6 @@ export const ChatModelMap = {
systemMaxToken: 8000,
maxTemperature: 1.2,
price: 90
},
[ClaudeEnum.Claude]: {
chatModel: ClaudeEnum.Claude,
name: 'Claude(免费体验)',
contextMaxToken: 9000,
systemMaxToken: 2700,
maxTemperature: 1,
price: 0
}
};
@@ -93,7 +82,9 @@ export const defaultModel: ModelSchema = {
searchLimit: 5,
searchEmptyText: '',
systemPrompt: '',
limitPrompt: '',
temperature: 0,
maxToken: 4000,
chatModel: OpenAiChatEnum.GPT35
},
share: {

View File

@@ -1,21 +1,27 @@
import { useState, useCallback, useMemo, useEffect } from 'react';
import { useRef, useState, useCallback, useLayoutEffect, useMemo, useEffect } from 'react';
import type { PagingData } from '../types/index';
import { IconButton, Flex, Box, Input } from '@chakra-ui/react';
import { ArrowBackIcon, ArrowForwardIcon } from '@chakra-ui/icons';
import { useMutation } from '@tanstack/react-query';
import { useToast } from './useToast';
import { throttle } from 'lodash';
const thresholdVal = 100;
export const usePagination = <T = any,>({
api,
pageSize = 10,
params = {},
defaultRequest = true
defaultRequest = true,
type = 'button'
}: {
api: (data: any) => any;
pageSize?: number;
params?: Record<string, any>;
defaultRequest?: boolean;
type?: 'button' | 'scroll';
}) => {
const elementRef = useRef<HTMLDivElement>(null);
const { toast } = useToast();
const [pageNum, setPageNum] = useState(1);
const [total, setTotal] = useState(0);
@@ -108,6 +114,60 @@ export const usePagination = <T = any,>({
);
}, [isLoading, maxPage, mutate, pageNum]);
const ScrollData = useCallback(
({ children, ...props }: { children: React.ReactNode }) => {
const loadText = useMemo(() => {
if (isLoading) return '请求中……';
if (total <= data.length) return '已加载全部';
return '点击加载更多';
}, []);
return (
<Box {...props} ref={elementRef} overflow={'overlay'}>
{children}
<Box
mt={2}
fontSize={'xs'}
color={'blackAlpha.500'}
textAlign={'center'}
cursor={loadText === '点击加载更多' ? 'pointer' : 'default'}
onClick={() => {
if (loadText !== '点击加载更多') return;
mutate(pageNum + 1);
}}
>
{loadText}
</Box>
</Box>
);
},
[data.length, isLoading, mutate, pageNum, total]
);
useLayoutEffect(() => {
if (!elementRef.current || type !== 'scroll') return;
const scrolling = throttle((e: Event) => {
const element = e.target as HTMLDivElement;
if (!element) return;
// current scroll position
const scrollTop = element.scrollTop;
// visible height
const clientHeight = element.clientHeight;
// total content height
const scrollHeight = element.scrollHeight;
// check whether we have scrolled to the bottom
if (scrollTop + clientHeight + thresholdVal >= scrollHeight) {
mutate(pageNum + 1);
}
}, 100);
elementRef.current.addEventListener('scroll', scrolling);
return () => {
// eslint-disable-next-line react-hooks/exhaustive-deps
elementRef.current?.removeEventListener('scroll', scrolling);
};
}, [elementRef, mutate, pageNum, type]);
useEffect(() => {
defaultRequest && mutate(1);
}, []);
@@ -119,6 +179,7 @@ export const usePagination = <T = any,>({
data,
isLoading,
Pagination,
ScrollData,
getData: mutate
};
};
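The bottom-reached check in this hook's scroll handler reduces to a single predicate; a sketch with illustrative names, using the hook's `thresholdVal` of 100 as the default:

```typescript
// Trigger the next page once the user is within `threshold` pixels
// of the bottom of a scroll container.
function shouldLoadNextPage(
  scrollTop: number,    // current scroll offset
  clientHeight: number, // visible height
  scrollHeight: number, // total content height
  threshold = 100
): boolean {
  return scrollTop + clientHeight + threshold >= scrollHeight;
}
```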

View File

@@ -8,7 +8,7 @@ interface Props {
export function useScreen(data?: Props) {
const { defaultIsPc = false } = data || {};
const [isPc] = useMediaQuery('(min-width: 900px)', {
ssr: true,
ssr: false,
fallback: defaultIsPc
});

View File

@@ -31,7 +31,7 @@ const queryClient = new QueryClient({
function App({ Component, pageProps }: AppProps) {
const {
loadInitData,
initData: { googleVerKey }
initData: { googleVerKey, baiduTongji }
} = useGlobalStore();
useEffect(() => {
@@ -49,22 +49,19 @@ function App({ Component, pageProps }: AppProps) {
/>
<link rel="icon" href="/favicon.ico" />
</Head>
<Script src="/js/particles.js" strategy="lazyOnload"></Script>
<Script src="/js/qrcode.min.js" strategy="afterInteractive"></Script>
<Script src="/js/pdf.js" strategy="afterInteractive"></Script>
<Script src="/js/html2pdf.bundle.min.js" strategy="afterInteractive"></Script>
{baiduTongji && <Script src="/js/baidutongji.js" strategy="afterInteractive"></Script>}
{googleVerKey && (
<>
<Script
src={`https://www.recaptcha.net/recaptcha/api.js?render=${googleVerKey}`}
strategy="afterInteractive"
></Script>
<Script
src={`https://www.google.com/recaptcha/api.js?render=${googleVerKey}`}
strategy="afterInteractive"
></Script>
</>
)}
<Script src="/js/particles.js"></Script>
<QueryClientProvider client={queryClient}>
<ChakraProvider theme={theme}>
<ColorModeScript initialColorMode={theme.config.initialColorMode} />

View File

@@ -1,192 +0,0 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { connectToDatabase } from '@/service/mongo';
import { authChat } from '@/service/utils/auth';
import { modelServiceToolMap } from '@/service/utils/chat';
import { ChatItemType } from '@/types/chat';
import { jsonRes } from '@/service/response';
import { ChatModelMap } from '@/constants/model';
import { pushChatBill } from '@/service/events/pushBill';
import { resStreamResponse } from '@/service/utils/chat';
import { appKbSearch } from '../openapi/kb/appKbSearch';
import { ChatRoleEnum, QUOTE_LEN_HEADER, GUIDE_PROMPT_HEADER } from '@/constants/chat';
import { BillTypeEnum } from '@/constants/user';
import { sensitiveCheck } from '../openapi/text/sensitiveCheck';
import { NEW_CHATID_HEADER } from '@/constants/chat';
import { saveChat } from './saveChat';
import { Types } from 'mongoose';
/* send the prompt */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
res.on('close', () => {
res.end();
});
res.on('error', () => {
console.log('error: ', 'request error');
res.end();
});
try {
const { chatId, prompt, modelId } = req.body as {
prompt: [ChatItemType, ChatItemType];
modelId: string;
chatId?: string;
};
if (!modelId || !prompt) {
throw new Error('缺少参数');
}
await connectToDatabase();
let startTime = Date.now();
const { model, showModelDetail, content, userOpenAiKey, systemAuthKey, userId } =
await authChat({
modelId,
chatId,
req
});
const modelConstantsData = ChatModelMap[model.chat.chatModel];
const {
rawSearch = [],
userSystemPrompt = [],
quotePrompt = []
} = await (async () => {
// a knowledge base search is used
if (model.chat.relatedKbs?.length > 0) {
const { rawSearch, userSystemPrompt, quotePrompt } = await appKbSearch({
model,
userId,
fixedQuote: content[content.length - 1]?.quote || [],
prompt: prompt[0],
similarity: model.chat.searchSimilarity,
limit: model.chat.searchLimit
});
return {
rawSearch: rawSearch,
userSystemPrompt: userSystemPrompt ? [userSystemPrompt] : [],
quotePrompt: [quotePrompt]
};
}
if (model.chat.systemPrompt) {
return {
userSystemPrompt: [
{
obj: ChatRoleEnum.System,
value: model.chat.systemPrompt
}
]
};
}
return {};
})();
// get conversationId. create a newId if it is null
const conversationId = chatId || String(new Types.ObjectId());
!chatId && res.setHeader(NEW_CHATID_HEADER, conversationId);
if (showModelDetail) {
userSystemPrompt[0] &&
res.setHeader(GUIDE_PROMPT_HEADER, encodeURIComponent(userSystemPrompt[0].value));
res.setHeader(QUOTE_LEN_HEADER, rawSearch.length);
}
// search result is empty
if (model.chat.relatedKbs?.length > 0 && !quotePrompt[0]?.value && model.chat.searchEmptyText) {
const response = model.chat.searchEmptyText;
await saveChat({
chatId,
newChatId: conversationId,
modelId,
prompts: [
prompt[0],
{
...prompt[1],
quote: [],
value: response
}
],
userId
});
return res.end(response);
}
// read the conversation content
const prompts = [...quotePrompt, ...content, ...userSystemPrompt, prompt[0]];
// content check
await sensitiveCheck({
input: [...quotePrompt, ...userSystemPrompt, prompt[0]].map((item) => item.value).join('')
});
// calculate the temperature
const temperature = (modelConstantsData.maxTemperature * (model.chat.temperature / 10)).toFixed(
2
);
// send the chat request
const { streamResponse, responseMessages } = await modelServiceToolMap[
model.chat.chatModel
].chatCompletion({
apiKey: userOpenAiKey || systemAuthKey,
temperature: +temperature,
messages: prompts,
stream: true,
res,
chatId: conversationId
});
console.log('api response time:', `${(Date.now() - startTime) / 1000}s`);
if (res.closed) return res.end();
try {
const { totalTokens, finishMessages, responseContent } = await resStreamResponse({
model: model.chat.chatModel,
res,
chatResponse: streamResponse,
prompts: responseMessages
});
// save chat
await saveChat({
chatId,
newChatId: conversationId,
modelId,
prompts: [
prompt[0],
{
...prompt[1],
value: responseContent,
quote: showModelDetail ? rawSearch : [],
systemPrompt: showModelDetail ? userSystemPrompt[0]?.value : ''
}
],
userId
});
res.end();
// only bill when the platform's key is used
pushChatBill({
isPay: !userOpenAiKey,
chatModel: model.chat.chatModel,
userId,
chatId: conversationId,
textLen: finishMessages.map((item) => item.value).join('').length,
tokens: totalTokens,
type: BillTypeEnum.chat
});
} catch (error) {
res.end();
console.log('stream ended with error', error);
}
} catch (err: any) {
res.status(500);
jsonRes(res, {
code: 500,
error: err
});
}
}
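The temperature scaling these handlers apply maps the model's user-facing 0–10 setting onto the model's `maxTemperature`; a sketch with an illustrative name:

```typescript
// Scale a user-facing temperature (0–10) to the model's allowed range,
// rounded to two decimals as the handlers do via toFixed(2).
function scaleTemperature(userTemperature: number, maxTemperature: number): number {
  return +(maxTemperature * (userTemperature / 10)).toFixed(2);
}
```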

View File

@@ -20,31 +20,32 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
await connectToDatabase();
let model: ModelSchema;
// when no modelId is given, fall back to the user's first model
if (!modelId) {
const myModel = await Model.findOne({ userId });
if (!myModel) {
const { _id } = await Model.create({
name: '应用1',
userId
});
model = (await Model.findById(_id)) as ModelSchema;
const model = await (async () => {
if (!modelId) {
const myModel = await Model.findOne({ userId });
if (!myModel) {
const { _id } = await Model.create({
name: '应用1',
userId
});
return (await Model.findById(_id)) as ModelSchema;
} else {
return myModel;
}
} else {
model = myModel;
// verify usage permission
const authRes = await authModel({
modelId,
userId,
authUser: false,
authOwner: false
});
return authRes.model;
}
modelId = model._id;
} else {
// verify usage permission
const authRes = await authModel({
modelId,
userId,
authUser: false,
authOwner: false
});
model = authRes.model;
}
})();
modelId = modelId || model._id;
// chat history
let history: ChatItemType[] = [];
@@ -86,6 +87,8 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
]);
}
const isOwner = String(model.userId) === userId;
jsonRes<InitChatResponse>(res, {
data: {
chatId: chatId || '',
@@ -94,9 +97,11 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
name: model.name,
avatar: model.avatar,
intro: model.intro,
canUse: model.share.isShare || String(model.userId) === userId
canUse: model.share.isShare || isOwner
},
chatModel: model.chat.chatModel,
systemPrompt: isOwner ? model.chat.systemPrompt : '',
limitPrompt: isOwner ? model.chat.limitPrompt : '',
history
}
});

View File

@@ -4,10 +4,9 @@ import { ChatItemType } from '@/types/chat';
import { connectToDatabase, Chat, Model } from '@/service/mongo';
import { authModel } from '@/service/utils/auth';
import { authUser } from '@/service/utils/auth';
import mongoose from 'mongoose';
import { Types } from 'mongoose';
type Props = {
newChatId?: string;
chatId?: string;
modelId: string;
prompts: [ChatItemType, ChatItemType];
@@ -16,7 +15,7 @@ type Props = {
/* store the chat content */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
const { chatId, modelId, prompts, newChatId } = req.body as Props;
const { chatId, modelId, prompts } = req.body as Props;
if (!prompts) {
throw new Error('缺少参数');
@@ -24,16 +23,15 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
const { userId } = await authUser({ req, authToken: true });
const nId = await saveChat({
const response = await saveChat({
chatId,
modelId,
prompts,
newChatId,
userId
});
jsonRes(res, {
data: nId
data: response
});
} catch (err) {
jsonRes(res, {
@@ -44,25 +42,31 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
}
export async function saveChat({
chatId,
newChatId,
chatId,
modelId,
prompts,
userId
}: Props & { userId: string }) {
}: Props & { newChatId?: Types.ObjectId; userId: string }) {
await connectToDatabase();
const { model } = await authModel({ modelId, userId, authOwner: false });
const content = prompts.map((item) => ({
_id: item._id ? new mongoose.Types.ObjectId(item._id) : undefined,
_id: item._id,
obj: item.obj,
value: item.value,
systemPrompt: item.systemPrompt,
systemPrompt: item.systemPrompt || '',
quote: item.quote || []
}));
const [id] = await Promise.all([
...(chatId // update chat
if (String(model.userId) === userId) {
await Model.findByIdAndUpdate(modelId, {
updateTime: new Date()
});
}
const [response] = await Promise.all([
...(chatId
? [
Chat.findByIdAndUpdate(chatId, {
$push: {
@@ -73,17 +77,21 @@ export async function saveChat({
title: content[0].value.slice(0, 20),
latestChat: content[1].value,
updateTime: new Date()
}).then(() => '')
}).then(() => ({
newChatId: ''
}))
]
: [
Chat.create({
_id: newChatId ? new mongoose.Types.ObjectId(newChatId) : undefined,
_id: newChatId,
userId,
modelId,
content,
title: content[0].value.slice(0, 20),
latestChat: content[1].value
}).then((res) => res._id)
}).then((res) => ({
newChatId: String(res._id)
}))
]),
// update model
...(String(model.userId) === userId
@@ -96,6 +104,6 @@ export async function saveChat({
]);
return {
id
...response
};
}

View File

@@ -1,149 +0,0 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { connectToDatabase } from '@/service/mongo';
import { authShareChat } from '@/service/utils/auth';
import { modelServiceToolMap } from '@/service/utils/chat';
import { ChatItemSimpleType } from '@/types/chat';
import { jsonRes } from '@/service/response';
import { ChatModelMap } from '@/constants/model';
import { pushChatBill, updateShareChatBill } from '@/service/events/pushBill';
import { resStreamResponse } from '@/service/utils/chat';
import { ChatRoleEnum } from '@/constants/chat';
import { BillTypeEnum } from '@/constants/user';
import { sensitiveCheck } from '../../openapi/text/sensitiveCheck';
import { appKbSearch } from '../../openapi/kb/appKbSearch';
/* send the prompt */
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
res.on('error', () => {
console.log('error: ', 'request error');
res.end();
});
try {
const { shareId, password, historyId, prompts } = req.body as {
prompts: ChatItemSimpleType[];
password: string;
shareId: string;
historyId: string;
};
if (!historyId || !prompts) {
throw new Error('分享链接无效');
}
await connectToDatabase();
let startTime = Date.now();
const { model, userOpenAiKey, systemAuthKey, userId } = await authShareChat({
shareId,
password
});
const modelConstantsData = ChatModelMap[model.chat.chatModel];
const prompt = prompts[prompts.length - 1];
const {
rawSearch = [],
userSystemPrompt = [],
quotePrompt = []
} = await (async () => {
// a knowledge base search is used
if (model.chat.relatedKbs?.length > 0) {
const { rawSearch, userSystemPrompt, quotePrompt } = await appKbSearch({
model,
userId,
fixedQuote: [],
prompt: prompt,
similarity: model.chat.searchSimilarity,
limit: model.chat.searchLimit
});
return {
rawSearch: rawSearch,
userSystemPrompt: userSystemPrompt ? [userSystemPrompt] : [],
quotePrompt: [quotePrompt]
};
}
if (model.chat.systemPrompt) {
return {
userSystemPrompt: [
{
obj: ChatRoleEnum.System,
value: model.chat.systemPrompt
}
]
};
}
return {};
})();
// search result is empty
if (model.chat.relatedKbs?.length > 0 && !quotePrompt[0]?.value && model.chat.searchEmptyText) {
const response = model.chat.searchEmptyText;
return res.end(response);
}
// read the conversation content
const completePrompts = [...quotePrompt, ...prompts.slice(0, -1), ...userSystemPrompt, prompt];
// content check
await sensitiveCheck({
input: [...quotePrompt, ...userSystemPrompt, prompt].map((item) => item.value).join('')
});
// calculate the temperature
const temperature = (modelConstantsData.maxTemperature * (model.chat.temperature / 10)).toFixed(
2
);
// send the request
const { streamResponse, responseMessages } = await modelServiceToolMap[
model.chat.chatModel
].chatCompletion({
apiKey: userOpenAiKey || systemAuthKey,
temperature: +temperature,
messages: completePrompts,
stream: true,
res,
chatId: historyId
});
console.log('api response time:', `${(Date.now() - startTime) / 1000}s`);
if (res.closed) return res.end();
try {
const { totalTokens, finishMessages } = await resStreamResponse({
model: model.chat.chatModel,
res,
chatResponse: streamResponse,
prompts: responseMessages
});
res.end();
/* bill */
pushChatBill({
isPay: !userOpenAiKey,
chatModel: model.chat.chatModel,
userId,
textLen: finishMessages.map((item) => item.value).join('').length,
tokens: totalTokens,
type: BillTypeEnum.chat
});
updateShareChatBill({
shareId,
tokens: totalTokens
});
} catch (error) {
res.end();
console.log('stream ended with error', error);
}
} catch (err: any) {
res.status(500);
jsonRes(res, {
code: 500,
error: err
});
}
}

View File

@@ -24,8 +24,8 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
const authCount = await Model.countDocuments({
userId
});
if (authCount >= 30) {
throw new Error('上限 30 个应用');
if (authCount >= 50) {
throw new Error('上限 50 个应用');
}
// create the model

View File

@@ -18,14 +18,14 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
{
userId
},
'_id avatar name chat.systemPrompt'
'_id avatar name intro'
).sort({
updateTime: -1
}),
Collection.find({ userId })
.populate({
path: 'modelId',
select: '_id avatar name chat.systemPrompt',
select: '_id avatar name intro',
match: { 'share.isShare': true }
})
.then((res) => res.filter((item) => item.modelId))
@@ -37,14 +37,14 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse<
_id: item._id,
name: item.name,
avatar: item.avatar,
systemPrompt: item.chat.systemPrompt
intro: item.intro
})),
myCollectionModels: myCollections
.map((item: any) => ({
_id: item.modelId?._id,
name: item.modelId?.name,
avatar: item.modelId?.avatar,
systemPrompt: item.modelId?.chat.systemPrompt
intro: item.modelId?.intro
}))
.filter((item) => !myModels.find((model) => String(model._id) === String(item._id))) // dedupe
}

View File

@@ -2,15 +2,13 @@ import type { NextApiRequest, NextApiResponse } from 'next';
import { connectToDatabase } from '@/service/mongo';
import { authUser, authModel, getApiKey } from '@/service/utils/auth';
import { modelServiceToolMap, resStreamResponse } from '@/service/utils/chat';
import { ChatItemSimpleType } from '@/types/chat';
import { ChatItemType } from '@/types/chat';
import { jsonRes } from '@/service/response';
import { ChatModelMap } from '@/constants/model';
import { pushChatBill } from '@/service/events/pushBill';
import { ChatRoleEnum } from '@/constants/chat';
import { withNextCors } from '@/service/utils/tools';
import { BillTypeEnum } from '@/constants/user';
import { NEW_CHATID_HEADER } from '@/constants/chat';
import { Types } from 'mongoose';
import { appKbSearch } from '../kb/appKbSearch';
/* send the prompt */
@@ -31,7 +29,7 @@ export default withNextCors(async function handler(req: NextApiRequest, res: Nex
isStream = true
} = req.body as {
chatId?: string;
prompts: ChatItemSimpleType[];
prompts: ChatItemType[];
modelId: string;
isStream: boolean;
};
@@ -67,10 +65,14 @@ export default withNextCors(async function handler(req: NextApiRequest, res: Nex
const modelConstantsData = ChatModelMap[model.chat.chatModel];
const prompt = prompts[prompts.length - 1];
const { userSystemPrompt = [], quotePrompt = [] } = await (async () => {
const {
userSystemPrompt = [],
userLimitPrompt = [],
quotePrompt = []
} = await (async () => {
// a knowledge base search is used
if (model.chat.relatedKbs?.length > 0) {
const { userSystemPrompt, quotePrompt } = await appKbSearch({
const { quotePrompt, userSystemPrompt, userLimitPrompt } = await appKbSearch({
model,
userId,
fixedQuote: [],
@@ -80,21 +82,29 @@ export default withNextCors(async function handler(req: NextApiRequest, res: Nex
});
return {
userSystemPrompt: userSystemPrompt ? [userSystemPrompt] : [],
userSystemPrompt,
userLimitPrompt,
quotePrompt: [quotePrompt]
};
}
if (model.chat.systemPrompt) {
return {
userSystemPrompt: [
{
obj: ChatRoleEnum.System,
value: model.chat.systemPrompt
}
]
};
}
return {};
return {
userSystemPrompt: model.chat.systemPrompt
? [
{
obj: ChatRoleEnum.System,
value: model.chat.systemPrompt
}
]
: [],
userLimitPrompt: model.chat.limitPrompt
? [
{
obj: ChatRoleEnum.Human,
value: model.chat.limitPrompt
}
]
: []
};
})();
// search result is empty
@@ -104,17 +114,19 @@ export default withNextCors(async function handler(req: NextApiRequest, res: Nex
}
// read the conversation content
const completePrompts = [...quotePrompt, ...prompts.slice(0, -1), ...userSystemPrompt, prompt];
const completePrompts = [
...quotePrompt,
...userSystemPrompt,
...prompts.slice(0, -1),
...userLimitPrompt,
prompt
];
// compute temperature
const temperature = (modelConstantsData.maxTemperature * (model.chat.temperature / 10)).toFixed(
2
);
// get conversationId. create a newId if it is null
const conversationId = chatId || String(new Types.ObjectId());
!chatId && res?.setHeader(NEW_CHATID_HEADER, conversationId);
// send the request
const { streamResponse, responseMessages, responseText, totalTokens } =
await modelServiceToolMap[model.chat.chatModel].chatCompletion({
@@ -122,8 +134,7 @@ export default withNextCors(async function handler(req: NextApiRequest, res: Nex
temperature: +temperature,
messages: completePrompts,
stream: isStream,
res,
chatId: conversationId
res
});
console.log('api response time:', `${(Date.now() - startTime) / 1000}s`);
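The temperature line in the hunk above maps FastGPT's 0–10 user setting linearly onto the model's `maxTemperature` range. A minimal sketch of that scaling (the function name is ours, not from the repo):

```typescript
// Hypothetical helper illustrating the scaling used above: a user
// temperature of 0-10 is mapped linearly onto 0-maxTemperature,
// rounded to two decimals like the `.toFixed(2)` call in the hunk.
function scaleTemperature(userTemperature: number, maxTemperature: number): number {
  return +(maxTemperature * (userTemperature / 10)).toFixed(2);
}

// e.g. a model with maxTemperature 2 and a user setting of 5 yields 1
```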

View File

@@ -3,7 +3,7 @@ import { jsonRes } from '@/service/response';
import { authUser } from '@/service/utils/auth';
import { PgClient } from '@/service/pg';
import { withNextCors } from '@/service/utils/tools';
import type { ChatItemSimpleType } from '@/types/chat';
import type { ChatItemType } from '@/types/chat';
import type { ModelSchema } from '@/types/mongoSchema';
import { authModel } from '@/service/utils/auth';
import { ChatModelMap } from '@/constants/model';
@@ -18,7 +18,7 @@ export type QuoteItemType = {
source?: string;
};
type Props = {
prompts: ChatItemSimpleType[];
prompts: ChatItemType[];
similarity: number;
limit: number;
appId: string;
@@ -28,7 +28,11 @@ type Response = {
userSystemPrompt: {
obj: ChatRoleEnum;
value: string;
};
}[];
userLimitPrompt: {
obj: ChatRoleEnum;
value: string;
}[];
quotePrompt: {
obj: ChatRoleEnum;
value: string;
@@ -79,15 +83,15 @@ export default withNextCors(async function handler(req: NextApiRequest, res: Nex
export async function appKbSearch({
model,
userId,
fixedQuote,
fixedQuote = [],
prompt,
similarity = 0.8,
limit = 5
}: {
model: ModelSchema;
userId: string;
fixedQuote: QuoteItemType[];
prompt: ChatItemSimpleType;
fixedQuote?: QuoteItemType[];
prompt: ChatItemType;
similarity: number;
limit: number;
}): Promise<Response> {
@@ -96,8 +100,7 @@ export async function appKbSearch({
// get vector
const promptVector = await openaiEmbedding({
userId,
input: [prompt.value],
type: 'chat'
input: [prompt.value]
});
// search kb
@@ -120,7 +123,7 @@ export async function appKbSearch({
...searchRes.slice(0, 3),
...fixedQuote.slice(0, 2),
...searchRes.slice(3),
...fixedQuote.slice(2, 4)
...fixedQuote.slice(2, Math.floor(fixedQuote.length * 0.4))
].filter((item) => {
if (idSet.has(item.id)) {
return false;
@@ -131,17 +134,24 @@ export async function appKbSearch({
// count tokens of the fixed prompts
const userSystemPrompt = model.chat.systemPrompt // user system prompt
? {
obj: ChatRoleEnum.Human,
value: model.chat.systemPrompt
}
: {
obj: ChatRoleEnum.Human,
value: `知识库是关于 ${model.name} 的内容,参考知识库回答问题。与 "${model.name}" 无关内容,直接回复: "我不知道"。`
};
? [
{
obj: ChatRoleEnum.System,
value: model.chat.systemPrompt
}
]
: [];
const userLimitPrompt = [
{
obj: ChatRoleEnum.Human,
value: model.chat.limitPrompt
? model.chat.limitPrompt
: `知识库是关于 ${model.name} 的内容,参考知识库回答问题。与 "${model.name}" 无关内容,直接回复: "我不知道"。`
}
];
const fixedSystemTokens = modelToolMap[model.chat.chatModel].countTokens({
messages: [userSystemPrompt]
messages: [...userSystemPrompt, ...userLimitPrompt]
});
// filter part quote by maxToken
@@ -165,6 +175,7 @@ export async function appKbSearch({
return {
rawSearch,
userSystemPrompt,
userLimitPrompt,
quotePrompt: {
obj: ChatRoleEnum.System,
value: quoteText
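The quote-merge hunk above interleaves fresh search hits with previously fixed quotes and de-duplicates by `id`; the change caps the trailing fixed quotes at 40% of their list instead of the old hard `slice(2, 4)`. A self-contained sketch of that merge order (the `Quote` type is assumed, not from the repo):

```typescript
// Sketch of the merge order shown in the hunk: top-3 search hits,
// up to 2 fixed quotes, the remaining hits, then up to 40% of the
// fixed-quote list, de-duplicated by id (first occurrence wins).
type Quote = { id: string; q: string };

function mergeQuotes(searchRes: Quote[], fixedQuote: Quote[]): Quote[] {
  const idSet = new Set<string>();
  return [
    ...searchRes.slice(0, 3),
    ...fixedQuote.slice(0, 2),
    ...searchRes.slice(3),
    ...fixedQuote.slice(2, Math.floor(fixedQuote.length * 0.4))
  ].filter((item) => {
    if (idSet.has(item.id)) return false;
    idSet.add(item.id);
    return true;
  });
}
```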

View File

@@ -73,9 +73,6 @@ export async function pushDataToKb({
const set = new Set();
const filterData: DateItemType[] = [];
const time = Date.now();
console.log('push data', data.length);
data.forEach((item) => {
const text = item.q + item.a;
@@ -156,8 +153,6 @@ export async function pushDataToKb({
insertData.length > 0 && startQueue();
console.log('push data finish', Date.now() - time);
return {
insertLen: insertData.length
};

View File

@@ -29,8 +29,7 @@ export default withNextCors(async function handler(req: NextApiRequest, res: Nex
const vector = await openaiEmbedding({
userId,
input: [text],
type: 'training'
input: [text]
});
const response: any = await PgClient.query(

View File

@@ -21,8 +21,7 @@ export default withNextCors(async function handler(req: NextApiRequest, res: Nex
if (q) {
return openaiEmbedding({
userId,
input: [q],
type: 'chat'
input: [q]
});
}
return [];

View File

@@ -1,31 +1,29 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { authUser, getApiKey } from '@/service/utils/auth';
import { authUser, getApiKey, getSystemOpenAiKey } from '@/service/utils/auth';
import { withNextCors } from '@/service/utils/tools';
import { getOpenAIApi } from '@/service/utils/chat/openai';
import { embeddingModel } from '@/constants/model';
import { axiosConfig } from '@/service/utils/tools';
import { pushGenerateVectorBill } from '@/service/events/pushBill';
import { ApiKeyType } from '@/service/utils/auth';
import { OpenAiChatEnum } from '@/constants/model';
type Props = {
input: string[];
type?: ApiKeyType;
};
type Response = number[][];
export default withNextCors(async function handler(req: NextApiRequest, res: NextApiResponse<any>) {
try {
const { userId } = await authUser({ req });
let { input, type } = req.query as Props;
let { input } = req.query as Props;
if (!Array.isArray(input)) {
throw new Error('缺少参数');
}
jsonRes<Response>(res, {
data: await openaiEmbedding({ userId, input, type, mustPay: true })
data: await openaiEmbedding({ userId, input, mustPay: true })
});
} catch (err) {
console.log(err);
@@ -39,18 +37,17 @@ export default withNextCors(async function handler(req: NextApiRequest, res: Nex
export async function openaiEmbedding({
userId,
input,
mustPay = false,
type = 'chat'
mustPay = false
}: { userId: string; mustPay?: boolean } & Props) {
const { userOpenAiKey, systemAuthKey } = await getApiKey({
model: OpenAiChatEnum.GPT35,
model: 'gpt-3.5-turbo',
userId,
mustPay,
type
mustPay
});
const apiKey = userOpenAiKey || systemAuthKey;
// get the chat API client
const chatAPI = getOpenAIApi();
const chatAPI = getOpenAIApi(apiKey);
// convert the input into vectors
const result = await chatAPI
@@ -61,13 +58,19 @@ export async function openaiEmbedding({
},
{
timeout: 60000,
...axiosConfig(userOpenAiKey || systemAuthKey)
...axiosConfig(apiKey)
}
)
.then((res) => ({
tokenLen: res.data.usage.total_tokens || 0,
vectors: res.data.data.map((item) => item.embedding)
}));
.then((res) => {
if (!res.data?.usage?.total_tokens) {
// @ts-ignore
return Promise.reject(res.data?.error?.message || 'Embedding Error');
}
return {
tokenLen: res.data.usage.total_tokens || 0,
vectors: res.data.data.map((item) => item.embedding)
};
});
pushGenerateVectorBill({
isPay: !userOpenAiKey,
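For background on what these embedding vectors are used for: the kb search elsewhere in this diff compares them in Postgres (pgvector's `<=>` operator returns a cosine distance, i.e. `1 - similarity`). A small similarity sketch, not code from the repo:

```typescript
// Cosine similarity between two equal-length embedding vectors:
// dot product divided by the product of the vector norms.
function cosineSimilarity(a: number[], b: number[]): number {
  const dot = a.reduce((sum, v, i) => sum + v * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}
```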

View File

@@ -2,18 +2,18 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { authUser } from '@/service/utils/auth';
import type { ChatItemSimpleType } from '@/types/chat';
import type { ChatItemType } from '@/types/chat';
import { countOpenAIToken } from '@/utils/plugin/openai';
import { OpenAiChatEnum } from '@/constants/model';
type ModelType = `${OpenAiChatEnum}`;
type Props = {
messages: ChatItemSimpleType[];
messages: ChatItemType[];
model: ModelType;
maxLen: number;
};
type Response = ChatItemSimpleType[];
type Response = ChatItemType[];
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
@@ -45,11 +45,11 @@ export function gpt_chatItemTokenSlice({
model,
maxToken
}: {
messages: ChatItemSimpleType[];
messages: ChatItemType[];
model: ModelType;
maxToken: number;
}) {
let result: ChatItemSimpleType[] = [];
let result: ChatItemType[] = [];
for (let i = 0; i < messages.length; i++) {
const msgs = [...result, messages[i]];
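The truncated `gpt_chatItemTokenSlice` loop above keeps appending messages while a token budget holds. A sketch of that pattern with a hypothetical counter injected, since `countOpenAIToken` itself isn't shown in this diff:

```typescript
// Keep taking messages from the front of the list until adding the
// next one would exceed maxToken; the count function is a stand-in
// for the repo's real tokenizer-based counter.
type ChatItem = { obj: string; value: string };

function sliceByBudget(
  messages: ChatItem[],
  maxToken: number,
  count: (msgs: ChatItem[]) => number
): ChatItem[] {
  const result: ChatItem[] = [];
  for (const msg of messages) {
    const next = [...result, msg];
    if (count(next) > maxToken) break;
    result.push(msg);
  }
  return result;
}
```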

View File

@@ -33,7 +33,7 @@ export async function sensitiveCheck({ input }: Props) {
}
const response = await axios({
...axiosConfig(getSystemOpenAiKey('chat')),
...axiosConfig(getSystemOpenAiKey()),
method: 'POST',
url: `/moderations`,
data: {

View File

@@ -0,0 +1,335 @@
import type { NextApiRequest, NextApiResponse } from 'next';
import { connectToDatabase } from '@/service/mongo';
import { authUser, authModel, getApiKey, authShareChat } from '@/service/utils/auth';
import { modelServiceToolMap, V2_StreamResponse } from '@/service/utils/chat';
import { jsonRes } from '@/service/response';
import { ChatModelMap } from '@/constants/model';
import { pushChatBill, updateShareChatBill } from '@/service/events/pushBill';
import { ChatRoleEnum, sseResponseEventEnum } from '@/constants/chat';
import { withNextCors } from '@/service/utils/tools';
import { BillTypeEnum } from '@/constants/user';
import { appKbSearch } from '../../../openapi/kb/appKbSearch';
import type { CreateChatCompletionRequest } from 'openai';
import { gptMessage2ChatType, textAdaptGptResponse } from '@/utils/adapt';
import { getChatHistory } from './getHistory';
import { saveChat } from '@/pages/api/chat/saveChat';
import { sseResponse } from '@/service/utils/tools';
import { type ChatCompletionRequestMessage } from 'openai';
import { Types } from 'mongoose';
import { sensitiveCheck } from '../../text/sensitiveCheck';
export type MessageItemType = ChatCompletionRequestMessage & { _id?: string };
type FastGptWebChatProps = {
chatId?: string; // undefined: do not use history; '': create a new chat; 'xxxxx': use existing history
appId?: string;
};
type FastGptShareChatProps = {
password?: string;
shareId?: string;
};
export type Props = CreateChatCompletionRequest &
FastGptWebChatProps &
FastGptShareChatProps & {
messages: MessageItemType[];
};
export type ChatResponseType = {
newChatId: string;
quoteLen?: number;
};
/* send prompt */
export default withNextCors(async function handler(req: NextApiRequest, res: NextApiResponse) {
res.on('close', () => {
res.end();
});
res.on('error', () => {
console.log('error: ', 'request error');
res.end();
});
let { chatId, appId, shareId, password = '', stream = false, messages = [] } = req.body as Props;
let step = 0;
try {
if (!messages) {
throw new Error('Params Error');
}
if (!Array.isArray(messages)) {
throw new Error('messages is not array');
}
await connectToDatabase();
let startTime = Date.now();
/* user auth */
const {
userId,
appId: authAppid,
authType
} = await (shareId
? authShareChat({
shareId,
password
})
: authUser({ req }));
appId = appId ? appId : authAppid;
if (!appId) {
throw new Error('appId is empty');
}
// auth app permission
const { model, showModelDetail } = await authModel({
userId,
modelId: appId,
authOwner: false,
reserveDetail: true
});
const showAppDetail = !shareId && showModelDetail;
/* get api key */
const { systemAuthKey: apiKey, userOpenAiKey } = await getApiKey({
model: model.chat.chatModel,
userId,
mustPay: authType !== 'token'
});
// get history
const { history } = await getChatHistory({ chatId, userId });
const prompts = history.concat(gptMessage2ChatType(messages));
// adapt fastgpt web
if (prompts[prompts.length - 1].obj === 'AI') {
prompts.pop();
}
// user question
const prompt = prompts[prompts.length - 1];
const {
rawSearch = [],
userSystemPrompt = [],
userLimitPrompt = [],
quotePrompt = []
} = await (async () => {
// knowledge-base search is enabled
if (model.chat.relatedKbs?.length > 0) {
const { rawSearch, quotePrompt, userSystemPrompt, userLimitPrompt } = await appKbSearch({
model,
userId,
fixedQuote: history[history.length - 1]?.quote,
prompt,
similarity: model.chat.searchSimilarity,
limit: model.chat.searchLimit
});
return {
rawSearch,
userSystemPrompt,
userLimitPrompt,
quotePrompt: [quotePrompt]
};
}
return {
userSystemPrompt: model.chat.systemPrompt
? [
{
obj: ChatRoleEnum.System,
value: model.chat.systemPrompt
}
]
: [],
userLimitPrompt: model.chat.limitPrompt
? [
{
obj: ChatRoleEnum.Human,
value: model.chat.limitPrompt
}
]
: []
};
})();
// search result is empty
if (model.chat.relatedKbs?.length > 0 && !quotePrompt[0]?.value && model.chat.searchEmptyText) {
const response = model.chat.searchEmptyText;
if (stream) {
sseResponse({
res,
event: sseResponseEventEnum.answer,
data: textAdaptGptResponse({
text: response,
model: model.chat.chatModel,
finish_reason: 'stop'
})
});
return res.end();
} else {
return res.json({
id: chatId || '',
model: model.chat.chatModel,
usage: { prompt_tokens: 0, completion_tokens: 0, total_tokens: 0 },
choices: [
{ message: [{ role: 'assistant', content: response }], finish_reason: 'stop', index: 0 }
]
});
}
}
// api messages. [quote, systemPrompt, context, limitPrompt, question]
const completePrompts = [
...quotePrompt,
...userSystemPrompt,
...prompts.slice(0, -1),
...userLimitPrompt,
prompt
];
// chat temperature
const modelConstantsData = ChatModelMap[model.chat.chatModel];
// FastGpt temperature range: 0~10
const temperature = (modelConstantsData.maxTemperature * (model.chat.temperature / 10)).toFixed(
2
);
await sensitiveCheck({
input: `${userSystemPrompt[0]?.value}\n${userLimitPrompt[0]?.value}\n${prompt.value}`
});
// start model api. responseText and totalTokens: valid only if stream = false
const { streamResponse, responseMessages, responseText, totalTokens } =
await modelServiceToolMap[model.chat.chatModel].chatCompletion({
apiKey: userOpenAiKey || apiKey,
temperature: +temperature,
maxToken: model.chat.maxToken,
messages: completePrompts,
stream,
res
});
console.log('api response time:', `${(Date.now() - startTime) / 1000}s`);
if (res.closed) return res.end();
// create a chatId
const newChatId = chatId === '' ? new Types.ObjectId() : undefined;
// response answer
const {
textLen = 0,
answer = responseText,
tokens = totalTokens
} = await (async () => {
if (stream) {
// create the response stream
res.setHeader('Content-Type', 'text/event-stream;charset=utf-8');
res.setHeader('Access-Control-Allow-Origin', '*');
res.setHeader('Transfer-Encoding', 'chunked');
res.setHeader('X-Accel-Buffering', 'no');
res.setHeader('Cache-Control', 'no-cache, no-transform');
step = 1;
try {
// response newChatId and quota
sseResponse({
res,
event: sseResponseEventEnum.chatResponse,
data: JSON.stringify({
newChatId,
quoteLen: rawSearch.length
})
});
// response answer
const { finishMessages, totalTokens, responseContent } = await V2_StreamResponse({
model: model.chat.chatModel,
res,
chatResponse: streamResponse,
prompts: responseMessages
});
return {
answer: responseContent,
textLen: finishMessages.map((item) => item.value).join('').length,
tokens: totalTokens
};
} catch (error) {
return Promise.reject(error);
}
} else {
return {
textLen: responseMessages.map((item) => item.value).join('').length
};
}
})();
// save chat history
if (typeof chatId === 'string') {
await saveChat({
newChatId,
chatId,
modelId: appId,
prompts: [
prompt,
{
_id: messages[messages.length - 1]._id,
obj: ChatRoleEnum.AI,
value: answer,
...(showAppDetail
? {
quote: rawSearch,
systemPrompt: `${userSystemPrompt[0]?.value}\n\n${userLimitPrompt[0]?.value}`
}
: {})
}
],
userId
});
}
// close response
if (stream) {
res.end();
} else {
res.json({
...(showAppDetail
? {
rawSearch
}
: {}),
newChatId,
id: chatId || '',
model: model.chat.chatModel,
usage: { prompt_tokens: 0, completion_tokens: 0, total_tokens: tokens },
choices: [
{ message: [{ role: 'assistant', content: answer }], finish_reason: 'stop', index: 0 }
]
});
}
pushChatBill({
isPay: !userOpenAiKey,
chatModel: model.chat.chatModel,
userId,
textLen,
tokens,
type: authType === 'apikey' ? BillTypeEnum.openapiChat : BillTypeEnum.chat
});
shareId &&
updateShareChatBill({
shareId,
tokens
});
} catch (err: any) {
res.status(500);
if (step === 1) {
sseResponse({
res,
event: sseResponseEventEnum.error,
data: JSON.stringify(err)
});
res.end();
} else {
jsonRes(res, {
code: 500,
error: err
});
}
}
});
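The handler above streams its answer through an `sseResponse` helper from `@/service/utils/tools`, whose implementation is not shown in this diff. Per the Server-Sent Events wire format, each message is an `event:` line plus a `data:` line terminated by a blank line; a minimal sketch of such a formatter (ours, the repo's helper may differ):

```typescript
// Hypothetical formatter matching the SSE wire format:
// "event: <name>\ndata: <payload>\n\n" per message.
function formatSseMessage(event: string, data: string): string {
  return `event: ${event}\ndata: ${data}\n\n`;
}
```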

View File

@@ -0,0 +1,66 @@
// Next.js API route support: https://nextjs.org/docs/api-routes/introduction
import type { NextApiRequest, NextApiResponse } from 'next';
import { jsonRes } from '@/service/response';
import { authUser } from '@/service/utils/auth';
import { connectToDatabase, Chat } from '@/service/mongo';
import { Types } from 'mongoose';
import type { ChatItemType } from '@/types/chat';
export type Props = {
chatId?: string;
limit?: number;
};
export type Response = { history: ChatItemType[] };
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
try {
await connectToDatabase();
const { userId } = await authUser({ req });
const { chatId, limit } = req.body as Props;
jsonRes<Response>(res, {
data: await getChatHistory({
chatId,
userId,
limit
})
});
} catch (err) {
jsonRes(res, {
code: 500,
error: err
});
}
}
export async function getChatHistory({
chatId,
userId,
limit = 50
}: Props & { userId: string }): Promise<Response> {
if (!chatId) {
return { history: [] };
}
const history = await Chat.aggregate([
{ $match: { _id: new Types.ObjectId(chatId), userId: new Types.ObjectId(userId) } },
{
$project: {
content: {
$slice: ['$content', -limit] // return the last `limit` elements of the content array
}
}
},
{ $unwind: '$content' },
{
$project: {
_id: '$content._id',
obj: '$content.obj',
value: '$content.value',
quote: '$content.quote'
}
}
]);
return { history };
}
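The `$slice: ['$content', -limit]` stage in the aggregation above behaves like JavaScript's `Array.prototype.slice` with a negative index: it keeps the last `limit` elements. A plain-JS analogue:

```typescript
// $slice with a negative count keeps the tail of the array,
// just like slice(-n) on a plain JavaScript array.
const content = [1, 2, 3, 4, 5];
const limit = 3;
const lastN = content.slice(-limit); // the last three elements
```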

View File

@@ -5,13 +5,15 @@ import { jsonRes } from '@/service/response';
export type InitDateResponse = {
beianText: string;
googleVerKey: string;
baiduTongji: boolean;
};
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
jsonRes<InitDateResponse>(res, {
data: {
beianText: process.env.SAFE_BEIAN_TEXT || '',
googleVerKey: process.env.CLIENT_GOOGLE_VER_TOKEN || ''
googleVerKey: process.env.CLIENT_GOOGLE_VER_TOKEN || '',
baiduTongji: process.env.BAIDU_TONGJI === '1'
}
});
}

View File

@@ -7,13 +7,9 @@ import { ChatModelMap, OpenAiChatEnum } from '@/constants/model';
export default async function handler(req: NextApiRequest, res: NextApiResponse) {
const chatModelList: ChatModelItemType[] = [];
if (global.systemEnv.openAIKeys) {
chatModelList.push(ChatModelMap[OpenAiChatEnum.GPT3516k]);
chatModelList.push(ChatModelMap[OpenAiChatEnum.GPT35]);
}
if (global.systemEnv.gpt4Key) {
chatModelList.push(ChatModelMap[OpenAiChatEnum.GPT4]);
}
chatModelList.push(ChatModelMap[OpenAiChatEnum.GPT3516k]);
chatModelList.push(ChatModelMap[OpenAiChatEnum.GPT35]);
chatModelList.push(ChatModelMap[OpenAiChatEnum.GPT4]);
jsonRes(res, {
data: chatModelList

View File

@@ -38,9 +38,6 @@ const ModelList = ({ models, modelId }: { models: ModelListItemType[]; modelId:
<Box className="textEllipsis" color={'myGray.1000'}>
{item.name}
</Box>
<Box className="textEllipsis" color={'myGray.400'} fontSize={'sm'}>
{item.systemPrompt || '这个 应用 没有设置提示词~'}
</Box>
</Box>
</Flex>
</Box>

View File

@@ -43,13 +43,13 @@ import { fileDownload } from '@/utils/file';
import { htmlTemplate } from '@/constants/common';
import { useUserStore } from '@/store/user';
import Loading from '@/components/Loading';
import Markdown from '@/components/Markdown';
import SideBar from '@/components/SideBar';
import Avatar from '@/components/Avatar';
import Empty from './components/Empty';
import QuoteModal from './components/QuoteModal';
import { HUMAN_ICON } from '@/constants/chat';
const Markdown = dynamic(async () => await import('@/components/Markdown'));
const PhoneSliderBar = dynamic(() => import('./components/PhoneSliderBar'), {
ssr: false
});
@@ -59,6 +59,7 @@ const History = dynamic(() => import('./components/History'), {
});
import styles from './index.module.scss';
import { adaptChatItem_openAI } from '@/utils/plugin/openai';
const textareaMinH = '22px';
@@ -170,19 +171,15 @@ const Chat = ({ modelId, chatId }: { modelId: string; chatId: string }) => {
controller.current = abortSignal;
isLeavePage.current = false;
const prompt: ChatItemType[] = prompts.map((item) => ({
_id: item._id,
obj: item.obj,
value: item.value
}));
const messages = adaptChatItem_openAI({ messages: prompts, reserveId: true });
// stream request to fetch data
const { newChatId, quoteLen, systemPrompt } = await streamFetch({
url: '/api/chat/chat',
const { newChatId, quoteLen, errMsg } = await streamFetch({
data: {
prompt,
messages,
chatId,
modelId
appId: modelId,
model: ''
},
onMessage: (text: string) => {
setChatData((state) => ({
@@ -222,7 +219,9 @@ const Chat = ({ modelId, chatId }: { modelId: string; chatId: string }) => {
...item,
status: 'finish',
quoteLen,
systemPrompt
systemPrompt: `${chatData.systemPrompt}${`${
chatData.limitPrompt ? `\n\n${chatData.limitPrompt}` : ''
}`}`
};
})
}));
@@ -233,16 +232,26 @@ const Chat = ({ modelId, chatId }: { modelId: string; chatId: string }) => {
loadHistory({ pageNum: 1, init: true });
loadMyModels(true);
}, 100);
if (errMsg) {
toast({
status: 'warning',
title: errMsg
});
}
},
[
chatId,
modelId,
setChatData,
loadHistory,
loadMyModels,
generatingMessage,
setForbidLoadChatData,
router
router,
chatData.systemPrompt,
chatData.limitPrompt,
loadHistory,
loadMyModels,
toast
]
);
@@ -328,8 +337,8 @@ const Chat = ({ modelId, chatId }: { modelId: string; chatId: string }) => {
// delete one chat record
const delChatRecord = useCallback(
async (index: number, historyId: string) => {
if (!messageContextMenuData) return;
async (index: number, historyId?: string) => {
if (!messageContextMenuData || !historyId) return;
setIsLoading(true);
try {
@@ -738,7 +747,6 @@ const Chat = ({ modelId, chatId }: { modelId: string; chatId: string }) => {
<Markdown
source={item.value}
isChatting={isChatting && index === chatData.history.length - 1}
formatLink
/>
<Flex>
{!!item.systemPrompt && (
@@ -752,7 +760,7 @@ const Chat = ({ modelId, chatId }: { modelId: string; chatId: string }) => {
px={[2, 4]}
onClick={() => setShowSystemPrompt(item.systemPrompt || '')}
>
&
</Button>
)}
{!!item.quoteLen && (

View File

@@ -56,6 +56,7 @@ const ShareHistory = dynamic(() => import('./components/ShareHistory'), {
});
import styles from './index.module.scss';
import { adaptChatItem_openAI } from '@/utils/plugin/openai';
const textareaMinH = '22px';
@@ -170,19 +171,15 @@ const Chat = ({ shareId, historyId }: { shareId: string; historyId: string }) =>
controller.current = abortSignal;
isLeavePage.current = false;
const formatPrompts = prompts.map((item) => ({
obj: item.obj,
value: item.value
}));
const messages = adaptChatItem_openAI({ messages: prompts, reserveId: true });
// stream request to fetch data
const { responseText } = await streamFetch({
url: '/api/chat/shareChat/chat',
data: {
prompts: formatPrompts.slice(-shareChatData.maxContext - 1, -1),
messages: messages.slice(-shareChatData.maxContext - 1, -1),
password,
shareId,
historyId
model: ''
},
onMessage: (text: string) => {
setShareChatData((state) => ({
@@ -226,7 +223,7 @@ const Chat = ({ shareId, historyId }: { shareId: string; historyId: string }) =>
setShareChatHistory({
historyId,
shareId,
title: formatPrompts[formatPrompts.length - 2].value,
title: prompts[prompts.length - 2].value,
latestChat: responseText,
chats: responseHistory
});
@@ -235,7 +232,7 @@ const Chat = ({ shareId, historyId }: { shareId: string; historyId: string }) =>
{
type: 'shareChatFinish',
data: {
question: formatPrompts[formatPrompts.length - 2].value,
question: prompts[prompts.length - 2].value,
answer: responseText
}
},
@@ -662,7 +659,6 @@ const Chat = ({ shareId, historyId }: { shareId: string; historyId: string }) =>
<Markdown
source={item.value}
isChatting={isChatting && index === shareChatData.history.length - 1}
formatLink
/>
</Card>
</Box>

View File

@@ -1,4 +1,4 @@
import React, { useState, useCallback } from 'react';
import React, { useState, useCallback, useRef } from 'react';
import {
Box,
Flex,
@@ -24,24 +24,10 @@ import { TrainingModeEnum } from '@/constants/plugin';
import { getErrText } from '@/utils/tools';
import { ChatModelMap, OpenAiChatEnum, embeddingPrice } from '@/constants/model';
import { formatPrice } from '@/utils/user';
import MySlider from '@/components/Slider';
const fileExtension = '.txt,.doc,.docx,.pdf,.md';
const modeMap = {
[TrainingModeEnum.qa]: {
maxLen: 8000,
slideLen: 3000,
price: ChatModelMap[OpenAiChatEnum.GPT3516k].price,
isPrompt: true
},
[TrainingModeEnum.index]: {
maxLen: 1000,
slideLen: 500,
price: embeddingPrice,
isPrompt: false
}
};
const SelectFileModal = ({
onClose,
onSuccess,
@@ -51,6 +37,16 @@ const SelectFileModal = ({
onSuccess: () => void;
kbId: string;
}) => {
const [modeMap, setModeMap] = useState({
[TrainingModeEnum.qa]: {
maxLen: 8000,
price: ChatModelMap[OpenAiChatEnum.GPT3516k].price
},
[TrainingModeEnum.index]: {
maxLen: 600,
price: embeddingPrice
}
});
const [btnLoading, setBtnLoading] = useState(false);
const { toast } = useToast();
const [prompt, setPrompt] = useState('');
@@ -200,7 +196,7 @@ const SelectFileModal = ({
});
}
setBtnLoading(false);
}, [files, mode, mutate, openConfirm, toast]);
}, [files, mode, modeMap, mutate, openConfirm, toast]);
return (
<Modal isOpen={true} onClose={onClose} isCentered>
@@ -244,19 +240,52 @@ const SelectFileModal = ({
/>
</Flex>
{/* content intro */}
{modeMap[mode].isPrompt && (
<Flex w={'100%'} px={5} alignItems={'center'} mt={4}>
<Box flex={'0 0 70px'} mr={2}>
</Box>
<Input
placeholder="提示词,例如: Laf的介绍/关于gpt4的论文/一段长文本"
value={prompt}
onChange={(e) => setPrompt(e.target.value)}
size={'sm'}
/>
</Flex>
)}
<Flex w={'100%'} px={5} alignItems={'center'} mt={4}>
{mode === TrainingModeEnum.qa && (
<>
<Box flex={'0 0 70px'} mr={2}>
</Box>
<Input
placeholder="提示词,例如: Laf的介绍/关于gpt4的论文/一段长文本"
value={prompt}
onChange={(e) => setPrompt(e.target.value)}
size={'sm'}
/>
</>
)}
{/* chunk size */}
{mode === TrainingModeEnum.index && (
<Flex mt={5}>
<Box w={['70px']} flexShrink={0}>
</Box>
<Box flex={1} ml={'10px'}>
<MySlider
markList={[
{ label: '300', value: 300 },
{ label: '1000', value: 1000 }
]}
width={['100%', '260px']}
min={300}
max={1000}
step={50}
activeVal={modeMap[TrainingModeEnum.index].maxLen}
setVal={(val) => {
setModeMap((state) => ({
...state,
[TrainingModeEnum.index]: {
maxLen: val,
price: embeddingPrice
}
}));
}}
/>
</Box>
</Flex>
)}
</Flex>
{/* text content */}
<Box flex={'1 0 0'} px={5} h={0} w={'100%'} overflowY={'auto'} mt={4}>
{files.slice(0, 100).map((item, i) => (

View File

@@ -42,6 +42,12 @@ const Test = () => {
pushKbTestItem(testItem);
setInputText('');
setKbTestItem(testItem);
},
onError(err) {
toast({
title: getErrText(err),
status: 'error'
});
}
});

View File

@@ -1,5 +1,5 @@
import React, { useCallback, useMemo, useRef, useState } from 'react';
import { Box, Flex, Input, IconButton, Tooltip, Tab, useTheme } from '@chakra-ui/react';
import React, { useCallback, useMemo, useState } from 'react';
import { Box, Flex, Input, IconButton, Tooltip, useTheme } from '@chakra-ui/react';
import { AddIcon } from '@chakra-ui/icons';
import { useRouter } from 'next/router';
import MyIcon from '@/components/Icon';
@@ -12,10 +12,10 @@ import { MyModelsTypeEnum } from '@/constants/user';
import dynamic from 'next/dynamic';
const Avatar = dynamic(() => import('@/components/Avatar'), {
ssr: true
ssr: false
});
const Tabs = dynamic(() => import('@/components/Tabs'), {
ssr: true
ssr: false
});
const ModelList = ({ modelId }: { modelId: string }) => {
@@ -55,14 +55,12 @@ const ModelList = ({ modelId }: { modelId: string }) => {
const currentModels = useMemo(() => {
const map = {
[MyModelsTypeEnum.my]: {
list: myModels.filter((item) =>
new RegExp(searchText, 'ig').test(item.name + item.systemPrompt)
),
list: myModels.filter((item) => new RegExp(searchText, 'ig').test(item.name + item.intro)),
emptyText: '还没有 AI 应用~\n快来创建一个吧'
},
[MyModelsTypeEnum.collection]: {
list: myCollectionModels.filter((item) =>
new RegExp(searchText, 'ig').test(item.name + item.systemPrompt)
new RegExp(searchText, 'ig').test(item.name + item.intro)
),
emptyText: '收藏的 AI 应用为空~\n快去市场找一个吧'
}

View File

@@ -1,18 +1,17 @@
import React, { useState } from 'react';
import React, { useEffect, useState } from 'react';
import { Box, Divider, Flex, useTheme, Button, Skeleton, useDisclosure } from '@chakra-ui/react';
import { useCopyData } from '@/utils/tools';
import dynamic from 'next/dynamic';
import MyIcon from '@/components/Icon';
const APIKeyModal = dynamic(() => import('@/components/APIKeyModal'), {
ssr: true
ssr: false
});
const baseUrl = 'https://fastgpt.run/api/openapi';
const API = ({ modelId }: { modelId: string }) => {
const theme = useTheme();
const { copyData } = useCopyData();
const [baseUrl, setBaseUrl] = useState('https://fastgpt.run/api/openapi');
const {
isOpen: isOpenAPIModal,
onOpen: onOpenAPIModal,
@@ -20,6 +19,10 @@ const API = ({ modelId }: { modelId: string }) => {
} = useDisclosure();
const [isLoaded, setIsLoaded] = useState(false);
useEffect(() => {
setBaseUrl(`${location.origin}/api/openapi`);
}, []);
return (
<Flex flexDirection={'column'} h={'100%'}>
<Box display={['none', 'flex']} px={5} alignItems={'center'}>

View File

@@ -1,5 +1,15 @@
import React, { useCallback, useState, useMemo } from 'react';
import { Box, Flex, Button, FormControl, Input, Textarea, Divider } from '@chakra-ui/react';
import {
Box,
Flex,
Button,
FormControl,
Input,
Textarea,
Divider,
Tooltip
} from '@chakra-ui/react';
import { QuestionOutlineIcon } from '@chakra-ui/icons';
import { useQuery } from '@tanstack/react-query';
import { useForm } from 'react-hook-form';
import { useRouter } from 'next/router';
@@ -20,11 +30,17 @@ import Avatar from '@/components/Avatar';
import MySelect from '@/components/Select';
import MySlider from '@/components/Slider';
const systemPromptTip =
'模型固定的引导词,通过调整该内容,可以引导模型聊天方向。该内容会被固定在上下文的开头。';
const limitPromptTip =
'限定模型对话范围,会被放置在本次提问前,拥有强引导和限定性。例如:\n1. 知识库是关于 Laf 的介绍,参考知识库回答问题,与 "Laf" 无关内容,直接回复: "我不知道"。\n2. 你仅回答关于 "xxx" 的问题,其他问题回复: "xxxx"';
const Settings = ({ modelId }: { modelId: string }) => {
const { toast } = useToast();
const router = useRouter();
const { Loading, setIsLoading } = useLoading();
const { userInfo, modelDetail, loadModelDetail, refreshModel, setLastModelId } = useUserStore();
const { userInfo, modelDetail, myModels, loadModelDetail, refreshModel, setLastModelId } =
useUserStore();
const { File, onOpen: onOpenSelectFile } = useSelectFile({
fileType: '.jpg,.png',
multiple: false
@@ -36,11 +52,6 @@ const Settings = ({ modelId }: { modelId: string }) => {
const [btnLoading, setBtnLoading] = useState(false);
const [refresh, setRefresh] = useState(false);
const isOwner = useMemo(
() => modelDetail.userId === userInfo?._id,
[modelDetail.userId, userInfo?._id]
);
const {
register,
setValue,
@@ -52,6 +63,20 @@ const Settings = ({ modelId }: { modelId: string }) => {
defaultValues: modelDetail
});
const isOwner = useMemo(
() => modelDetail.userId === userInfo?._id,
[modelDetail.userId, userInfo?._id]
);
const tokenLimit = useMemo(() => {
const max = ChatModelMap[getValues('chat.chatModel')]?.contextMaxToken || 4000;
if (max < getValues('chat.maxToken')) {
setValue('chat.maxToken', max);
}
return max;
}, [getValues, setValue, refresh]);
// submit and save model changes
const saveSubmitSuccess = useCallback(
async (data: ModelSchema) => {
@@ -110,7 +135,7 @@ const Settings = ({ modelId }: { modelId: string }) => {
status: 'success'
});
refreshModel.removeModelDetail(modelDetail._id);
router.replace('/model');
router.replace(`/model?modelId=${myModels[1]?._id}`);
} catch (err: any) {
toast({
title: err?.message || '删除失败',
@@ -118,7 +143,7 @@ const Settings = ({ modelId }: { modelId: string }) => {
});
}
setIsLoading(false);
}, [modelDetail, setIsLoading, toast, refreshModel, router]);
}, [modelDetail, setIsLoading, toast, refreshModel, router, myModels]);
const onSelectFile = useCallback(
async (e: File[]) => {
@@ -147,6 +172,7 @@ const Settings = ({ modelId }: { modelId: string }) => {
onSuccess(res) {
res && reset(res);
modelId && setLastModelId(modelId);
setRefresh(!refresh);
},
onError(err: any) {
toast({
@@ -200,7 +226,7 @@ const Settings = ({ modelId }: { modelId: string }) => {
</Box>
<Textarea
rows={5}
rows={4}
maxLength={500}
placeholder={'给你的 AI 应用一个介绍'}
{...register('intro')}
@@ -214,11 +240,14 @@ const Settings = ({ modelId }: { modelId: string }) => {
</Box>
<MySelect
width={['200px', '240px']}
width={['90%', '280px']}
value={getValues('chat.chatModel')}
list={chatModelList.map((item) => ({
id: item.chatModel,
label: item.name
label: `${item.name} (${formatPrice(
ChatModelMap[item.chatModel]?.price,
1000
)} 元/1k tokens)`
}))}
onchange={(val: any) => {
setValue('chat.chatModel', val);
@@ -226,15 +255,6 @@ const Settings = ({ modelId }: { modelId: string }) => {
}}
/>
</Flex>
<Flex alignItems={'center'} mt={5}>
<Box w={['60px', '100px', '140px']} flexShrink={0}>
</Box>
<Box fontSize={['sm', 'md']}>
{formatPrice(ChatModelMap[getValues('chat.chatModel')]?.price, 1000)}
/1K tokens()
</Box>
</Flex>
<Flex alignItems={'center'} my={10}>
<Box w={['60px', '100px', '140px']} flexShrink={0}>
@@ -245,7 +265,7 @@ const Settings = ({ modelId }: { modelId: string }) => {
{ label: '严谨', value: 0 },
{ label: '发散', value: 10 }
]}
width={['100%', '260px']}
width={['90%', '260px']}
min={0}
max={10}
activeVal={getValues('chat.temperature')}
@@ -256,18 +276,54 @@ const Settings = ({ modelId }: { modelId: string }) => {
/>
</Box>
</Flex>
<Flex alignItems={'center'} mt={12} mb={10}>
<Box w={['60px', '100px', '140px']} flexShrink={0}>
</Box>
<Box flex={1} ml={'10px'}>
<MySlider
markList={[
{ label: '100', value: 100 },
{ label: `${tokenLimit}`, value: tokenLimit }
]}
width={['90%', '260px']}
min={100}
max={tokenLimit}
step={50}
activeVal={getValues('chat.maxToken')}
setVal={(val) => {
setValue('chat.maxToken', val);
setRefresh(!refresh);
}}
/>
</Box>
</Flex>
<Flex mt={10} alignItems={'flex-start'}>
<Box w={['60px', '100px', '140px']} flexShrink={0}>
<Tooltip label={systemPromptTip}>
<QuestionOutlineIcon display={['none', 'inline']} ml={1} />
</Tooltip>
</Box>
<Textarea
rows={8}
placeholder={
'模型默认的 prompt 词,通过调整该内容,可以引导模型聊天方向。\n\n如果使用了知识库搜索没有填写该内容时系统会自动补充提示词如果填写了内容则以填写的内容为准。'
}
placeholder={systemPromptTip}
{...register('chat.systemPrompt')}
></Textarea>
</Flex>
<Flex mt={5} alignItems={'flex-start'}>
<Box w={['60px', '100px', '140px']} flexShrink={0}>
<Tooltip label={limitPromptTip}>
<QuestionOutlineIcon display={['none', 'inline']} ml={1} />
</Tooltip>
</Box>
<Textarea
rows={5}
placeholder={limitPromptTip}
{...register('chat.limitPrompt')}
></Textarea>
</Flex>
<Flex mt={5} alignItems={'center'}>
<Box w={['60px', '100px', '140px']} flexShrink={0}></Box>
@@ -276,6 +332,7 @@ const Settings = ({ modelId }: { modelId: string }) => {
w={'120px'}
size={['sm', 'md']}
isLoading={btnLoading}
isDisabled={!isOwner}
onClick={async () => {
try {
await saveUpdateModel();
@@ -289,7 +346,7 @@ const Settings = ({ modelId }: { modelId: string }) => {
}
}}
>
{isOwner ? '保存' : '仅读,无法修改'}
</Button>
<Button
mr={3}
@@ -309,17 +366,20 @@ const Settings = ({ modelId }: { modelId: string }) => {
>
</Button>
<Button
colorScheme={'gray'}
variant={'base'}
size={['sm', 'md']}
isLoading={btnLoading}
_hover={{ color: 'red.600' }}
onClick={openConfirm(handleDelModel)}
>
</Button>
{isOwner && (
<Button
colorScheme={'gray'}
variant={'base'}
size={['sm', 'md']}
isLoading={btnLoading}
_hover={{ color: 'red.600' }}
onClick={openConfirm(handleDelModel)}
>
</Button>
)}
</Flex>
<File onSelect={onSelectFile} />
<ConfirmChild />
<Loading loading={isLoading} fixed={false} />
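The `tokenLimit` memo added in the Settings diff above clamps the stored `chat.maxToken` whenever the selected model's context window is smaller. A minimal standalone sketch of that clamp, assuming a `ChatModelMap`-like table (the table contents and function name here are illustrative, not from the repo):

```typescript
// Hypothetical standalone version of the maxToken clamp from the
// tokenLimit memo: the response budget may never exceed the selected
// model's context window.
type ModelInfo = { contextMaxToken: number };

const ChatModelMapSketch: Record<string, ModelInfo> = {
  'gpt-3.5-turbo': { contextMaxToken: 4000 },
  'gpt-3.5-turbo-16k': { contextMaxToken: 16000 }
};

function clampMaxToken(chatModel: string, currentMaxToken: number): number {
  // unknown models fall back to a 4000-token window, as the memo does
  const max = ChatModelMapSketch[chatModel]?.contextMaxToken || 4000;
  return Math.min(currentMaxToken, max);
}
```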


@@ -1,4 +1,4 @@
import React, { useState, useEffect } from 'react';
import React, { useState, useEffect, useMemo } from 'react';
import { useRouter } from 'next/router';
import { Box, Flex } from '@chakra-ui/react';
import { useUserStore } from '@/store/user';
@@ -7,15 +7,16 @@ import dynamic from 'next/dynamic';
import Tabs from '@/components/Tabs';
import Settings from './components/Settings';
import { defaultModel } from '@/constants/model';
const Kb = dynamic(() => import('./components/Kb'), {
ssr: true
ssr: false
});
const Share = dynamic(() => import('./components/Share'), {
ssr: true
ssr: false
});
const API = dynamic(() => import('./components/API'), {
ssr: true
ssr: false
});
enum TabEnum {
@@ -28,9 +29,14 @@ enum TabEnum {
const ModelDetail = ({ modelId }: { modelId: string }) => {
const router = useRouter();
const { isPc } = useGlobalStore();
const { modelDetail } = useUserStore();
const { modelDetail = defaultModel, userInfo } = useUserStore();
const [currentTab, setCurrentTab] = useState<`${TabEnum}`>(TabEnum.settings);
const isOwner = useMemo(
() => modelDetail.userId === userInfo?._id,
[modelDetail.userId, userInfo?._id]
);
useEffect(() => {
window.onbeforeunload = (e) => {
e.preventDefault();
@@ -67,7 +73,7 @@ const ModelDetail = ({ modelId }: { modelId: string }) => {
w={['300px', '360px']}
list={[
{ label: '配置', id: TabEnum.settings },
{ label: '知识库', id: TabEnum.kb },
...(isOwner ? [{ label: '知识库', id: TabEnum.kb }] : []),
{ label: '分享', id: TabEnum.share },
{ label: 'API', id: TabEnum.API },
{ label: '立即对话', id: 'startChat' }


@@ -10,7 +10,7 @@ import SideBar from '@/components/SideBar';
const ModelDetail = dynamic(() => import('./components/detail/index'), {
loading: () => <Loading fixed={false} />,
ssr: true
ssr: false
});
const Model = ({ modelId }: { modelId: string }) => {


@@ -114,10 +114,10 @@ const PayModal = ({ onClose }: { onClose: () => void }) => {
| 计费项 | 价格: 元/ 1K tokens(包含上下文)|
| --- | --- |
| 知识库 - 索引 | 0.001 |
| chatgpt - 对话 | 0.022 |
| chatgpt16K - 对话 | 0.025 |
| gpt4 - 对话 | 0.5 |
| 文件拆分 | 0.025 |`}
| chatgpt - 对话 | 0.015 |
| chatgpt16K - 对话 | 0.015 |
| gpt4 - 对话 | 0.1 |
| 文件拆分 | 0.015 |`}
/>
</>
)}


@@ -23,21 +23,21 @@ import Tabs from '@/components/Tabs';
import BillTable from './components/BillTable';
const PayRecordTable = dynamic(() => import('./components/PayRecordTable'), {
ssr: true
ssr: false
});
const PromotionTable = dynamic(() => import('./components/PromotionTable'), {
ssr: true
ssr: false
});
const InformTable = dynamic(() => import('./components/InformTable'), {
ssr: true
ssr: false
});
const PayModal = dynamic(() => import('./components/PayModal'), {
loading: () => <Loading fixed={false} />,
ssr: true
ssr: false
});
const WxConcat = dynamic(() => import('@/components/WxConcat'), {
loading: () => <Loading fixed={false} />,
ssr: true
ssr: false
});
enum TableEnum {
@@ -54,7 +54,6 @@ const NumberSetting = ({ tableType }: { tableType: `${TableEnum}` }) => {
{ label: '佣金', id: TableEnum.promotion, Component: <PromotionTable /> },
{ label: '通知', id: TableEnum.inform, Component: <InformTable /> }
]);
const [currentTab, setCurrentTab] = useState(tableType);
const router = useRouter();
const { copyData } = useCopyData();
@@ -84,7 +83,14 @@ const NumberSetting = ({ tableType }: { tableType: `${TableEnum}` }) => {
async (data: UserUpdateParams) => {
setLoading(true);
try {
data.openaiKey && (await authOpenAiKey(data.openaiKey));
if (data.openaiKey) {
const text = await authOpenAiKey(data.openaiKey);
text &&
toast({
title: text,
status: 'warning'
});
}
await putUserInfo({
openaiKey: data.openaiKey,
avatar: data.avatar
@@ -95,7 +101,7 @@ const NumberSetting = ({ tableType }: { tableType: `${TableEnum}` }) => {
});
reset(data);
toast({
title: '更新成功',
title: '更新数据成功',
status: 'success'
});
} catch (error) {
@@ -195,7 +201,7 @@ const NumberSetting = ({ tableType }: { tableType: `${TableEnum}` }) => {
<Box flex={'0 0 85px'}>openaiKey:</Box>
<Input
{...register(`openaiKey`)}
maxW={'300px'}
maxW={'350px'}
placeholder={'openai账号。回车或失去焦点保存'}
size={'sm'}
onBlur={handleSubmit(onclickSave)}
@@ -251,13 +257,13 @@ const NumberSetting = ({ tableType }: { tableType: `${TableEnum}` }) => {
m={'auto'}
w={'200px'}
list={tableList.current}
activeId={currentTab}
activeId={tableType}
size={'sm'}
onChange={(id: any) => setCurrentTab(id)}
onChange={(id: any) => router.replace(`/number?type=${id}`)}
/>
<Box minH={'300px'}>
{(() => {
const item = tableList.current.find((item) => item.id === currentTab);
const item = tableList.current.find((item) => item.id === tableType);
return item ? item.Component : null;
})()}


@@ -23,34 +23,10 @@ export async function generateQA(): Promise<any> {
let userId = '';
try {
const match = {
mode: TrainingModeEnum.qa,
lockTime: { $lte: new Date(Date.now() - 4 * 60 * 1000) }
};
// random get task
const agree = await TrainingData.aggregate([
{
$match: match
},
{ $sample: { size: 1 } },
{
$project: {
_id: 1
}
}
]);
// no task
if (agree.length === 0) {
reduceQueue();
global.qaQueueLen <= 0 && console.log(`没有需要【QA】的数据, ${global.qaQueueLen}`);
return;
}
const data = await TrainingData.findOneAndUpdate(
{
_id: agree[0]._id,
...match
mode: TrainingModeEnum.qa,
lockTime: { $lte: new Date(Date.now() - 4 * 60 * 1000) }
},
{
lockTime: new Date()
@@ -67,7 +43,8 @@ export async function generateQA(): Promise<any> {
// task preemption
if (!data) {
reduceQueue();
return generateQA();
global.qaQueueLen <= 0 && console.log(`没有需要【QA】的数据, ${global.qaQueueLen}`);
return;
}
trainingId = data._id;
@@ -78,7 +55,6 @@ export async function generateQA(): Promise<any> {
const { systemAuthKey } = await getApiKey({
model: OpenAiChatEnum.GPT35,
userId,
type: 'training',
mustPay: true
});
@@ -177,7 +153,8 @@ A2:
sendInform({
type: 'system',
title: 'QA 任务中止',
content: '由于账号余额不足QA 任务中止,重新充值后将会继续。',
content:
'由于账号余额不足,索引生成任务中止,重新充值后将会继续。暂停的任务将在 7 天后被删除。',
userId
});
console.log('余额不足,暂停向量生成任务');
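The rewritten hunk above drops the `aggregate`/`$sample` pre-step and claims a task with a single `findOneAndUpdate` on the lock condition, so picking a task and locking it is one atomic step instead of two racy ones. An in-memory sketch of that claim pattern (all names here are illustrative, not from the repo):

```typescript
// In-memory analogue of:
//   findOneAndUpdate({ lockTime: { $lte: now - window } }, { lockTime: now })
// A worker atomically finds one unlocked (or stale-locked) task and locks it.
type Task = { id: string; lockTime: number };

function claimTask(tasks: Task[], now: number, lockWindowMs: number): Task | null {
  const task = tasks.find((t) => t.lockTime <= now - lockWindowMs);
  if (!task) return null; // task preemption failed or the queue is empty
  task.lockTime = now;    // lock it so concurrent workers skip it
  return task;
}
```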


@@ -19,34 +19,10 @@ export async function generateVector(): Promise<any> {
let userId = '';
try {
const match = {
mode: TrainingModeEnum.index,
lockTime: { $lte: new Date(Date.now() - 2 * 60 * 1000) }
};
// random get task
const agree = await TrainingData.aggregate([
{
$match: match
},
{ $sample: { size: 1 } },
{
$project: {
_id: 1
}
}
]);
// no task
if (agree.length === 0) {
reduceQueue();
global.vectorQueueLen <= 0 && console.log(`没有需要【索引】的数据, ${global.vectorQueueLen}`);
return;
}
const data = await TrainingData.findOneAndUpdate(
{
_id: agree[0]._id,
...match
mode: TrainingModeEnum.index,
lockTime: { $lte: new Date(Date.now() - 2 * 60 * 1000) }
},
{
lockTime: new Date()
@@ -63,7 +39,8 @@ export async function generateVector(): Promise<any> {
// task preemption
if (!data) {
reduceQueue();
return generateVector();
global.vectorQueueLen <= 0 && console.log(`没有需要【索引】的数据, ${global.vectorQueueLen}`);
return;
}
trainingId = data._id;
@@ -81,7 +58,6 @@ export async function generateVector(): Promise<any> {
const vectors = await openaiEmbedding({
input: dataItems.map((item) => item.q),
userId,
type: 'training',
mustPay: true
});
@@ -128,7 +104,8 @@ export async function generateVector(): Promise<any> {
sendInform({
type: 'system',
title: '索引生成任务中止',
content: '由于账号余额不足,索引生成任务中止,重新充值后将会继续。',
content:
'由于账号余额不足,索引生成任务中止,重新充值后将会继续。暂停的任务将在 7 天后被删除。',
userId
});
console.log('余额不足,暂停向量生成任务');


@@ -52,7 +52,7 @@ const ChatSchema = new Schema({
},
value: {
type: String,
required: true
default: ''
},
quote: {
type: [


@@ -43,10 +43,18 @@ const ModelSchema = new Schema({
default: ''
},
systemPrompt: {
// system prompt
type: String,
default: ''
},
limitPrompt: {
type: String,
default: ''
},
maxToken: {
type: Number,
default: 4000,
min: 100
},
temperature: {
type: Number,
min: 0,


@@ -1,18 +1,6 @@
import { Schema, model, models } from 'mongoose';
const SystemSchema = new Schema({
openAIKeys: {
type: String,
default: ''
},
openAITrainingKeys: {
type: String,
default: ''
},
gpt4Key: {
type: String,
default: ''
},
vectorMaxProcess: {
type: Number,
default: 10


@@ -15,6 +15,10 @@ const TrainingDataSchema = new Schema({
ref: 'kb',
required: true
},
expireAt: {
type: Date,
default: () => new Date()
},
lockTime: {
type: Date,
default: () => new Date('2000/1/1')
@@ -44,5 +48,13 @@ const TrainingDataSchema = new Schema({
}
});
try {
TrainingDataSchema.index({ lockTime: 1 });
TrainingDataSchema.index({ userId: 1 });
TrainingDataSchema.index({ expireAt: 1 }, { expireAfterSeconds: 7 * 24 * 60 * 60 }); // 7 days
} catch (error) {
console.log(error);
}
export const TrainingData: MongoModel<TrainingDateType> =
models['trainingData'] || model('trainingData', TrainingDataSchema);
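The new indexes above include a MongoDB TTL index on `expireAt`: documents are deleted once `now - expireAt` exceeds `expireAfterSeconds`, matching the "paused tasks are removed after 7 days" notice in the inform messages. A sketch of the expiry rule (the helper is illustrative; Mongo's TTL monitor performs this check server-side on a roughly 60-second cycle):

```typescript
// Illustrative version of the check behind a TTL index: a document
// expires once its expireAt timestamp is at least expireAfterSeconds old.
const SEVEN_DAYS_IN_SECONDS = 7 * 24 * 60 * 60; // 7 * 24 * 60 would be minutes, not seconds

function isExpired(expireAt: Date, now: Date, expireAfterSeconds: number): boolean {
  return (now.getTime() - expireAt.getTime()) / 1000 >= expireAfterSeconds;
}
```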


@@ -15,9 +15,6 @@ export async function connectToDatabase(): Promise<void> {
global.qaQueueLen = 0;
global.vectorQueueLen = 0;
global.systemEnv = {
openAIKeys: process.env.OPENAIKEY || '',
openAITrainingKeys: process.env.OPENAI_TRAINING_KEY || '',
gpt4Key: process.env.GPT4KEY || '',
vectorMaxProcess: 10,
qaMaxProcess: 10,
pgIvfflatProbe: 10,
@@ -39,9 +36,9 @@ export async function connectToDatabase(): Promise<void> {
global.mongodb = await mongoose.connect(process.env.MONGODB_URI as string, {
bufferCommands: true,
dbName: process.env.MONGODB_NAME,
maxConnecting: 30,
maxPoolSize: 30,
minPoolSize: 10
maxConnecting: Number(process.env.DB_MAX_LINK || 5),
maxPoolSize: Number(process.env.DB_MAX_LINK || 5),
minPoolSize: 2
});
console.log('mongo connected');
} catch (error) {


@@ -12,7 +12,7 @@ export const connectPg = async () => {
user: process.env.PG_USER,
password: process.env.PG_PASSWORD,
database: process.env.PG_DB_NAME,
max: global.systemEnv.vectorMaxProcess + 10,
max: Number(process.env.DB_MAX_LINK || 5),
idleTimeoutMillis: 30000,
connectionTimeoutMillis: 5000
});


@@ -3,15 +3,15 @@ import jwt from 'jsonwebtoken';
import Cookie from 'cookie';
import { Chat, Model, OpenApi, User, ShareChat, KB } from '../mongo';
import type { ModelSchema } from '@/types/mongoSchema';
import type { ChatItemSimpleType } from '@/types/chat';
import type { ChatItemType } from '@/types/chat';
import mongoose from 'mongoose';
import { ClaudeEnum, defaultModel, embeddingModel, EmbeddingModelType } from '@/constants/model';
import { defaultModel } from '@/constants/model';
import { formatPrice } from '@/utils/user';
import { ERROR_ENUM } from '../errorCode';
import { ChatModelType, OpenAiChatEnum } from '@/constants/model';
import { hashPassword } from '@/service/utils/tools';
export type ApiKeyType = 'training' | 'chat';
export type AuthType = 'token' | 'root' | 'apikey';
export const parseCookie = (cookie?: string): Promise<string> => {
return new Promise((resolve, reject) => {
@@ -39,13 +39,11 @@ export const parseCookie = (cookie?: string): Promise<string> => {
export const authUser = async ({
req,
authToken = false,
authOpenApi = false,
authRoot = false,
authBalance = false
}: {
req: NextApiRequest;
authToken?: boolean;
authOpenApi?: boolean;
authRoot?: boolean;
authBalance?: boolean;
}) => {
@@ -71,6 +69,36 @@ export const authUser = async ({
return Promise.reject(error);
}
};
const parseAuthorization = async (authorization?: string) => {
if (!authorization) {
return Promise.reject(ERROR_ENUM.unAuthorization);
}
// Bearer fastgpt-xxxx-appId
const auth = authorization.split(' ')[1];
if (!auth) {
return Promise.reject(ERROR_ENUM.unAuthorization);
}
const { apiKey, appId } = await (async () => {
const arr = auth.split('-');
if (arr.length !== 3) {
return Promise.reject(ERROR_ENUM.unAuthorization);
}
return {
apiKey: `${arr[0]}-${arr[1]}`,
appId: arr[2]
};
})();
// auth apiKey
const uid = await parseOpenApiKey(apiKey);
return {
uid,
appId
};
};
const parseRootKey = async (rootKey?: string, userId = '') => {
if (!rootKey || !process.env.ROOT_KEY || rootKey !== process.env.ROOT_KEY) {
return Promise.reject(ERROR_ENUM.unAuthorization);
@@ -78,31 +106,43 @@ export const authUser = async ({
return userId;
};
const { cookie, apikey, rootkey, userid } = (req.headers || {}) as {
const { cookie, apikey, rootkey, userid, authorization } = (req.headers || {}) as {
cookie?: string;
apikey?: string;
rootkey?: string;
userid?: string;
authorization?: string;
};
let uid = '';
let appId = '';
let authType: AuthType = 'token';
if (authToken) {
uid = await parseCookie(cookie);
} else if (authOpenApi) {
uid = await parseOpenApiKey(apikey);
authType = 'token';
} else if (authRoot) {
uid = await parseRootKey(rootkey, userid);
authType = 'root';
} else if (cookie) {
uid = await parseCookie(cookie);
authType = 'token';
} else if (apikey) {
uid = await parseOpenApiKey(apikey);
authType = 'apikey';
} else if (authorization) {
const authResponse = await parseAuthorization(authorization);
uid = authResponse.uid;
appId = authResponse.appId;
authType = 'apikey';
} else if (rootkey) {
uid = await parseRootKey(rootkey, userid);
authType = 'root';
} else {
return Promise.reject(ERROR_ENUM.unAuthorization);
}
// balance check
if (authBalance) {
const user = await User.findById(uid);
if (!user) {
@@ -115,44 +155,28 @@ export const authUser = async ({
}
return {
userId: uid
userId: uid,
appId,
authType
};
};
/* random get openai api key */
export const getSystemOpenAiKey = (type: ApiKeyType) => {
const keys = (() => {
if (type === 'training') {
return global.systemEnv.openAITrainingKeys?.split(',') || [];
}
return global.systemEnv.openAIKeys?.split(',') || [];
})();
// 纯字符串类型
const i = Math.floor(Math.random() * keys.length);
return keys[i] || (global.systemEnv.openAIKeys as string);
};
export const getGpt4Key = () => {
const keys = global.systemEnv.gpt4Key?.split(',') || [];
// 纯字符串类型
const i = Math.floor(Math.random() * keys.length);
return keys[i] || (global.systemEnv.openAIKeys as string);
export const getSystemOpenAiKey = () => {
return process.env.ONEAPI_KEY || process.env.OPENAIKEY || '';
};
/* get the key for the api request */
export const getApiKey = async ({
model,
userId,
mustPay = false,
type = 'chat'
mustPay = false
}: {
model: ChatModelType;
userId: string;
mustPay?: boolean;
type?: ApiKeyType;
}) => {
const user = await User.findById(userId);
const user = await User.findById(userId, 'openaiKey balance');
if (!user) {
return Promise.reject(ERROR_ENUM.unAuthorization);
}
@@ -160,30 +184,29 @@ export const getApiKey = async ({
const keyMap = {
[OpenAiChatEnum.GPT35]: {
userOpenAiKey: user.openaiKey || '',
systemAuthKey: getSystemOpenAiKey(type) as string
systemAuthKey: getSystemOpenAiKey()
},
[OpenAiChatEnum.GPT3516k]: {
userOpenAiKey: user.openaiKey || '',
systemAuthKey: getSystemOpenAiKey(type) as string
systemAuthKey: getSystemOpenAiKey()
},
[OpenAiChatEnum.GPT4]: {
userOpenAiKey: user.openaiKey || '',
systemAuthKey: getGpt4Key() as string
systemAuthKey: getSystemOpenAiKey()
},
[OpenAiChatEnum.GPT432k]: {
userOpenAiKey: user.openaiKey || '',
systemAuthKey: getGpt4Key() as string
},
[ClaudeEnum.Claude]: {
userOpenAiKey: '',
systemAuthKey: process.env.CLAUDE_KEY as string
systemAuthKey: getSystemOpenAiKey()
}
};
if (!keyMap[model]) {
return Promise.reject('App model does not exist');
}
// the user has their own key
if (!mustPay && keyMap[model]?.userOpenAiKey) {
if (!mustPay && keyMap[model].userOpenAiKey) {
return {
user,
userOpenAiKey: keyMap[model].userOpenAiKey,
systemAuthKey: ''
};
@@ -195,7 +218,6 @@ export const getApiKey = async ({
}
return {
user,
userOpenAiKey: '',
systemAuthKey: keyMap[model].systemAuthKey
};
@@ -240,7 +262,7 @@ export const authModel = async ({
return {
model,
showModelDetail: model.share.isShareDetail || userId === String(model.userId)
showModelDetail: userId === String(model.userId)
};
};
@@ -277,7 +299,7 @@ export const authChat = async ({
});
// chat history
let content: ChatItemSimpleType[] = [];
let content: ChatItemType[] = [];
if (chatId) {
// fetch chat data
@@ -336,28 +358,9 @@ export const authShareChat = async ({
});
}
const modelId = String(shareChat.modelId);
const userId = String(shareChat.userId);
// 获取 model 数据
const { model, showModelDetail } = await authModel({
modelId,
userId,
authOwner: false,
reserveDetail: true
});
// 获取 user 的 apiKey
const { userOpenAiKey, systemAuthKey } = await getApiKey({
model: model.chat.chatModel,
userId
});
return {
userOpenAiKey,
systemAuthKey,
userId,
model,
showModelDetail
userId: String(shareChat.userId),
appId: String(shareChat.modelId),
authType: 'token' as AuthType
};
};
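The `parseAuthorization` helper added above splits a `Bearer fastgpt-xxxx-appId` header into an api key (the first two dash-separated segments rejoined) and an app id (the last segment). A standalone sketch of just that parsing step, without the async key lookup (the function name is illustrative):

```typescript
// Mirror of the header-splitting logic in parseAuthorization:
// "Bearer <prefix>-<secret>-<appId>" -> { apiKey: "<prefix>-<secret>", appId }
// Returns null where the real code rejects with unAuthorization.
function splitBearerKey(authorization?: string): { apiKey: string; appId: string } | null {
  if (!authorization) return null;
  const auth = authorization.split(' ')[1]; // drop the "Bearer " prefix
  if (!auth) return null;
  const arr = auth.split('-');
  if (arr.length !== 3) return null;
  return { apiKey: `${arr[0]}-${arr[1]}`, appId: arr[2] };
}
```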


@@ -1,35 +1,38 @@
import { ChatItemSimpleType } from '@/types/chat';
import { ChatItemType } from '@/types/chat';
import { modelToolMap } from '@/utils/plugin';
import type { ChatModelType } from '@/constants/model';
import { ChatRoleEnum } from '@/constants/chat';
import { OpenAiChatEnum, ClaudeEnum } from '@/constants/model';
import { ChatRoleEnum, sseResponseEventEnum } from '@/constants/chat';
import { sseResponse } from '../tools';
import { OpenAiChatEnum } from '@/constants/model';
import { chatResponse, openAiStreamResponse } from './openai';
import { claudChat, claudStreamResponse } from './claude';
import type { NextApiResponse } from 'next';
import { textAdaptGptResponse } from '@/utils/adapt';
import { parseStreamChunk } from '@/utils/adapt';
export type ChatCompletionType = {
apiKey: string;
temperature: number;
messages: ChatItemSimpleType[];
maxToken?: number;
messages: ChatItemType[];
chatId?: string;
[key: string]: any;
};
export type ChatCompletionResponseType = {
streamResponse: any;
responseMessages: ChatItemSimpleType[];
responseMessages: ChatItemType[];
responseText: string;
totalTokens: number;
};
export type StreamResponseType = {
chatResponse: any;
prompts: ChatItemSimpleType[];
prompts: ChatItemType[];
res: NextApiResponse;
[key: string]: any;
};
export type StreamResponseReturnType = {
responseContent: string;
totalTokens: number;
finishMessages: ChatItemSimpleType[];
finishMessages: ChatItemType[];
};
export const modelServiceToolMap: Record<
@@ -74,15 +77,11 @@ export const modelServiceToolMap: Record<
model: OpenAiChatEnum.GPT432k,
...data
})
},
[ClaudeEnum.Claude]: {
chatCompletion: claudChat,
streamResponse: claudStreamResponse
}
};
/* delete invalid symbol */
const simplifyStr = (str: string) =>
const simplifyStr = (str = '') =>
str
.replace(/\n+/g, '\n') // collapse consecutive blank lines
.replace(/[^\S\r\n]+/g, ' ') // collapse consecutive whitespace
@@ -95,11 +94,11 @@ export const ChatContextFilter = ({
maxTokens
}: {
model: ChatModelType;
prompts: ChatItemSimpleType[];
prompts: ChatItemType[];
maxTokens: number;
}) => {
const systemPrompts: ChatItemSimpleType[] = [];
const chatPrompts: ChatItemSimpleType[] = [];
const systemPrompts: ChatItemType[] = [];
const chatPrompts: ChatItemType[] = [];
let rawTextLen = 0;
prompts.forEach((item) => {
@@ -107,6 +106,7 @@ export const ChatContextFilter = ({
rawTextLen += val.length;
const data = {
_id: item._id,
obj: item.obj,
value: val
};
@@ -129,7 +129,7 @@ export const ChatContextFilter = ({
});
// truncate content by token count
const chats: ChatItemSimpleType[] = [];
const chats: ChatItemType[] = [];
// take chat content from newest to oldest
for (let i = chatPrompts.length - 1; i >= 0; i--) {
@@ -174,3 +174,87 @@ export const resStreamResponse = async ({
return { responseContent, totalTokens, finishMessages };
};
/* stream response */
export const V2_StreamResponse = async ({
model,
res,
chatResponse,
prompts
}: StreamResponseType & {
model: ChatModelType;
}) => {
let responseContent = '';
let error: any = null;
const clientRes = async (data: string) => {
const { content = '' } = (() => {
try {
const json = JSON.parse(data);
const content: string = json?.choices?.[0].delta.content || '';
error = json.error;
responseContent += content;
return { content };
} catch (error) {
return {};
}
})();
if (res.closed || error) return;
if (data === '[DONE]') {
sseResponse({
res,
event: sseResponseEventEnum.answer,
data: textAdaptGptResponse({
text: null,
finish_reason: 'stop'
})
});
sseResponse({
res,
event: sseResponseEventEnum.answer,
data: '[DONE]'
});
} else {
sseResponse({
res,
event: sseResponseEventEnum.answer,
data: textAdaptGptResponse({
text: content
})
});
}
};
try {
for await (const chunk of chatResponse.data as any) {
if (res.closed) break;
const parse = parseStreamChunk(chunk);
parse.forEach((item) => clientRes(item.data));
}
} catch (error) {
console.log('pipe error', error);
}
if (error) {
console.log(error);
return Promise.reject(error);
}
// count tokens
const finishMessages = prompts.concat({
obj: ChatRoleEnum.AI,
value: responseContent
});
const totalTokens = modelToolMap[model].countTokens({
messages: finishMessages
});
return {
responseContent,
totalTokens,
finishMessages
};
};
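Both `V2_StreamResponse` above and the OpenAI stream handler pull the incremental text out of each SSE payload the same way: parse the JSON, read `choices[0].delta.content`, and tolerate non-JSON frames such as `[DONE]`. A hedged standalone sketch of that extraction (the function name is illustrative):

```typescript
// Extract the delta text from one SSE data payload of a streaming
// chat completion; non-JSON payloads (e.g. "[DONE]") yield ''.
function extractDelta(data: string): string {
  try {
    const json = JSON.parse(data);
    return json?.choices?.[0]?.delta?.content || '';
  } catch {
    return '';
  }
}
```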


@@ -1,45 +1,56 @@
import { Configuration, OpenAIApi } from 'openai';
import { createParser, ParsedEvent, ReconnectInterval } from 'eventsource-parser';
import { axiosConfig } from '../tools';
import { ChatModelMap, OpenAiChatEnum } from '@/constants/model';
import { adaptChatItem_openAI } from '@/utils/plugin/openai';
import { modelToolMap } from '@/utils/plugin';
import { ChatCompletionType, ChatContextFilter, StreamResponseType } from './index';
import { ChatRoleEnum } from '@/constants/chat';
import { parseStreamChunk } from '@/utils/adapt';
export const getOpenAIApi = () =>
new OpenAIApi(
export const getOpenAIApi = (apiKey: string) => {
const openaiBaseUrl = process.env.OPENAI_BASE_URL || 'https://api.openai.com/v1';
return new OpenAIApi(
new Configuration({
basePath: process.env.OPENAI_BASE_URL || 'https://api.openai.com/v1'
basePath: apiKey === process.env.ONEAPI_KEY ? process.env.ONEAPI_URL : openaiBaseUrl
})
);
};
/* model chat completion */
export const chatResponse = async ({
model,
apiKey,
temperature,
maxToken = 4000,
messages,
stream
}: ChatCompletionType & { model: `${OpenAiChatEnum}` }) => {
const modelTokenLimit = ChatModelMap[model]?.contextMaxToken || 4000;
const filterMessages = ChatContextFilter({
model,
prompts: messages,
maxTokens: Math.ceil(ChatModelMap[model].contextMaxToken * 0.85)
maxTokens: Math.ceil(modelTokenLimit - 300) // filter token. not response maxToken
});
const adaptMessages = adaptChatItem_openAI({ messages: filterMessages });
const chatAPI = getOpenAIApi();
const adaptMessages = adaptChatItem_openAI({ messages: filterMessages, reserveId: false });
const chatAPI = getOpenAIApi(apiKey);
const promptsToken = modelToolMap[model].countTokens({
messages: filterMessages
});
maxToken = maxToken + promptsToken > modelTokenLimit ? modelTokenLimit - promptsToken : maxToken;
const response = await chatAPI.createChatCompletion(
{
model,
temperature: Number(temperature) || 0,
temperature: Number(temperature || 0),
max_tokens: maxToken,
messages: adaptMessages,
frequency_penalty: 0.5, // higher values reduce repeated content
presence_penalty: -0.5, // higher values make new content more likely
stream,
stop: ['.!?。']
stream
// stop: ['.!?。']
},
{
timeout: stream ? 60000 : 480000,
@@ -48,7 +59,7 @@ export const chatResponse = async ({
}
);
const responseText = stream ? '' : response.data.choices[0].message?.content || '';
const responseText = stream ? '' : response.data.choices?.[0].message?.content || '';
const totalTokens = stream ? 0 : response.data.usage?.total_tokens || 0;
return {
@@ -71,29 +82,29 @@ export const openAiStreamResponse = async ({
try {
let responseContent = '';
const onParse = async (event: ParsedEvent | ReconnectInterval) => {
if (event.type !== 'event') return;
const data = event.data;
if (data === '[DONE]') return;
try {
const json = JSON.parse(data);
const content: string = json?.choices?.[0].delta.content || '';
responseContent += content;
const clientRes = async (data: string) => {
const { content = '' } = (() => {
try {
const json = JSON.parse(data);
const content: string = json?.choices?.[0].delta.content || '';
responseContent += content;
return { content };
} catch (error) {
return {};
}
})();
!res.closed && content && res.write(content);
} catch (error) {
error;
}
if (data === '[DONE]') return;
!res.closed && content && res.write(content);
};
try {
const decoder = new TextDecoder();
const parser = createParser(onParse);
for await (const chunk of chatResponse.data as any) {
if (res.closed) {
break;
}
parser.feed(decoder.decode(chunk, { stream: true }));
if (res.closed) break;
const parse = parseStreamChunk(chunk);
parse.forEach((item) => clientRes(item.data));
}
} catch (error) {
console.log('pipe error', error);


@@ -4,6 +4,7 @@ import crypto from 'crypto';
import jwt from 'jsonwebtoken';
import { generateQA } from '../events/generateQA';
import { generateVector } from '../events/generateVector';
import { sseResponseEventEnum } from '@/constants/chat';
/* password hashing */
export const hashPassword = (psw: string) => {
@@ -33,14 +34,18 @@ export const clearCookie = (res: NextApiResponse) => {
};
/* openai axios config */
export const axiosConfig = (apikey: string) => ({
baseURL: process.env.OPENAI_BASE_URL || 'https://api.openai.com/v1',
httpsAgent: global.httpsAgent,
headers: {
Authorization: `Bearer ${apikey}`,
auth: process.env.OPENAI_BASE_URL_AUTH || ''
}
});
export const axiosConfig = (apikey: string) => {
const openaiBaseUrl = process.env.OPENAI_BASE_URL || 'https://api.openai.com/v1';
return {
baseURL: apikey === process.env.ONEAPI_KEY ? process.env.ONEAPI_URL : openaiBaseUrl, // only takes effect for requests made outside the npm module
httpsAgent: global.httpsAgent,
headers: {
Authorization: `Bearer ${apikey}`,
auth: process.env.OPENAI_BASE_URL_AUTH || ''
}
};
};
export function withNextCors(handler: NextApiHandler): NextApiHandler {
return async function nextApiHandlerWrappedWithNextCors(
@@ -67,3 +72,16 @@ export const startQueue = () => {
generateVector();
}
};
export const sseResponse = ({
res,
event,
data
}: {
res: NextApiResponse;
event?: `${sseResponseEventEnum}`;
data: string;
}) => {
event && res.write(`event: ${event}\n`);
res.write(`data: ${data}\n\n`);
};
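The `sseResponse` helper above writes standard Server-Sent Events frames: an optional `event:` line, a `data:` line, and a blank line terminating the frame. A sketch that renders the same frame to a string instead of writing to an HTTP response (the function name is illustrative):

```typescript
// String-building analogue of sseResponse: one SSE frame is
// "event: <name>\n" (optional) + "data: <payload>\n\n".
function formatSseFrame(data: string, event?: string): string {
  let frame = '';
  if (event) frame += `event: ${event}\n`;
  frame += `data: ${data}\n\n`;
  return frame;
}
```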


@@ -44,7 +44,7 @@ type State = {
delShareChatHistory: (shareId?: string) => void;
};
const defaultChatData = {
const defaultChatData: ChatType = {
chatId: 'chatId',
modelId: 'modelId',
model: {


@@ -19,7 +19,8 @@ export const useGlobalStore = create<State>()(
immer((set, get) => ({
initData: {
beianText: '',
googleVerKey: ''
googleVerKey: '',
baiduTongji: false
},
async loadInitData() {
try {


@@ -4,16 +4,14 @@ import { QuoteItemType } from '@/pages/api/openapi/kb/appKbSearch';
export type ExportChatType = 'md' | 'pdf' | 'html';
export type ChatItemSimpleType = {
export type ChatItemType = {
_id?: string;
obj: `${ChatRoleEnum}`;
value: string;
quoteLen?: number;
quote?: QuoteItemType[];
systemPrompt?: string;
};
export type ChatItemType = {
_id: string;
} & ChatItemSimpleType;
export type ChatSiteItemType = {
status: 'loading' | 'finish';


@@ -23,9 +23,6 @@ declare global {
var vectorQueueLen: number;
var OpenAiEncMap: Record<string, Tiktoken>;
var systemEnv: {
openAIKeys: string;
openAITrainingKeys: string;
gpt4Key: string;
vectorMaxProcess: number;
qaMaxProcess: number;
pgIvfflatProbe: number;


@@ -5,7 +5,7 @@ export type ModelListItemType = {
_id: string;
name: string;
avatar: string;
systemPrompt: string;
intro: string;
};
export interface ModelUpdateParams {


@@ -43,7 +43,9 @@ export interface ModelSchema {
searchLimit: number;
searchEmptyText: string;
systemPrompt: string;
limitPrompt: string;
temperature: number;
maxToken: number;
chatModel: ChatModelType; // model used for chat; after training, this is the trained model
};
share: {
@@ -68,6 +70,7 @@ export interface TrainingDataSchema {
_id: string;
userId: string;
kbId: string;
expireAt: Date;
lockTime: Date;
mode: `${TrainingModeEnum}`;
prompt: string;


@@ -2,6 +2,10 @@ import { formatPrice } from './user';
import dayjs from 'dayjs';
import type { BillSchema } from '../types/mongoSchema';
import type { UserBillType } from '@/types/user';
import { ChatItemType } from '@/types/chat';
import { ChatCompletionRequestMessageRoleEnum } from 'openai';
import { ChatRoleEnum } from '@/constants/chat';
import type { MessageItemType } from '@/pages/api/openapi/v1/chat/completions';
export const adaptBill = (bill: BillSchema): UserBillType => {
return {
@@ -14,3 +18,60 @@ export const adaptBill = (bill: BillSchema): UserBillType => {
price: formatPrice(bill.price)
};
};
export const gptMessage2ChatType = (messages: MessageItemType[]): ChatItemType[] => {
const roleMap: Record<`${ChatCompletionRequestMessageRoleEnum}`, `${ChatRoleEnum}`> = {
[ChatCompletionRequestMessageRoleEnum.Assistant]: ChatRoleEnum.AI,
[ChatCompletionRequestMessageRoleEnum.User]: ChatRoleEnum.Human,
[ChatCompletionRequestMessageRoleEnum.System]: ChatRoleEnum.System,
[ChatCompletionRequestMessageRoleEnum.Function]: ChatRoleEnum.Human
};
return messages.map((item) => ({
_id: item._id,
obj: roleMap[item.role],
value: item.content || ''
}));
};
export const textAdaptGptResponse = ({
text,
model,
finish_reason = null,
extraData = {}
}: {
model?: string;
text: string | null;
finish_reason?: null | 'stop';
extraData?: Object;
}) => {
return JSON.stringify({
...extraData,
id: '',
object: '',
created: 0,
model,
choices: [{ delta: text === null ? {} : { content: text }, index: 0, finish_reason }]
});
};
const decoder = new TextDecoder();
export const parseStreamChunk = (value: BufferSource) => {
const chunk = decoder.decode(value);
const chunkLines = chunk.split('\n\n').filter((item) => item);
const chunkResponse = chunkLines.map((item) => {
const splitEvent = item.split('\n');
if (splitEvent.length === 2) {
return {
event: splitEvent[0].replace('event: ', ''),
data: splitEvent[1].replace('data: ', '')
};
}
return {
event: '',
data: splitEvent[0].replace('data: ', '')
};
});
return chunkResponse;
};
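A quick sketch of how `parseStreamChunk` (copied from above so the snippet runs on its own) handles a typical SSE buffer: events are separated by blank lines, and each event is either a bare `data:` line or an `event:` line followed by a `data:` line.

```typescript
// Self-contained copy of parseStreamChunk from above.
const decoder = new TextDecoder();

const parseStreamChunk = (value: BufferSource) => {
  const chunk = decoder.decode(value);
  const chunkLines = chunk.split('\n\n').filter((item) => item);
  return chunkLines.map((item) => {
    const splitEvent = item.split('\n');
    if (splitEvent.length === 2) {
      return {
        event: splitEvent[0].replace('event: ', ''),
        data: splitEvent[1].replace('data: ', '')
      };
    }
    return { event: '', data: splitEvent[0].replace('data: ', '') };
  });
};

// A buffer with one named event and one bare data frame:
const buffer = new TextEncoder().encode(
  'event: answer\ndata: {"text":"hi"}\n\ndata: [DONE]\n\n'
);
console.log(parseStreamChunk(buffer));
// → [ { event: 'answer', data: '{"text":"hi"}' }, { event: '', data: '[DONE]' } ]
```

Note the simple framing assumption: a chunk boundary never falls inside an event, which holds when each network read delivers whole `\n\n`-terminated events.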


@@ -148,15 +148,9 @@ export const fileDownload = ({
* slideLen - the size of the overlapping text kept before and after each chunk
* maxLen > slideLen
*/
export const splitText_token = ({
text,
maxLen,
slideLen
}: {
text: string;
maxLen: number;
slideLen: number;
}) => {
export const splitText_token = ({ text, maxLen }: { text: string; maxLen: number }) => {
const slideLen = Math.floor(maxLen * 0.3);
try {
const enc = getOpenAiEncMap()[OpenAiChatEnum.GPT35];
// filter empty text. encode sentence


@@ -1,18 +1,15 @@
import { ClaudeEnum, OpenAiChatEnum } from '@/constants/model';
import { OpenAiChatEnum } from '@/constants/model';
import type { ChatModelType } from '@/constants/model';
import type { ChatItemSimpleType } from '@/types/chat';
import type { ChatItemType } from '@/types/chat';
import { countOpenAIToken, openAiSliceTextByToken } from './openai';
import { gpt_chatItemTokenSlice } from '@/pages/api/openapi/text/gptMessagesSlice';
export const modelToolMap: Record<
ChatModelType,
{
countTokens: (data: { messages: ChatItemSimpleType[] }) => number;
countTokens: (data: { messages: ChatItemType[] }) => number;
sliceText: (data: { text: string; length: number }) => string;
tokenSlice: (data: {
messages: ChatItemSimpleType[];
maxToken: number;
}) => ChatItemSimpleType[];
tokenSlice: (data: { messages: ChatItemType[]; maxToken: number }) => ChatItemType[];
}
> = {
[OpenAiChatEnum.GPT35]: {
@@ -34,10 +31,5 @@ export const modelToolMap: Record<
countTokens: ({ messages }) => countOpenAIToken({ model: OpenAiChatEnum.GPT432k, messages }),
sliceText: (data) => openAiSliceTextByToken({ model: OpenAiChatEnum.GPT432k, ...data }),
tokenSlice: (data) => gpt_chatItemTokenSlice({ model: OpenAiChatEnum.GPT432k, ...data })
},
[ClaudeEnum.Claude]: {
countTokens: ({ messages }) => countOpenAIToken({ model: OpenAiChatEnum.GPT35, messages }),
sliceText: (data) => openAiSliceTextByToken({ model: OpenAiChatEnum.GPT35, ...data }),
tokenSlice: (data) => gpt_chatItemTokenSlice({ model: OpenAiChatEnum.GPT35, ...data })
}
};


@@ -1,67 +1,20 @@
import { encoding_for_model, type Tiktoken } from '@dqbd/tiktoken';
import type { ChatItemSimpleType } from '@/types/chat';
import { encoding_for_model } from '@dqbd/tiktoken';
import type { ChatItemType } from '@/types/chat';
import { ChatRoleEnum } from '@/constants/chat';
import { ChatCompletionRequestMessage, ChatCompletionRequestMessageRoleEnum } from 'openai';
import { ChatCompletionRequestMessageRoleEnum } from 'openai';
import { OpenAiChatEnum } from '@/constants/model';
import Graphemer from 'graphemer';
import axios from 'axios';
import dayjs from 'dayjs';
const textDecoder = new TextDecoder();
const graphemer = new Graphemer();
import type { MessageItemType } from '@/pages/api/openapi/v1/chat/completions';
export const getOpenAiEncMap = () => {
if (typeof window !== 'undefined') {
window.OpenAiEncMap = window.OpenAiEncMap || {
[OpenAiChatEnum.GPT35]: encoding_for_model('gpt-3.5-turbo', {
'<|im_start|>': 100264,
'<|im_end|>': 100265,
'<|im_sep|>': 100266
}),
[OpenAiChatEnum.GPT3516k]: encoding_for_model('gpt-3.5-turbo', {
'<|im_start|>': 100264,
'<|im_end|>': 100265,
'<|im_sep|>': 100266
}),
[OpenAiChatEnum.GPT4]: encoding_for_model('gpt-4', {
'<|im_start|>': 100264,
'<|im_end|>': 100265,
'<|im_sep|>': 100266
}),
[OpenAiChatEnum.GPT432k]: encoding_for_model('gpt-4-32k', {
'<|im_start|>': 100264,
'<|im_end|>': 100265,
'<|im_sep|>': 100266
})
};
if (typeof window !== 'undefined' && window.OpenAiEncMap) {
return window.OpenAiEncMap;
}
if (typeof global !== 'undefined') {
global.OpenAiEncMap = global.OpenAiEncMap || {
[OpenAiChatEnum.GPT35]: encoding_for_model('gpt-3.5-turbo', {
'<|im_start|>': 100264,
'<|im_end|>': 100265,
'<|im_sep|>': 100266
}),
[OpenAiChatEnum.GPT3516k]: encoding_for_model('gpt-3.5-turbo', {
'<|im_start|>': 100264,
'<|im_end|>': 100265,
'<|im_sep|>': 100266
}),
[OpenAiChatEnum.GPT4]: encoding_for_model('gpt-4', {
'<|im_start|>': 100264,
'<|im_end|>': 100265,
'<|im_sep|>': 100266
}),
[OpenAiChatEnum.GPT432k]: encoding_for_model('gpt-4-32k', {
'<|im_start|>': 100264,
'<|im_end|>': 100265,
'<|im_sep|>': 100266
})
};
if (typeof global !== 'undefined' && global.OpenAiEncMap) {
return global.OpenAiEncMap;
}
return {
const enc = {
[OpenAiChatEnum.GPT35]: encoding_for_model('gpt-3.5-turbo', {
'<|im_start|>': 100264,
'<|im_end|>': 100265,
@@ -83,19 +36,31 @@ export const getOpenAiEncMap = () => {
'<|im_sep|>': 100266
})
};
if (typeof window !== 'undefined') {
window.OpenAiEncMap = enc;
}
if (typeof global !== 'undefined') {
global.OpenAiEncMap = enc;
}
return enc;
};
export const adaptChatItem_openAI = ({
messages
messages,
reserveId
}: {
messages: ChatItemSimpleType[];
}): ChatCompletionRequestMessage[] => {
messages: ChatItemType[];
reserveId: boolean;
}): MessageItemType[] => {
const map = {
[ChatRoleEnum.AI]: ChatCompletionRequestMessageRoleEnum.Assistant,
[ChatRoleEnum.Human]: ChatCompletionRequestMessageRoleEnum.User,
[ChatRoleEnum.System]: ChatCompletionRequestMessageRoleEnum.System
};
return messages.map((item) => ({
...(reserveId && { _id: item._id }),
role: map[item.obj] || ChatCompletionRequestMessageRoleEnum.System,
content: item.value || ''
}));
@@ -105,62 +70,21 @@ export function countOpenAIToken({
messages,
model
}: {
messages: ChatItemSimpleType[];
messages: ChatItemType[];
model: `${OpenAiChatEnum}`;
}) {
function getChatGPTEncodingText(
messages: {
role: 'system' | 'user' | 'assistant';
content: string;
name?: string;
}[],
model: `${OpenAiChatEnum}`
) {
const isGpt3 = model.startsWith('gpt-3.5-turbo');
const diffVal = model.startsWith('gpt-3.5-turbo') ? 3 : 2;
const msgSep = isGpt3 ? '\n' : '';
const roleSep = isGpt3 ? '\n' : '<|im_sep|>';
const adaptMessages = adaptChatItem_openAI({ messages, reserveId: true });
const token = adaptMessages.reduce((sum, item) => {
const text = `${item.role}\n${item.content}`;
const enc = getOpenAiEncMap()[model];
const encodeText = enc.encode(text);
const tokens = encodeText.length + diffVal;
return sum + tokens;
}, 0);
return [
messages
.map(({ name = '', role, content }) => {
return `<|im_start|>${name || role}${roleSep}${content}<|im_end|>`;
})
.join(msgSep),
`<|im_start|>assistant${roleSep}`
].join(msgSep);
}
function text2TokensLen(encoder: Tiktoken, inputText: string) {
const encoding = encoder.encode(inputText, 'all');
const segments: { text: string; tokens: { id: number; idx: number }[] }[] = [];
let byteAcc: number[] = [];
let tokenAcc: { id: number; idx: number }[] = [];
let inputGraphemes = graphemer.splitGraphemes(inputText);
for (let idx = 0; idx < encoding.length; idx++) {
const token = encoding[idx]!;
byteAcc.push(...encoder.decode_single_token_bytes(token));
tokenAcc.push({ id: token, idx });
const segmentText = textDecoder.decode(new Uint8Array(byteAcc));
const graphemes = graphemer.splitGraphemes(segmentText);
if (graphemes.every((item, idx) => inputGraphemes[idx] === item)) {
segments.push({ text: segmentText, tokens: tokenAcc });
byteAcc = [];
tokenAcc = [];
inputGraphemes = inputGraphemes.slice(graphemes.length);
}
}
return segments.reduce((memo, i) => memo + i.tokens.length, 0) ?? 0;
}
const adaptMessages = adaptChatItem_openAI({ messages });
return text2TokensLen(getOpenAiEncMap()[model], getChatGPTEncodingText(adaptMessages, model));
return token;
}
export const openAiSliceTextByToken = ({
@@ -187,16 +111,11 @@ export const authOpenAiKey = async (key: string) => {
})
.then((res) => {
if (!res.data.access_until) {
return Promise.reject('Invalid OpenAI key, please retry or use another key');
}
const keyExpiredTime = dayjs(res.data.access_until * 1000);
const currentTime = dayjs();
if (keyExpiredTime.isBefore(currentTime)) {
return Promise.reject('OpenAI key has expired');
return Promise.resolve('OpenAI key may be invalid');
}
})
.catch((err) => {
console.log(err);
return Promise.reject(err?.response?.data?.error || 'Invalid OpenAI account, please retry or use another key');
return Promise.reject(err?.response?.data?.error?.message || 'OpenAI key may be invalid');
});
};
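The expiry comparison above relies on dayjs; a minimal dependency-free sketch of the same check (`isKeyExpired` is an illustrative name, not part of the codebase) looks like:

```typescript
// access_until from the billing response is an epoch timestamp in seconds,
// so it must be scaled by 1000 before comparing against the current time.
// isKeyExpired is a hypothetical helper, not taken from the codebase.
const isKeyExpired = (accessUntilSec: number, now: Date = new Date()): boolean =>
  new Date(accessUntilSec * 1000).getTime() < now.getTime();

console.log(isKeyExpired(1, new Date('2020-01-01')));          // true: an epoch of 1s is long past
console.log(isKeyExpired(4102444800, new Date('2020-01-01'))); // false: a year-2100 timestamp
```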


@@ -115,14 +115,8 @@ export const voiceBroadcast = ({ text }: { text: string }) => {
};
};
export const formatLinkText = (text: string) => {
const httpReg =
/(http|https|ftp):\/\/[\w\-_]+(\.[\w\-_]+)+([\w\-\.,@?^=%&amp;:/~\+#]*[\w\-\@?^=%&amp;/~\+#])?/gi;
return text.replace(httpReg, ` $& `);
};
export const getErrText = (err: any, def = '') => {
const msg = typeof err === 'string' ? err : err?.message || def || '';
const msg: string = typeof err === 'string' ? err : err?.message || def || '';
msg && console.log('error =>', msg);
return msg;
};


@@ -10,6 +10,10 @@
4. [Cloudflare option](./proxy/cloudflare.md) - requires your own domain (100k free proxy requests per day)
5. [Tencent Cloud Function proxy option](https://github.com/easychen/openai-api-proxy/blob/master/FUNC.md) - only needs one server
## OpenAI key pool management
The [one-api](https://github.com/songquanpeng/one-api) project is recommended for managing key pools; it is compatible with multiple channels such as OpenAI and Microsoft. For deployment, see that project's README.md, or [Deploy one-api on Sealos in 1 minute](./one-api/sealos.md).
### 1. Prepare a few things
> 1. Open port 80 on the server. If you use a proxy, the corresponding proxy port also needs to be open.


@@ -1,12 +1,15 @@
# non-host version; does not use the local machine's proxy
version: '3.3'
services:
pg:
image: ankane/pgvector:v0.4.2 # dockerhub
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/pgvector:v0.4.2 # Alibaba Cloud
# image: ankane/pgvector:v0.4.2 # dockerhub
image: registry.cn-hangzhou.aliyuncs.com/fastgpt/pgvector:v0.4.2 # Alibaba Cloud
container_name: pg
restart: always
ports:
ports: # not recommended to expose in production
- 8100:5432
networks:
- fastgpt
environment:
# These settings only take effect on the first run. Changing them and restarting the container has no effect; delete the persisted data and restart for changes to apply
- POSTGRES_USER=fastgpt
@@ -14,33 +17,40 @@ services:
- POSTGRES_DB=fastgpt
volumes:
# the file created earlier
- /root/fastgpt/pg/init.sql:/docker-entrypoint-initdb.d/init.sh
- /root/fastgpt/pg/data:/var/lib/postgresql/data
- /etc/localtime:/etc/localtime:ro
mongodb:
image: mongo:5.0.18
# image : registry.cn-hangzhou.aliyuncs.com/fastgpt/mongo:5.0.18 # Alibaba Cloud
- ./pg/init.sql:/docker-entrypoint-initdb.d/init.sh
- ./pg/data:/var/lib/postgresql/data
mongo:
# image: mongo:5.0.18
image: registry.cn-hangzhou.aliyuncs.com/fastgpt/mongo:5.0.18 # Alibaba Cloud
container_name: mongo
restart: always
ports:
ports: # not recommended to expose in production
- 27017:27017
networks:
- fastgpt
environment:
# These settings only take effect on the first run. Changing them and restarting the container has no effect; delete the persisted data and restart for changes to apply
- MONGO_INITDB_ROOT_USERNAME=username
- MONGO_INITDB_ROOT_PASSWORD=password
volumes:
- /root/fastgpt/mongo/data:/data/db
- /root/fastgpt/mongo/logs:/var/log/mongodb
- /etc/localtime:/etc/localtime:ro
- ./mongo/data:/data/db
- ./mongo/logs:/var/log/mongodb
fastgpt:
image: ghcr.io/c121914yu/fastgpt:latest # github
# image: c121914yu/fast-gpt:latest # docker hub
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt:latest # Alibaba Cloud
network_mode: host
restart: always
container_name: fastgpt
# image: c121914yu/fast-gpt:latest # docker hub
image: registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt:latest # Alibaba Cloud
# network_mode: host #
ports:
- 3000:3000
networks:
- fastgpt
depends_on:
- mongo
- pg
restart: always
environment: # optional variables; remove any you don't need
- PORT=3000 # the port to run on; if it is not 3000, change this to the actual port.
- DB_MAX_LINK=5 # max database connections
# proxy (optional)
- AXIOS_PROXY_HOST=127.0.0.1
- AXIOS_PROXY_PORT=7890
@@ -56,35 +66,191 @@ services:
- CLIENT_GOOGLE_VER_TOKEN=xxx
- SERVICE_GOOGLE_VER_TOKEN=xx
# token encryption secret; any value works, used as the login credential
- TOKEN_KEY=xxxx
- TOKEN_KEY=any
# root key, highest privileges; lets internal APIs call each other
- ROOT_KEY=xxx
- ROOT_KEY=root_key
# must match the username/password of the mongo image above
- MONGODB_URI=mongodb://username:password@0.0.0.0:27017/?authSource=admin
- MONGODB_URI=mongodb://username:password@mongo:27017/?authSource=admin
- MONGODB_NAME=fastgpt
- PG_HOST=0.0.0.0
- PG_PORT=8100
- PG_HOST=pg
- PG_PORT=5432
# must match the PG image above.
- PG_USER=fastgpt
- PG_PASSWORD=1234
- PG_DB_NAME=fastgpt
# openai
# keys for chat; separate multiple keys with commas
- OPENAIKEY=sk-xxxxx,sk-xxx
# keys for training
- OPENAI_TRAINING_KEY=sk-xxx,sk-xxxx
- GPT4KEY=sk-xxx
# one-api settings; one-api is the recommended way to manage keys
- ONEAPI_URL=https://kfcwurtbijvh.cloud.sealos.io/v1
- ONEAPI_KEY=sk-itJ9v8qthRiFDzfs62Ea21Aa9b004c8791937dCf4cC568Ff
# OpenAI settings: with one-api in use, only OPENAI_BASE_URL below is needed (outside China, all of this can be skipped)
- OPENAIKEY=sk-xxxxx
- OPENAI_BASE_URL=https://api.openai.com/v1
- OPENAI_BASE_URL_AUTH=optional security credential
nginx:
image: nginx:alpine3.17
- OPENAI_BASE_URL_AUTH=optional security credential, sent in the header.auth field
fastgpt-admin:
image: registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt-admin:latest
container_name: fastgpt-admin
restart: always
ports:
- 3001:3001
networks:
- fastgpt
depends_on:
- mongo
- fastgpt
environment:
- MONGODB_URI=mongodb://username:password@mongo:27017/?authSource=admin
- MONGODB_NAME=fastgpt
- ADMIN_USER=username
- ADMIN_PASS=password
- ADMIN_SECRET=any
- PARENT_URL=http://fastgpt:3000
- PARENT_ROOT_KEY=root_key
keyadmin:
container_name: keyadmin
image: justsong/one-api
restart: always
ports:
- 3002:3000
environment:
- TZ=Asia/Shanghai
volumes:
- /keyadmin:/data
nginx: # no extra nginx is needed if you use the BT (BaoTa) panel
image: registry.cn-hangzhou.aliyuncs.com/fastgpt/nginx:alpine3.17
# image: nginx:alpine3.17
container_name: nginx
restart: always
network_mode: host
volumes:
# the file created earlier
- /root/fastgpt/nginx/nginx.conf:/etc/nginx/nginx.conf:ro
- /root/fastgpt/nginx/logs:/var/log/nginx
- ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
- ./nginx/logs:/var/log/nginx
# HTTPS certificates; if you have none, leave these out and adjust nginx.conf accordingly
- /root/fastgpt/nginx/ssl/docgpt.key:/ssl/docgpt.key
- /root/fastgpt/nginx/ssl/docgpt.pem:/ssl/docgpt.pem
- ./nginx/ssl/docgpt.key:/ssl/docgpt.key
- ./nginx/ssl/docgpt.pem:/ssl/docgpt.pem
networks:
fastgpt:
# host-network version, not recommended; prefer the setup above and relay through a BASE_URL
# version: '3.3'
# services:
# pg:
# # image: ankane/pgvector:v0.4.2 # dockerhub
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/pgvector:v0.4.2 # Alibaba Cloud
# container_name: pg
# restart: always
# ports: # not recommended to expose in production
# - 8100:5432
# networks:
# - fastgpt
# environment:
# # These settings only take effect on the first run. Changing them and restarting the container has no effect; delete the persisted data and restart for changes to apply
# - POSTGRES_USER=fastgpt
# - POSTGRES_PASSWORD=1234
# - POSTGRES_DB=fastgpt
# volumes:
# # the file created earlier
# - ./pg/init.sql:/docker-entrypoint-initdb.d/init.sh
# - ./pg/data:/var/lib/postgresql/data
# mongo:
# # image: mongo:5.0.18
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/mongo:5.0.18 # Alibaba Cloud
# container_name: mongo
# restart: always
# ports: # not recommended to expose in production
# - 27017:27017
# networks:
# - fastgpt
# environment:
# # These settings only take effect on the first run. Changing them and restarting the container has no effect; delete the persisted data and restart for changes to apply
# - MONGO_INITDB_ROOT_USERNAME=username
# - MONGO_INITDB_ROOT_PASSWORD=password
# volumes:
# - ./mongo/data:/data/db
# - ./mongo/logs:/var/log/mongodb
# fastgpt:
# # image: ghcr.io/c121914yu/fastgpt:latest # github
# # image: c121914yu/fast-gpt:latest # docker hub
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt:latest # Alibaba Cloud
# network_mode: host
# restart: always
# container_name: fastgpt
# environment: # optional variables; remove any you don't need
# - PORT=3000 # the port to run on; if it is not 3000, change this to the actual port.
# - DB_MAX_LINK=15 # max database connections
# # proxy (optional)
# - AXIOS_PROXY_HOST=127.0.0.1
# - AXIOS_PROXY_PORT=7890
# # Email verification-code settings, using a QQ mailbox. See nodeMail for how to obtain MAILE_CODE; search online for details.
# - MY_MAIL=54545@qq.com
# - MAILE_CODE=1234
# # Alibaba Cloud SMS (at least one of email or SMS is required)
# - aliAccessKeyId=xxxx
# - aliAccessKeySecret=xxxx
# - aliSignName=xxxxx
# - aliTemplateCode=SMS_xxxx
# # Google v3 security verification (optional)
# - CLIENT_GOOGLE_VER_TOKEN=xxx
# - SERVICE_GOOGLE_VER_TOKEN=xx
# # token encryption secret; any value works, used as the login credential
# - TOKEN_KEY=xxxx
# # root key, highest privileges; lets internal APIs call each other
# - ROOT_KEY=xxx
# # must match the username/password of the mongo image above
# - MONGODB_URI=mongodb://username:password@0.0.0.0:27017/?authSource=admin
# - MONGODB_NAME=fastgpt
# - PG_HOST=0.0.0.0
# - PG_PORT=8100
# # must match the PG image above.
# - PG_USER=fastgpt
# - PG_PASSWORD=1234
# - PG_DB_NAME=fastgpt
# # one-api settings; one-api is the recommended way to manage keys
# - ONEAPI_URL=https://kfcwurtbijvh.cloud.sealos.io/v1
# - ONEAPI_KEY=sk-itJ9v8qthRiFDzfs62Ea21Aa9b004c8791937dCf4cC568Ff
# # OpenAI settings: with one-api in use, only OPENAI_BASE_URL below is needed (outside China, all of this can be skipped)
# - OPENAIKEY=sk-xxxxx
# - OPENAI_BASE_URL=https://api.openai.com/v1
# - OPENAI_BASE_URL_AUTH=optional security credential, sent in the header.auth field
# fastgpt-admin:
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/fastgpt-admin:latest
# container_name: fastgpt-admin
# restart: always
# ports:
# - 3001:3001
# networks:
# - fastgpt
# depends_on:
# - mongo
# - fastgpt
# environment:
# - MONGODB_URI=mongodb://username:password@mongo:27017/?authSource=admin
# - MONGODB_NAME=fastgpt
# - ADMIN_USER=username
# - ADMIN_PASS=password
# - ADMIN_SECRET=any
# - PARENT_URL=http://fastgpt:3000
# - PARENT_ROOT_KEY=root_key
# key-admin:
# container_name: key-admin
# image: justsong/one-api
# restart: always
# ports:
# - 3002:3000
# environment:
# - TZ=Asia/Shanghai
# volumes:
# - /home/ubuntu/data/one-api:/data
# nginx: # no extra nginx is needed if you use the BT (BaoTa) panel
# image: registry.cn-hangzhou.aliyuncs.com/fastgpt/nginx:alpine3.17
# # image: nginx:alpine3.17
# container_name: nginx
# restart: always
# network_mode: host
# volumes:
# # the file created earlier
# - ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
# - ./nginx/logs:/var/log/nginx
# # HTTPS certificates; if you have none, leave these out and adjust nginx.conf accordingly
# - ./nginx/ssl/docgpt.key:/ssl/docgpt.key
# - ./nginx/ssl/docgpt.pem:/ssl/docgpt.pem
# networks:
# fastgpt:


@@ -0,0 +1,36 @@
# Deploy one-api on Sealos in 1 minute
## 1. Go to the Sealos public cloud
https://cloud.sealos.io/
## 2. Open the AppLaunchpad (app management) tool
![step1](./sealosImg/step1.png)
## 3. Click "Create New App"
## 4. Fill in the parameters
Image: ghcr.io/songquanpeng/one-api:latest
![step2](./sealosImg/step2.png)
After turning on the external access switch, Sealos automatically assigns a reachable address; you do not need to configure one yourself.
![step3](./sealosImg/step3.png)
After filling in the parameters, click Deploy in the top-right corner.
## 5. Access
Click the external address provided by Sealos to reach the one-api project.
![step4](./sealosImg/step4.png)
![step5](./sealosImg/step5.png)
## 6. Replace FastGpt's environment variables
```
# The address below is the one Sealos provides; be sure to append /v1
OPENAI_BASE_URL=https://xxxx.cloud.sealos.io/v1
# The key below is issued by one-api
OPENAIKEY=sk-xxxxxx
```



@@ -1,31 +0,0 @@
# Run port. If the app does not run on port 3000, change this. Note: changing this variable alone does not switch the port; set it because the app already runs on a different port.
PORT=3000
# proxy
# AXIOS_PROXY_HOST=127.0.0.1
# AXIOS_PROXY_PORT=7890
# email
MY_MAIL=xxxx@qq.com
MAILE_CODE=xxxx
# Alibaba Cloud SMS
aliAccessKeyId=xxxx
aliAccessKeySecret=xxxx
aliSignName=xxxx
aliTemplateCode=xxxx
# token
TOKEN_KEY=dfdasfdas
# root key, highest privileges
ROOT_KEY=fdafasd
# openai
# OPENAI_BASE_URL=http://ai.openai.com/v1
# OPENAI_BASE_URL_AUTH=optional security credential, sent in the header.auth field
OPENAIKEY=sk-xxx
OPENAI_TRAINING_KEY=sk-xxx
GPT4KEY=sk-xxx
# db
MONGODB_URI=mongodb://username:password@0.0.0.0:27017/?authSource=admin
MONGODB_NAME=fastgpt
PG_HOST=0.0.0.0
PG_PORT=8100
PG_USER=root
PG_PASSWORD=psw
PG_DB_NAME=dbname


@@ -2,6 +2,9 @@
For first-time development, follow the [deployment tutorial](../deploy/docker.md) first; the databases must be deployed.
client: the FastGpt web project
admin: the admin console
## Environment variable configuration (may lag behind; the variables in docker-compose are authoritative)
Copy the .env.template file to create a .env.local environment variable file, then edit the contents of .env.local.



@@ -0,0 +1,22 @@
# Integrate third-party apps via OpenAPI
## 1. Get an API key
Be sure to copy it; once the dialog is closed, you will need to create a new one.
![imgs](./img1.png)
## 2. Combine the key
Combine the API key you just copied with the AppId into a new key, in the format `API key-AppId`, for example: `fastgpt-z51pkjqm9nrk03a1rx2funoy-642adec15f04d67d4613efdb`
## 3. Replace the third-party app's variables
OPENAI_API_BASE_URL: https://fastgpt.run/api/openapi (change to your own deployed domain)
OPENAI_API_KEY = the combined key
**[chatgpt next](https://github.com/Yidadaa/ChatGPT-Next-Web) example**
![imgs](./chatgptnext.png)
**[chatgpt web](https://github.com/Chanzhaoyu/chatgpt-web) example**
![imgs](./chatgptweb.png)
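The wiring above can also be sketched in code. This is a hypothetical client-side helper (`buildChatRequest` is not part of FastGpt, and the base URL is a placeholder) showing where the combined key and the `/api/openapi` base URL go, assuming the standard OpenAI chat-completions request shape:

```typescript
// Hypothetical helper: builds a fetch request against a FastGpt deployment's
// OpenAI-compatible endpoint. The base URL below is a placeholder for your own domain.
const OPENAI_API_BASE_URL = 'https://fastgpt.run/api/openapi';

const buildChatRequest = (combinedKey: string, content: string) => ({
  url: `${OPENAI_API_BASE_URL}/v1/chat/completions`,
  init: {
    method: 'POST',
    headers: {
      // The combined "API key-AppId" value goes in the standard Authorization header.
      Authorization: `Bearer ${combinedKey}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({ messages: [{ role: 'user', content }] })
  }
});

// fetch(req.url, req.init) would then send the chat request to your deployment.
const req = buildChatRequest('fastgpt-z51pkjqm9nrk03a1rx2funoy-642adec15f04d67d4613efdb', 'Hello');
```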