Update openAI.ts #129

Open · wants to merge 1 commit into base: main
81 changes: 51 additions & 30 deletions src/utils/openAI.ts
@@ -1,39 +1,60 @@
 import { GoogleGenerativeAI } from '@fuyun/generative-ai'
 
 const apiKey = (import.meta.env.GEMINI_API_KEY)
 const apiBaseUrl = (import.meta.env.API_BASE_URL)?.trim().replace(/\/$/, '')
 
 const genAI = apiBaseUrl
   ? new GoogleGenerativeAI(apiKey, apiBaseUrl)
   : new GoogleGenerativeAI(apiKey)
 
 export const startChatAndSendMessageStream = async(history: ChatMessage[], newMessage: string) => {
   const model = genAI.getGenerativeModel({ model: 'gemini-pro' })
 
   const chat = model.startChat({
     history: history.map(msg => ({
       role: msg.role,
       parts: msg.parts.map(part => part.text).join(''), // Join parts into a single string
     })),
     generationConfig: {
-      maxOutputTokens: 8000,
+      temperature: 0.95,
+      top_p: 1,
+      top_k: 1,
+      maxOutputTokens: 2048,
     },
+    safetySettings: [
+      {
+        category: 'HARM_CATEGORY_HARASSMENT',
+        threshold: 'BLOCK_NONE',
+      },
+      {
+        category: 'HARM_CATEGORY_HATE_SPEECH',
+        threshold: 'BLOCK_NONE',
+      },
+      {
+        category: 'HARM_CATEGORY_SEXUALLY_EXPLICIT',
+        threshold: 'BLOCK_NONE',
+      },
+      {
+        category: 'HARM_CATEGORY_DANGEROUS_CONTENT',
+        threshold: 'BLOCK_NONE',
+      },
+    ],
   })
 
   // Use sendMessageStream for streaming responses
   const result = await chat.sendMessageStream(newMessage)
 
   const encodedStream = new ReadableStream({
     async start(controller) {
       const encoder = new TextEncoder()
       for await (const chunk of result.stream) {
         const text = await chunk.text()
         const encoded = encoder.encode(text)
         controller.enqueue(encoded)
       }
       controller.close()
     },
   })
 
   return encodedStream
 }
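The streaming plumbing is untouched by this PR: `startChatAndSendMessageStream` still wraps Gemini's chunked response in a `ReadableStream` of UTF-8 bytes. As a minimal sketch of what a caller on the other end of that stream does, the snippet below stubs the Gemini response with fixed text chunks; `makeEncodedStream` mirrors the encode/enqueue loop in `openAI.ts`, and `readAll` accumulates the decoded text the way a chat UI would. Both helpers are hypothetical, not part of the PR.

```typescript
// Hypothetical stand-in for the stream built inside
// startChatAndSendMessageStream: encode each text chunk and enqueue it.
function makeEncodedStream(chunks: string[]): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder()
  return new ReadableStream<Uint8Array>({
    start(controller) {
      for (const text of chunks)
        controller.enqueue(encoder.encode(text))
      controller.close()
    },
  })
}

// Drain the stream and decode it back into one string, as a consumer
// accumulating streamed tokens would.
async function readAll(stream: ReadableStream<Uint8Array>): Promise<string> {
  const decoder = new TextDecoder()
  const reader = stream.getReader()
  let out = ''
  for (;;) {
    const { done, value } = await reader.read()
    if (done) break
    out += decoder.decode(value, { stream: true })
  }
  return out + decoder.decode()
}

readAll(makeEncodedStream(['Hello, ', 'world!'])).then(text => console.log(text)) // prints "Hello, world!"
```

One thing worth checking before merge: the official Google JS SDK types `GenerationConfig` with camelCase keys (`topP`, `topK`), so if `@fuyun/generative-ai` follows the same shape, the snake_case `top_p`/`top_k` keys added here may be silently ignored. That is an assumption about the fork's typings, not something the diff itself confirms.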