ITGD-4664 #1

Open · wants to merge 2 commits into base: main
28 changes: 28 additions & 0 deletions app/client/platforms/openai.ts
@@ -52,6 +52,9 @@ interface RequestPayload {
   frequency_penalty: number;
   top_p: number;
   max_tokens?: number;
+  stream_options?: {
+    include_usage?: boolean;
+  };
 }
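The new `stream_options.include_usage` field asks the OpenAI chat completions streaming API to append one final chunk that carries a `usage` object (token counts) and an empty `choices` array. A minimal sketch of consuming such a stream client-side, mirroring the accumulation logic this PR adds — the `consumeStream` helper and chunk types here are illustrative, not part of the PR:

```typescript
// Shape of the usage object OpenAI returns on the final chunk
// when stream_options.include_usage is enabled.
interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

interface StreamChunk {
  choices: Array<{ delta: { content?: string } }>;
  usage?: Usage | null;
}

// Collect text deltas and pick up the usage block from whichever
// chunk carries it (the final chunk, per the OpenAI streaming API).
function consumeStream(chunks: StreamChunk[]): { text: string; usage: Usage | null } {
  let text = "";
  let usage: Usage | null = null;
  for (const chunk of chunks) {
    for (const choice of chunk.choices ?? []) {
      text += choice.delta?.content ?? "";
    }
    if (chunk.usage) usage = chunk.usage;
  }
  return { text, usage };
}
```

This is the same pattern as the `if (json.usage) nccLog.usage = json.usage;` line below: every chunk is checked, but only the final one actually populates the field.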

@@ -128,6 +131,17 @@ export class ChatGPTApi implements LLMApi {
       // Please do not ask me why not send max_tokens, no reason, this param is just shit, I don't want to explain anymore.
     };

+    // nccLog: the fields recorded here can be adjusted as needed
+    const nccLog = {
+      stream: requestPayload.stream,
+      model: requestPayload.model,
+      temperature: requestPayload.temperature,
+      presence_penalty: requestPayload.presence_penalty,
+      frequency_penalty: requestPayload.frequency_penalty,
+      top_p: requestPayload.top_p,
+      usage: null,
+    };
+
     // add max_tokens to vision model
     if (visionModel && modelConfig.model.includes("preview")) {
       requestPayload["max_tokens"] = Math.max(modelConfig.max_tokens, 4000);
@@ -155,6 +169,12 @@ export class ChatGPTApi implements LLMApi {
     );

     if (shouldStream) {
+      chatPayload.body = JSON.stringify({
+        ...requestPayload,
+        stream_options: {
+          include_usage: true,
+        },
+      });
       let responseText = "";
       let remainText = "";
       let finished = false;
@@ -242,6 +262,7 @@
             const text = msg.data;
             try {
              const json = JSON.parse(text);
+             if (json.usage) nccLog.usage = json.usage;
              const choices = json.choices as Array<{
                delta: { content: string };
              }>;
@@ -270,6 +291,13 @@
          },
          onclose() {
            finish();
+           // Every message ends up here after it is sent; the nccLog data can be emitted from here.
+           console.log("user message submitted", nccLog);
+           window._elog.push({
+             web: "104_next_chat",
+             track: ["chat"],
+             ext: nccLog,
+           });
          },
          onerror(e) {
            options.onError?.(e);
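The `onclose` handler pushes to `window._elog`, which is provided by the external e104.log script added in `app/layout.tsx` below. Since that script is loaded with `defer` from a third-party host, `_elog` may not exist yet (or may be blocked) when the first message completes. A defensive sketch of the push — the `pushElog` helper and its queue-fallback behavior are assumptions for illustration, not part of the 104 logging API:

```typescript
// Entry shape mirrors what the PR pushes to window._elog.
interface ElogEntry {
  web: string;
  track: string[];
  ext: Record<string, unknown>;
}

// Hypothetical guard: if the logger script has not attached _elog yet,
// fall back to a plain array so entries are queued rather than lost.
function pushElog(win: { _elog?: ElogEntry[] }, entry: ElogEntry): boolean {
  if (!Array.isArray(win._elog)) {
    win._elog = [];
  }
  win._elog.push(entry);
  return true;
}
```

Many analytics loaders adopt exactly this array-queue convention (push before load, replay after load), which is why pushing to a plain array is a safe fallback under that assumption.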
9 changes: 8 additions & 1 deletion app/layout.tsx
@@ -36,9 +36,16 @@ export default function RootLayout({
     <html lang="en">
       <head>
         <meta name="config" content={JSON.stringify(getClientConfig())} />
-        <meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no" />
+        <meta
+          name="viewport"
+          content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no"
+        />
         <link rel="manifest" href="/site.webmanifest"></link>
         <script src="/serviceWorkerRegister.js" defer></script>
+        <script
+          src="https://static.104.com.tw/104i/js/api/log/e104.log.latest.js"
+          defer
+        ></script>
       </head>
       <body>
         {children}