fix: adjust max_new_tokens to endpoint limit

Reduce max_new_tokens from 1024 to 512 so requests stay within the endpoint's limit and no longer fail with 422 errors.

Co-authored-by: awa <212803252+aguitauwu@users.noreply.github.com>
Author: v0
Date: 2026-02-11 23:09:49 +00:00
parent 71ef3bc46f
commit 032cada106


@@ -29,7 +29,7 @@ async function callYuukiAPI(
     },
     body: JSON.stringify({
       prompt,
-      max_new_tokens: 1024,
+      max_new_tokens: 512,
       temperature: 0.7,
       model,
     }),
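For context, the hunk only shows the request body. The sketch below illustrates what a fetch-based callYuukiAPI might look like after this change; the endpoint URL, auth header, error handling, and response field name are assumptions, and only the request-body fields come from the diff.

```ts
// Sketch only: the endpoint URL, auth header, and response shape are assumptions,
// not part of the committed diff.
const MAX_NEW_TOKENS = 512; // endpoint limit referenced by this commit

async function callYuukiAPI(
  prompt: string,
  model: string,
  apiKey: string,
): Promise<string> {
  const response = await fetch("https://yuuki.example.com/v1/generate", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      prompt,
      max_new_tokens: MAX_NEW_TOKENS, // stay at or below the limit to avoid 422 responses
      temperature: 0.7,
      model,
    }),
  });

  if (!response.ok) {
    // A 422 here typically means the payload broke a validation rule,
    // such as max_new_tokens exceeding the endpoint's maximum.
    throw new Error(`Yuuki API request failed with status ${response.status}`);
  }

  const data = await response.json();
  return data.generated_text ?? ""; // response field name is a guess
}
```

Keeping the limit in a named constant like MAX_NEW_TOKENS (a hypothetical name) would make a future change to the endpoint limit a one-line edit instead of another hardcoded value in the request body.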