Releases · zuisong/gemini-openai-proxy
v0.14.1
v0.14.0
New Features
- Add support for `gemini-*` model names. Now we can use `gemini-1.5-flash-8b-exp-0924`, `gemini-exp-1114`, `gemini-1.5-flash-8b`, etc. (a usage sketch follows this list).
- Add `/v1/embeddings` endpoint support.

```sh
curl https://gemini-openai-proxy.deno.dev/v1/embeddings \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $YOUR_GEMINI_API_KEY" \
  -d '{
    "input": "Your text string goes here",
    "model": "text-embedding-3-small"
  }'
```
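For reference, a minimal sketch of passing a `gemini-*` model name straight through the chat completions endpoint; it assumes a local deployment on port 8000, as in the examples further down, and the model name is illustrative.

```sh
# Minimal sketch: request a gemini-* model directly via /v1/chat/completions.
# Port and model name are illustrative assumptions.
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_GEMINI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-1.5-flash-8b",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```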
What's Changed
- build(deps): bump docker/build-push-action from 5 to 6 by @dependabot in #59
- build(deps): bump @hono/node-server from 1.11.2 to 1.11.4 by @dependabot in #60
- support /v1/embeddings endpoint by @zuisong in #68
- mapping gpt-4o-mini to gemini-1.5-flash-8b-exp-0827 by @zuisong in #74
- build(deps): bump @hono/node-server from 1.12.2 to 1.13.1 by @dependabot in #78
- build(deps-dev): bump esbuild from 0.23.1 to 0.24.0 by @dependabot in #77
- build(deps): bump denoland/setup-deno from 1 to 2 by @dependabot in #80
- run ci when open pull_request by @zuisong in #85
- fix Gemini "Unable to submit request because it has an empty text parameter" error by @zuisong in #86
Full Changelog: v0.13.0...v0.14.0
v0.13.0
v0.12.0
What's Changed
- Fix bug: error messages returned on failed requests could leak the API key by @greenjerry in #39
- Support model mapping by @zuisong in #41
- build(deps): bump @hono/node-server from 1.9.1 to 1.10.1 by @dependabot in #42
- 🌱 chore: function call improvement by @zuisong in #47
- Add vercel deploy support by @Radiquum in #48
- New gemini-1.5-flash-latest model by @vuchaev2015 in #49
New Contributors
- @greenjerry made their first contribution in #39
- @Radiquum made their first contribution in #48
- @vuchaev2015 made their first contribution in #49
Full Changelog: v0.11.0...v0.12.0
v0.11.0
- Resolve CORS error: Update server configurations to enable Cross-Origin access.
- Add support for function calls in non-stream mode (see the sketch after this list).
- Migrate to itty-router to reduce package size.
- Utilize the official Deno Docker image for improved consistency.
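Below is a minimal sketch of exercising function calls in non-stream mode, assuming the proxy accepts the OpenAI-style `tools` field; the `get_weather` tool, port, and model name are illustrative, so check the README for the exact request shape supported.

```sh
# Minimal sketch: non-stream chat completion carrying an OpenAI-style tool definition.
# "get_weather" is a hypothetical tool; the proxy translates the request for Gemini.
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_GEMINI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "What is the weather in Tokyo?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": { "city": { "type": "string" } },
          "required": ["city"]
        }
      }
    }],
    "stream": false
  }'
```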
v0.10.0
- gemini-openai-proxy can now act as a reverse proxy for Google Gemini, which can be useful for people in regions that don't have access to the Google Gemini API. It forwards requests to https://generativelanguage.googleapis.com.

```sh
curl \
  "http://localhost:8000/v1/models/gemini-pro:generateContent?key=$YOUR_GEMINI_API_KEY" \
  --header 'Content-Type: application/json' \
  --data '{"contents":[{"parts":[{"text":"Hello"}]}]}'
```
- Service settings can now be passed via the API key. The first supported setting is `useBeta`, which makes the proxy use the `v1beta` version of the Gemini API (https://ai.google.dev/docs/api_versions). It can be set like so:

```sh
curl http://localhost:8000/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_GEMINI_API_KEY#useBeta" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello"}],
    "temperature": 0.7,
    "stream": true
  }'
```
Full Changelog: v0.9.0...v0.10.0