[Request] Custom Embedding model #3785
Comments
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
@ProAlexUSC Hello! I'm Dosu, a bot that can help you solve bugs, answer questions, and become a contributor. If you have any questions, I'm happy to help. You can add support for other embedding models or remap existing models within the system, for example:

    {
      "name": "netease-youdao/bce-embedding-base_v1",
      "renamed": "text-embedding-3-small",
      ...
    }

This will allow the system to use the netease-youdao/bce-embedding-base_v1 model in place of text-embedding-3-small.
It will be supported later on, but this model must be able to produce 1024-dimensional vectors before it can be used.
Why must it be 1024 dimensions? Also, according to OpenAI, text-embedding-3-small produces 1,536 dimensions.
@Lostsite Just read the official documentation: https://openai.com/index/new-embedding-models-and-api-updates/ and https://platform.openai.com/docs/guides/embeddings/use-cases
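For context: according to the OpenAI announcement linked above, text-embedding-3-small returns 1,536-dimensional vectors by default, but the embeddings API accepts a `dimensions` parameter that shortens the output, e.g. to 1024. A minimal sketch with the official openai Node SDK (assuming `OPENAI_API_KEY` is set in the environment):

```ts
import OpenAI from "openai";

// Assumes OPENAI_API_KEY is set in the environment.
const client = new OpenAI();

async function embed1024(text: string): Promise<number[]> {
  // text-embedding-3-small is 1536-dimensional by default, but the
  // `dimensions` parameter can shorten the vector (e.g. to 1024).
  const res = await client.embeddings.create({
    model: "text-embedding-3-small",
    input: text,
    dimensions: 1024,
  });
  return res.data[0].embedding;
}

embed1024("hello world").then((v) => console.log(v.length)); // 1024
```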
BAAI/bge-m3 supports 1024-dimensional vectors. SiliconFlow offers this model, and it is currently free.
So, do I have to have an OpenAI API to use it? Btw, I deploy in self-hosted server mode with docker-compose.
Zhipu's embedding-3 also supports vectors with 1024 dimensions or more.
+1, I hope embedding models run by Ollama will be supported.
We are running into this problem as well: the server can only use OpenAI's embedding models for vectorization. Many domestic models also support embeddings and are inexpensive. We hope more embedding providers can be supported, such as ChatGLM's.
How can I get lobe-chat to call a REST API that exposes my locally hosted embedding model?
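One common workaround is to expose the local model behind an OpenAI-compatible /v1/embeddings endpoint (Ollama and several other local servers provide one) and point an OpenAI-style client at it. The sketch below shows the general pattern with the openai SDK; the base URL http://localhost:11434/v1 and the model name are assumptions based on an Ollama setup, not something lobe-chat configures for you.

```ts
import OpenAI from "openai";

// Sketch: point an OpenAI-compatible client at a locally hosted
// /v1/embeddings endpoint. The base URL and model name below assume an
// Ollama instance serving an embedding model; adjust both for your setup.
const local = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama", // most local servers ignore the key, but the SDK requires one
});

async function main() {
  const res = await local.embeddings.create({
    model: "bge-m3", // whatever embedding model your local server exposes
    input: "How do I call a locally hosted embedding model?",
  });
  console.log(res.data[0].embedding.length);
}

main();
```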
Hello, is there any plan to support custom embedding models? I think this is not a very hard problem.
@AlexBlack2202 No. It takes hard architectural design work to make it easy to integrate more embedding providers. Here is the PR for it: #4370. Stay tuned.
Any news on this? |
In my opinion, this feature should be one of the highest-priority features, since it affects every user who cannot access OpenAI or can only run embeddings locally.
I hope this feature can be supported as soon as possible.
Any news on this? |
This issue is closed. If you have any questions, you can comment and reply.
I tried my best to do it myself and it was successful. From this perspective, there should be no difficulty in implementing it officially; it is just a matter of changing the model name. I don't understand why there has been no progress after half a year.
🥰 Feature Description
Currently, the embedding model only supports text-embedding-3-small. Could other models be supported, or could a model-remapping capability be provided?
🧐 Proposed Solution
Under the OpenAI model configuration, I am using another large-model provider that is compatible with the OpenAI API, and it offers the netease-youdao/bce-embedding-base_v1 model. I hope to use this model as the embedding model, or to directly rename netease-youdao/bce-embedding-base_v1 to text-embedding-3-small.
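For illustration, here is a minimal sketch of what such a rename could look like as a small OpenAI-compatible proxy sitting between lobe-chat and the provider. This is not an official lobe-chat feature; the upstream URL, the UPSTREAM_KEY variable, and the port are placeholders.

```ts
import http from "node:http";

// Remapping proxy (sketch, not an official lobe-chat mechanism): point
// lobe-chat's OpenAI endpoint at this server; embedding requests for
// text-embedding-3-small are forwarded upstream with the model field
// rewritten. UPSTREAM_URL and UPSTREAM_KEY are placeholders.
const UPSTREAM_URL = "https://api.example-provider.com"; // req.url (e.g. /v1/embeddings) is appended as-is
const UPSTREAM_KEY = process.env.UPSTREAM_KEY ?? "";

http
  .createServer(async (req, res) => {
    const chunks: Buffer[] = [];
    for await (const chunk of req) chunks.push(chunk);
    const body = JSON.parse(Buffer.concat(chunks).toString() || "{}");

    // The actual remap: swap the model name before forwarding.
    if (req.url?.endsWith("/embeddings") && body.model === "text-embedding-3-small") {
      body.model = "netease-youdao/bce-embedding-base_v1";
    }

    const upstream = await fetch(`${UPSTREAM_URL}${req.url ?? ""}`, {
      method: req.method,
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${UPSTREAM_KEY}`,
      },
      body: req.method === "POST" ? JSON.stringify(body) : undefined,
    });

    res.writeHead(upstream.status, { "Content-Type": "application/json" });
    res.end(await upstream.text());
  })
  .listen(11435); // placeholder port
```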
📝 Additional Information
No response