Embedding Limit #84
-
Good day! I must say this embedding tool actually does what it says: I was able to embed 8 million tokens in just 3 hours on my MacBook Pro M1, whereas Ollama's embedding had been running for over 2 days before I finally killed it. However, when I used a chat model to polish the retrieved references into human-like responses, it was unable to pick some things out of the context. So my question is: does the embedder have a limit on how much it embeds, such that it left some content out?
Replies: 1 comment
-
Hey! No, we don't leave anything out. The default model has a 512-token context length, and we honor that. We'd encourage you to keep each chunk smaller than 512 tokens!
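For example, a naive word-based chunker might look like the sketch below. This is a rough illustration only: real token counts depend on the model's tokenizer, so it leaves some headroom below the 512-token limit (the `max_tokens=480` figure is an assumption, not a documented default).

```python
def chunk_text(text: str, max_tokens: int = 480) -> list[str]:
    """Split text into chunks of at most max_tokens whitespace 'tokens'.

    Whitespace splitting only approximates a real tokenizer's count,
    hence the headroom below the model's 512-token limit.
    """
    words = text.split()
    return [
        " ".join(words[i:i + max_tokens])
        for i in range(0, len(words), max_tokens)
    ]

chunks = chunk_text("some long document " * 1000)
# Every chunk stays within the limit, and no words are dropped.
print(all(len(c.split()) <= 480 for c in chunks))
```

Each chunk can then be embedded independently, so nothing in the source text is silently truncated.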