
WizardCoder-15B-V1.0-q4f16_1 failing to load on WebLLM #12

Open
jcosta33 opened this issue Aug 21, 2023 · 0 comments

Following the available examples in the WebLLM repo, such as next-simple-chat, I have added the model URL and ID:

{
  model_url: "https://huggingface.co/mlc-ai/mlc-chat-WizardCoder-15B-V1.0-q4f32_1/resolve/main/",
  local_id: "WizardCoder-15B-V1.0-q4f32_1",
}

then added the lib map entry:

"WizardCoder-15B-V1.0-q4f32_1": "https://raw.githubusercontent.com/mlc-ai/binary-mlc-llm-libs/main/WizardCoder-15B-V1.0-q4f16_1-webgpu.wasm",

but I get this error immediately after loading the model in the browser:

Init error, Error: Unknown conv template wizard_coder_or_math
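For reference, combining the two snippets above into a single custom app config (following the shape used in the next-simple-chat example) would look roughly like this. This is a sketch, not a verified fix; note it preserves the mix from the snippets above, where the weights and local_id use q4f32_1 while the wasm library file is q4f16_1:

```javascript
// Sketch of the custom app config assembled from the two snippets above.
// Field names (model_list, model_lib_map, model_url, local_id) follow the
// next-simple-chat example; the q4f32_1 / q4f16_1 mix is carried over
// unchanged from the report.
const appConfig = {
  model_list: [
    {
      model_url:
        "https://huggingface.co/mlc-ai/mlc-chat-WizardCoder-15B-V1.0-q4f32_1/resolve/main/",
      local_id: "WizardCoder-15B-V1.0-q4f32_1",
    },
  ],
  model_lib_map: {
    // The key must match the local_id above for WebLLM to resolve the wasm.
    "WizardCoder-15B-V1.0-q4f32_1":
      "https://raw.githubusercontent.com/mlc-ai/binary-mlc-llm-libs/main/WizardCoder-15B-V1.0-q4f16_1-webgpu.wasm",
  },
};

// The lib-map key resolves against the local_id at load time.
console.log(Object.keys(appConfig.model_lib_map)[0]);
```

The "Unknown conv template wizard_coder_or_math" error is raised after these URLs resolve, when the model's mlc-chat-config.json names a conversation template the installed web-llm version does not recognize.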
