Problem Description
Could you please add the following models to WebLLM Chat?
Solution Description
From my understanding, the missing piece is the custom model library, right?
I can try to do this myself and prepare a PR, if that is feasible for someone with no background in the field. Unfortunately, I am not sure whether it is possible to do on an Intel MacBook.
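For reference, WebLLM supports registering a custom model by passing an `AppConfig` with a `model_list` entry that points at the compiled weights and model library. A minimal sketch is below; the model name, Hugging Face URL, and WASM filename are placeholders for whichever model ends up being added, not real artifacts.

```typescript
import * as webllm from "@mlc-ai/web-llm";

// Hypothetical entry: replace model/model_id/model_lib with the
// actual MLC-converted weights and compiled WebGPU model library.
const appConfig: webllm.AppConfig = {
  model_list: [
    {
      // URL of the MLC-format weights repo (placeholder)
      model: "https://huggingface.co/my-org/MyModel-q4f16_1-MLC",
      // ID used to select the model at engine creation time
      model_id: "MyModel-q4f16_1-MLC",
      // Compiled WebGPU model library (placeholder filename)
      model_lib:
        webllm.modelLibURLPrefix +
        webllm.modelVersion +
        "/MyModel-q4f16_1-webgpu.wasm",
    },
  ],
};

// Create the engine with the custom model registered.
const engine = await webllm.CreateMLCEngine("MyModel-q4f16_1-MLC", {
  appConfig,
});
```

The harder part is producing the `model_lib` WASM itself, which requires compiling the model with MLC-LLM; that compilation step is what may not be possible on an Intel Mac.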
Alternatives Considered
It seems that WebLLM (WebGPU) is the only viable option for running these models on an Intel Mac, since mlc-ai is not an option: mlc-ai/mlc-llm#3078.
Additional Context
No response