Phi-3.5-Mini on WebGPU ⚡

Fast, fully local inference: Phi-3.5-Mini runs entirely in your browser, with WebGPU-accelerated inference powered by WebLLM. No data leaves your machine.
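Under the hood, a demo like this typically drives the WebLLM engine from client-side script. A minimal sketch of loading the model and running a chat completion is below; the exact model id string is an assumption here and should be checked against WebLLM's prebuilt model list:

```typescript
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function loadAndChat(): Promise<void> {
  // First call downloads the weights and compiles WebGPU shaders;
  // subsequent loads are served from the browser cache.
  // NOTE: the model id below is assumed — verify it against WebLLM's
  // prebuilt model list before relying on it.
  const engine = await CreateMLCEngine("Phi-3.5-mini-instruct-q4f16_1-MLC", {
    initProgressCallback: (report) => console.log(report.text),
  });

  // WebLLM exposes an OpenAI-style chat completions API,
  // running entirely client-side on the GPU.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(reply.choices[0].message.content);
}

loadAndChat();
```

This sketch requires a browser with WebGPU support (e.g. a recent Chrome or Edge); it will not run in a plain Node.js environment.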

Load Model