1. In-browser LLM inference engine with WebGPU and OpenAI API (blog.mlc.ai), posted by CharlieRuan, 1y ago
2. Gemma locally on iOS, Android, web browsers, and GPUs with a single framework (old.reddit.com), posted by CharlieRuan, 2y ago
3. Running LLM (phi-2) locally on latest Google Chrome Android (twitter.com), posted by CharlieRuan, 2y ago