
Discussion (15 Comments)

agent37 · 14 minutes ago
Very cool. Did you happen to try other models like Qwen, and was there a difference compared to Gemma?
logicallee · 35 minutes ago
I love this idea. Unfortunately, it says "Unsupported browser/GPU" for me. This is Desktop Chrome version 147 (page says it requires 134+) and I have a 1060 card with 6 GB of RAM on this specific device, so it should fit. I have more than 4 GB of free RAM as well.
teamchong · 16 minutes ago
Sorry it’s not working for you. I built this as a personal project for self-learning, but I plan to take a look at this issue next weekend. You can check out a video demo of it here: https://github.com/user-attachments/assets/71ae6e5c-a5ec-4d0...
logicallee · 7 minutes ago
That's amazing. Very good result. Thanks for sharing.
COOLmanYT · about 3 hours ago
No Firefox support?
teamchong · about 1 hour ago
Firefox has WebGPU already, but the subgroups extension isn't in yet. Every matmul / softmax kernel here leans on subgroupShuffleXor for reductions; that's the blocker. It's the same reason MLC WebLLM and friends don't run on Firefox either. Once Mozilla ships it, this should work.
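For readers unfamiliar with the pattern teamchong mentions: subgroupShuffleXor lets each GPU lane read a partner lane's value (partner index = lane id XOR offset), which enables a log-time "butterfly" reduction. The TypeScript sketch below simulates that reduction on the CPU; the function name and structure are illustrative, not the project's actual kernel code.

```typescript
// CPU simulation of the XOR-shuffle ("butterfly") sum reduction that WGSL's
// subgroupShuffleXor enables. At each step, lane i adds the value held by
// lane (i ^ offset); after log2(width) steps every lane holds the full sum.
export function simulateSubgroupSum(lanes: number[]): number[] {
  const width = lanes.length; // must be a power of two, like a GPU subgroup
  let values = lanes.slice();
  for (let offset = width >> 1; offset >= 1; offset >>= 1) {
    // Equivalent of: v += subgroupShuffleXor(v, offset)
    values = values.map((v, lane) => v + values[lane ^ offset]);
  }
  return values; // every lane now holds the total
}
```

Because every lane ends up with the result, no extra broadcast step is needed, which is why kernels lean on it for softmax denominators and dot-product accumulation.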
hhthrowaway1230 · about 3 hours ago
So multiple of these browser WASM demos make me re-download the models. Can someone make a CDN for it, or some sort of uberfast downloader? Just throw some Claude credits at it, ty!
wereHamster · about 2 hours ago
CDN wouldn't help much. These days browsers partition caches by origin, so if two different tools (running on different domains) fetch the same model from the CDN, the browser would download it twice.
embedding-shape · about 2 hours ago
Adding a file input where users can upload model files to the frontend directly from their file manager would probably work as a stop-gap measure; it would let people who want something quick manage their own "cache" of model files.
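A minimal sketch of embedding-shape's stop-gap idea (all names and the manifest format here are hypothetical): before using a user-supplied file as a model, the page checks it against the manifest of files the demo expects, and only falls back to a network download on a mismatch. A real implementation would verify a content hash too; this sketch only compares name and byte size.

```typescript
// Hypothetical manifest describing the model files a demo expects.
interface ModelEntry {
  name: string;
  bytes: number;
}

// Decide whether a locally supplied file plausibly matches a manifest entry,
// so the demo can skip re-downloading it. Name and size only; a production
// version should also check a cryptographic hash of the contents.
export function matchesManifest(
  file: { name: string; size: number },
  manifest: ModelEntry[],
): boolean {
  return manifest.some((e) => e.name === file.name && e.bytes === file.size);
}
```

In the browser, the `file` argument would come from the `files` list of an `<input type="file">` element, which exposes `name` and `size` on each `File` object.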
logicallee · about 1 hour ago
Would you be okay with it using your upload bandwidth at the same time? Then a P2P model would work. (This is potentially a good match for P2P because edge connections are very fast; traffic doesn't have to cross the whole Internet, and you could be downloading from uploaders in your region.) Let me know if you'd be okay with uploading at the same time; if so, this model works and I can build it for people to use this way.
Rekindle8090 · about 3 hours ago
What? It downloaded for me at 2 Gbps.
hhthrowaway1230 · about 3 hours ago
Ah, let me clarify: many of the in-browser demos make me download certain models even if I already have them. It would be great if there were a way to avoid re-downloading them across demos, so that I just have a cache, or an in-browser model manager. Hope this makes sense.

Or indeed use some sort of Hugging Face model downloader (if that exists, with Xet).

varun_ch · about 1 hour ago
I think this would sit best at the browser level. I’m not sure there’s a nice way for multiple websites to share a cache like that.
hhthrowaway1230 · about 3 hours ago
Also maybe a good use case to finally have P2P web torrents :)
hhthrowaway1230 · about 3 hours ago
Yeah, that's great, but I'm at a cafe burning through my phone data. Ty!