IIgor_Wiwi · about 7 hours ago · 1 comment


I built localLLLM: a small community project for running local models.

Live: https://locallllm.fly.dev

The goal is simple: if someone has a model + OS + GPU + RAM combination, they should get steps that actually work (ideally a one-liner).

I need help populating and validating guides.

If you run local models, please submit one working recipe (or report what failed). General feedback is also very welcome!


Discussion (1 comment)

modinfo · about 7 hours ago
This is a nice idea! But open-source it, and instead of saving data to a database, just save the guides as Markdown on GitHub; then everyone can send PRs to edit or add instructions. More freedom, and simpler.