Hacker News
Easiest way to run LLMs locally (sitepoint.com)
1 point by zain37 3 months ago | hide | past | favorite | 1 comment


Ollama + open-webui = awesome!

You can even download, install, and run models from the Open WebUI interface and keep a chat history, just like ChatGPT.
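For anyone who wants to try this, a minimal sketch of the setup the comment describes: pull a model with the Ollama CLI, then start Open WebUI in Docker pointed at the local Ollama server. This assumes Ollama and Docker are already installed; the model name (`llama3`) is just an example.

```shell
# Pull an example model with Ollama (assumes the Ollama CLI is installed)
ollama pull llama3

# Ask it a question directly from the terminal
ollama run llama3 "What is the capital of France?"

# Start Open WebUI in Docker, connecting to the host's Ollama server
# (port 3000 on the host, chat history persisted in the open-webui volume)
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

After the container starts, the web UI is available at http://localhost:3000, where models can also be downloaded and managed without touching the CLI.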
