This is a really great use of an LLM! Seriously, great job! Once it’s fully self-hostable (including the LLM model), I will absolutely find it space on the home server. Maybe using Rupeshs’ fastsdcpu as the model and generation backend could work. I don’t remember what his license is, though.
Edit: added link.
Thanks! I’m already eyeing ollama for this.
No license?
In case OP doesn’t know: if a repo has no license, it’s implicitly “all rights reserved”, so not open source! You should pick one at https://choosealicense.com/
> if a repo has no license, it’s implicitly “all rights reserved”, so not open source!
Oh, I actually did not know that. I’ll try to remember to add a license right from the get-go from now on, thanks :)