For people self-hosting LLMs: I have a couple of Docker images I maintain

https://github.com/noneabove1182/text-generation-webui-docker (updated to 1.3.1; includes a GQA fix to run Llama 2 70B)

https://github.com/noneabove1182/lollms-webui-docker (v3.0.0)

https://github.com/noneabove1182/koboldcpp-docker (updated to 1.36)

All should include up-to-date instructions. If you find any problems, please ping me or open an issue so I can take a look :)
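For anyone new to these, the usual pattern is to pull the image and mount a local models directory into the container. A minimal docker-compose sketch for the koboldcpp image is below; note that the image name, tag, port, and paths here are assumptions for illustration, not taken from the repo, so check the README for the actual values:

```yaml
# Hypothetical compose file — image name/tag, port, and volume path are
# assumptions; the repo's README has the real instructions.
services:
  koboldcpp:
    image: noneabove1182/koboldcpp-docker:latest  # assumed tag
    ports:
      - "5001:5001"        # koboldcpp's default web UI port
    volumes:
      - ./models:/models   # mount local model files into the container
```

From there it's just `docker compose up -d` and pointing your browser at the mapped port.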
