How To Host AI Locally

AI chatbots are taking over the world. But if you want to guarantee your privacy when using them, running an LLM locally is going to be your best bet. In this video I collaborate with The Hated One to show you how, and to explain some AI terminology so that you understand what's going on.

00:00 Your Data is Used to Train Chatbots
01:11 Understanding AI Models
01:28 LLM
02:23 Parameters
03:34 Size
04:51 AI Engine and UX
07:15 Tutorial, courtesy of The Hated One
08:07 Ollama
12:40 Open Web UI installation
13:23 Docker
15:30 Setting up Open Web UI
16:38 Choose to Take Control of Your Data
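The tutorial chapters above (Ollama, Docker, Open Web UI) boil down to a few terminal commands. Here is a hedged sketch of the typical flow; the model name `llama3` is just an example, and the exact flags may differ from what's shown in the video:

```shell
# Install Ollama (official install script for Linux/macOS)
curl -fsSL https://ollama.com/install.sh | sh

# Download and chat with a model locally (llama3 is an example model)
ollama run llama3

# Run Open Web UI in Docker, connecting to the local Ollama instance
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

After the container starts, the web interface is typically available at http://localhost:3000.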

The biggest advantage of open-source models is that you can fine-tune them with your own instructions while keeping all of your data private and confidential. Why trust your data to someone else when you don't have to?
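One lightweight way to customize a local model with your own instructions is an Ollama Modelfile, which layers a system prompt on top of a base model. This is a sketch, not the exact method from the video; the model name and prompt text are placeholders:

```shell
# Write a Modelfile: base model plus a custom system instruction
cat > Modelfile <<'EOF'
FROM llama3
SYSTEM "You are a helpful assistant. Never send data anywhere; answer concisely."
EOF

# Build a named custom model from the Modelfile, then run it
ollama create my-private-assistant -f Modelfile
ollama run my-private-assistant
```

Everything here runs on your own machine, so the instructions and your conversations never leave it.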

A huge thank you to The Hated One for his tutorial.
You can find his playlist for staying anonymous here:
https://www.youtube.com/watch?v=G_QsQp7DyjE&list=PLR_ghQEN2SgDLpuLMLa_0hjJZgrlHoi0a

Brought to you by NBTV team members: The Hated One, Reuben Yap, Lee Rennie, Cube Boy, Sam Ettaro, Will Sandoval and Naomi Brockwell


Beware of scammers: I will never give you a phone number or reach out to you with investment advice. I do not give investment advice.


To make a tax-deductible (in the US) donation to NBTV, visit https://www.nbtv.media/support

Sign up for the free NBTV newsletter here: https://cryptobeat.substack.com/
