How To Host AI Locally

    https://odysee.com/@NaomiBrockwell:4/Local-LLM:d

    AI chatbots are taking over the world. But if you want to guarantee your privacy when using them, running an LLM locally is your best bet. In this video I collaborate with The Hated One to show you how, and to explain the AI terminology you need to understand what's going on. (A rough sketch of the main commands follows the chapter list below.)

    00:00 Your Data is Used to Train Chatbots
    01:11 Understanding AI Models
    01:28 LLM
    02:23 Parameters
    03:34 Size
    04:51 AI Engine and UX
    07:15 Tutorial, courtesy of The Hated One
    08:07 Ollama
    12:40 Open Web UI installation
    13:23 Docker
    15:30 Setting up Open Web UI
    16:38 Choose to Take Control of Your Data
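
    For quick reference, here is a rough sketch of the commands the tutorial chapters above walk through, assuming a Linux or macOS shell. The model name (llama3), the port mapping, and the Open Web UI image tag are examples and may have changed since the video, so check the Ollama and Open WebUI docs if something doesn't match.

        # Install Ollama using its official install script (Linux/macOS)
        curl -fsSL https://ollama.com/install.sh | sh

        # Download a model and chat with it in the terminal
        # ("llama3" is just an example; pick any model from the Ollama library)
        ollama pull llama3
        ollama run llama3

        # Run Open Web UI in Docker and let it talk to the local Ollama instance
        docker run -d -p 3000:8080 \
          --add-host=host.docker.internal:host-gateway \
          -v open-webui:/app/backend/data \
          --name open-webui --restart always \
          ghcr.io/open-webui/open-webui:main

        # Then browse to http://localhost:3000 and create a local account

    Any model you pull with Ollama should then appear in Open Web UI's model picker.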

    The biggest advantage of open-source models is that you can fine-tune them using your own instructions while keeping all your data private and confidential. Why trust your data to someone else when you don’t have to?
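
    In practice, the lightest-weight way to apply your own instructions to a local model with Ollama is a Modelfile, which sets a system prompt and parameters on top of a base model (this is not full fine-tuning, but it covers most "custom instructions" use cases). A minimal sketch, where the base model, the new model name, and the prompt text are all just examples:

        # Write a Modelfile that layers your own instructions on a base model
        printf '%s\n' \
          'FROM llama3' \
          'SYSTEM """You are a privacy-focused assistant. Keep answers concise and never recommend cloud services."""' \
          'PARAMETER temperature 0.7' > Modelfile

        # Build the customized model and chat with it, entirely on your own machine
        ollama create privacy-assistant -f Modelfile
        ollama run privacy-assistant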

    A huge thank you to The Hated One for his tutorial.
    You can find his playlist for staying anonymous here:

    Brought to you by NBTV team members: The Hated One, Reuben Yap, Lee Rennie, Cube Boy, Sam Ettaro, Will Sandoval and Naomi Brockwell

    To support NBTV, visit:
    https://www.nbtv.media/support
    (tax-deductible in the US)

    Visit our shop!
    https://Shop.NBTV.media

    Our eBook, "Beginner's Introduction To Privacy":
    https://amzn.to/3WDSfku

    Beware of scammers: I will never give you a phone number or reach out to you with investment advice. I do not give investment advice.
