Saturday, March 30, 2024

Getting Started with Ollama - Your Own AI Assistant Running Locally


Hey everyone, Daniel from Tiger Triangle Technologies here! Today, I want to talk about running your own AI assistant locally, right on your own machine.

Let's face it, AI assistants are all the rage these days. But what if you don't want your conversations with your toaster plotting world domination to be floating around in the cloud? Or maybe you just want a little more control over what your AI assistant is doing, like making sure it uses all its training data to find the best cat videos, not just the ones approved by its corporate overlords. Well, that's where local AI assistants come in!

Reasons for a Local AI Chatbot

There are a bunch of reasons why you might want to run your own AI assistant locally. For one, it's all about privacy and transparency. You can be sure that your conversations are staying private. Plus, you can use your assistant even when you're offline, which is perfect for those times when you're on a plane and need step-by-step instructions on how to put on that inflatable life jacket because you ignored the flight attendant again.

And let's not forget about the cost! Running your own AI assistant locally is way cheaper than paying for a subscription service. Plus, you get to choose the model you want to run, which means you can avoid any ideological censorship that might come with some pre-trained models, like accidentally getting stuck with an assistant that refuses to give you the recipe for dangerously spicy mayo.

What You Will Learn

Now, I know what you're thinking: "This sounds great, Daniel, but how do I do it?" Well, that's the easy part! There's a fantastic open-source project called Ollama that makes it super simple to get started.

In this video, I'm going to walk you through getting started with Ollama and show you how to set up your own local AI assistant. We'll be using Llama 2, a large language model from Meta, but Ollama works with a bunch of different models. Want one for a general chatbot? No problem! Need a coding assistant? There are plenty to choose from! Are you a complete math nerd? There's a model for you, too!
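
To give you a feel for how lightweight this is, here's a quick, unofficial Python sketch that asks the local Ollama server which models you've already pulled. It assumes Ollama is installed and its server is running on the default port 11434; what it prints depends entirely on what you've downloaded.

```python
# Minimal sketch: list the models currently pulled into your local Ollama.
# Assumes Ollama is installed and its server is running on the default port 11434.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as response:
    tags = json.loads(response.read())

# Each entry describes one locally available model (name, size, and so on).
for model in tags.get("models", []):
    print(model["name"])
```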

Here's a sneak peek of what you'll learn in the video:

  1. How to install Ollama (spoiler alert: it's super easy, but if you watch the video you get to see it done in hyper speed)
  2. How to run your AI assistant through the command line
  3. How to prompt your AI assistant for information and to complete tasks (there's also a quick scripting sketch just after this list)
  4. System requirements

By the end of this video, you'll have a firm foundation to build your local team of AI assistants (more like your own zoo if you go by the model names)! And the best part? You'll be able to impress your friends with your newfound AI and machine learning hacking skills, or at least convince them you have control over a hive of humanoid robots.

Let's Go

So, what are you waiting for? Grab your favorite cup of caffeine, sit back, and let's get started with Ollama!

P.S. Just a heads up, this video involves a lot of technical stuff. So, if you find yourself getting a little lost, don't worry! Just hit pause, take a breather, and come back when you're ready. And hey, if you get stuck, leave a comment below the video and I'll do my best to help you out.