AI Assistant
Welcome! In this guide, you'll discover an AI chatbot that can interact with the NEAR ecosystem.
This AI agent can:
- Explore and explain what happened in a transaction when given a transaction hash
- Request tokens from the testnet faucet
- Mint a special NFT and send it to a user through a wallet it controls
- Answer general questions about the NEAR architecture (powered by real-time search results)
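Under the hood, explaining a transaction boils down to querying NEAR's public JSON-RPC API. As a rough sketch, the request the agent would send looks like the following (the transaction hash and account id below are placeholders, not real values):

```shell
# Build a JSON-RPC request for NEAR's `tx` method, which returns the
# status and receipts of a transaction given its hash and the sender.
# The hash and account id here are illustrative placeholders.
PAYLOAD='{
  "jsonrpc": "2.0",
  "id": "dontcare",
  "method": "tx",
  "params": {
    "tx_hash": "6zgh2u9DqHHiXzdy9ouTP7oGky2T4nugqzqt9wJZwNFm",
    "sender_account_id": "example.testnet"
  }
}'

# Validate the payload locally before sending it anywhere
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload ok"

# Uncomment to query the testnet RPC endpoint:
# curl -s -X POST https://rpc.testnet.near.org -H 'Content-Type: application/json' -d "$PAYLOAD"
```

The agent then feeds the JSON response back to the model so it can summarize the transaction in plain language.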
Created by our community member Reza, this project was one of our AI track winners at the ETHGlobal Brussels 2024 hackathon.
Prerequisites
Let's start by setting up the environment to run the AI assistant locally.
Tools
Before starting, make sure you have the following tools installed:
Mac

```shell
# Install Node.js using nvm (more options at: https://nodejs.org/en/download)
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash
nvm install node

# Install Python using miniconda (includes the package manager pip)
brew install --cask miniconda
conda init "$(basename "${SHELL}")"
pip install poetry

# Install llama.cpp
brew install llama.cpp
```

Linux

Please help by contributing these steps for Linux!
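Whichever platform you are on, it is worth sanity-checking the toolchain before moving on. A quick sketch:

```shell
# Check that each required tool is on PATH.
# Missing tools are reported instead of aborting the script.
for tool in node python pip poetry llama-server; do
  if command -v "$tool" > /dev/null 2>&1; then
    echo "$tool: installed"
  else
    echo "$tool: NOT FOUND - revisit the install steps above"
  fi
done
```

If everything prints `installed`, you are ready for the next step.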
AI Model
In this tutorial we will be using the NousResearch/Hermes-2-Pro-Llama-3-8B-GGUF model, which is hosted on Hugging Face.
```shell
# Install the Hugging Face library
pip install huggingface_hub

# Login to your Hugging Face account
huggingface-cli login

# Download the model from Hugging Face
huggingface-cli download NousResearch/Hermes-2-Pro-Llama-3-8B-GGUF Hermes-2-Pro-Llama-3-8B-Q4_K_M.gguf --local-dir model
```
We use the smaller Q4_K_M quantization to reduce the time and resources needed to run the AI agent.
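A quick way to confirm the download landed where the next step expects it (the path follows from the `--local-dir model` flag used above):

```shell
# Verify the GGUF file exists and report its size; a Q4_K_M quantization
# of an 8B-parameter model is roughly 4-5 GB on disk.
MODEL=./model/Hermes-2-Pro-Llama-3-8B-Q4_K_M.gguf
if [ -f "$MODEL" ]; then
  echo "found $MODEL ($(du -h "$MODEL" | cut -f1))"
else
  echo "missing $MODEL - rerun the huggingface-cli download step"
fi
```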
Execute the Model
You should now have a folder named ./model containing the GGUF file ./model/Hermes-2-Pro-Llama-3-8B-Q4_K_M.gguf. Let's use llama.cpp to run it.
```shell
# run the model with llama.cpp
llama-server -m ./model/Hermes-2-Pro-Llama-3-8B-Q4_K_M.gguf
```
Open your browser at http://localhost:8080. If you see an interface similar to this one, you are ready to go 🚀
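You can also check readiness from another terminal. Assuming the default port 8080, llama-server exposes a small `/health` endpoint that reports whether the model has finished loading:

```shell
# Ask the server whether the model has finished loading.
# Falls back to a friendly message if the server is not up yet.
curl -s --max-time 3 http://localhost:8080/health \
  || echo "server not reachable yet - is llama-server still starting?"
```

Loading a ~5 GB model can take a little while on the first run, so a not-ready response at first is normal.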

You can use a different model with llama.cpp if you wish! Just make sure it supports function calling.
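Function calling means the model can emit structured tool invocations rather than plain text. llama-server speaks an OpenAI-compatible chat API, so a tool-enabled request looks roughly like the sketch below (the `get_tx_status` tool name and its parameter schema are hypothetical, purely to illustrate the shape):

```shell
# An OpenAI-style chat request advertising one tool to the model.
# The tool name and schema here are illustrative placeholders.
REQUEST='{
  "messages": [
    {"role": "user", "content": "What happened in transaction ABC123?"}
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_tx_status",
        "description": "Look up a NEAR transaction by hash",
        "parameters": {
          "type": "object",
          "properties": {"tx_hash": {"type": "string"}},
          "required": ["tx_hash"]
        }
      }
    }
  ]
}'

# Validate the request body locally
echo "$REQUEST" | python3 -m json.tool > /dev/null && echo "request ok"

# Uncomment to send it to the running llama-server:
# curl -s http://localhost:8080/v1/chat/completions -H 'Content-Type: application/json' -d "$REQUEST"
```

A model that supports function calling will answer such a request with a structured tool call the agent can execute, rather than free-form prose.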