Local Character AI website AetherChat, with RVC character voice and local AI chat completion.


Original video: https://youtu.be/NDW7xa-9zv8. Linked from https://github.com/nexusjuan12/AetherChat and https://www.reddit.com/r/selfhosted/comments/1i6fjlk/local_deployable_characterai_knockoff_requires/

Hello, today I am proud to present my project, AetherChat. I have been working on it for around 8 weeks and have decided to release an early version to the community as a fun project. It can be deployed either locally or remotely on a platform such as Vast.ai. If deploying locally, you will need a CUDA-capable Nvidia graphics card, and the operating system must be Ubuntu 22.04. Ports 8081 and 5000 must be available for the webserver and the Kobold API, respectively. The entire project is deployed via a single setup script available in the GitHub repository.
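Before running the installer, it can help to confirm that both ports are actually free. A quick check with `ss` (part of iproute2, standard on Ubuntu 22.04) might look like this; the loop itself is my addition, not part of the setup script:

```shell
# Report whether the two ports AetherChat needs are already taken.
for port in 5000 8081; do
  if ss -ltn "( sport = :$port )" | grep -q LISTEN; then
    echo "port $port is in use"
  else
    echo "port $port is free"
  fi
done
```

If either port reports as in use, stop the conflicting service or pick different ports before continuing.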

The repo: https://github.com/nexusjuan12/AetherChat

Download setup.sh from this repository and place it in the base directory of the installation location. Use the following commands to make it executable and then execute it:

chmod +x setup.sh
./setup.sh

This will begin the installation.
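If you prefer to fetch the script entirely from the terminal, something like the following should work. Note the raw URL is my assumption (setup.sh at the root of the main branch), not stated in the article:

```shell
# Fetch setup.sh from the repo root (assumed path on the main branch),
# mark it executable, and run it.
wget https://raw.githubusercontent.com/nexusjuan12/AetherChat/main/setup.sh
chmod +x setup.sh
./setup.sh
```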

Once the script completes, you will be provided with two commands to activate the conda environment. Paste those into the terminal:

source ~/.bashrc
conda activate aetherchat

Then type the following command into the terminal to start the webserver:

python webserver.py

You should see text indicating that the webserver is available on localhost port 8081. Then open a fresh terminal window (no need to activate a conda environment this time) and start the Kobold API, which our webserver calls for chat completion. Paste the following commands to start Kobold with the provided sample model, or use your own; there are many to choose from on huggingface.co. Simply place your model in the models directory under Kobold and substitute the filename in the command below.

cd Kobold

./koboldcpp /root/Kobold/models/L3.1-Dark-Planet-SpinFire-Uncensored-8B-D_AU-Q4_k_m.gguf --port 5000 --host 0.0.0.0 --usecublas
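To substitute your own model, the pattern is the same with only the filename changed. A small sketch, with a placeholder filename standing in for whichever GGUF you downloaded:

```shell
# Drop any GGUF chat model from huggingface.co into Kobold/models,
# then point koboldcpp at it. The filename here is a placeholder.
MODEL=/root/Kobold/models/your-model-Q4_K_M.gguf
./koboldcpp "$MODEL" --port 5000 --host 0.0.0.0 --usecublas
```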

You should see text confirming that the API is available on localhost port 5000.
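You can also confirm the API is up from the terminal. KoboldCpp implements the KoboldAI API, which, to my understanding, exposes a GET endpoint reporting the loaded model:

```shell
# With Kobold running, this should print the loaded model name as JSON.
curl -s http://localhost:5000/api/v1/model
```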

If deployed locally, you should be able to open a browser window and navigate to http://localhost:8081 (or http://127.0.0.1:8081). If you deployed on Vast or another platform, you will need to check the port mapping and IP address corresponding to your instance: find the external port mapped to 8081 and the public IP address, paste those into the web browser, and you should find yourself at the main page.
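For a remote deployment, you can sanity-check the mapping from your own machine before opening a browser. The IP and port below are placeholders you would read from your instance's port list:

```shell
# Replace with your instance's public IP and the external port that
# was mapped to internal port 8081 (both values are placeholders).
PUBLIC_IP=203.0.113.10
MAPPED_PORT=41234
curl -I "http://$PUBLIC_IP:$MAPPED_PORT/"
```

An HTTP response (even a redirect) confirms the webserver is reachable from outside.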

There is a provided administrator account; the login and password are admin/admin.

