feat(chatops/takumi): give it ollama powers

Those are not superpowers, and they should be used sparingly and responsibly.
LLMs are not bulletproof. They are mostly bullshit generators.

But even a bullshit generator has its uses.

Signed-off-by: Ryan Lahfa <ryan@dgnum.eu>
This commit is contained in:
Ryan Lahfa 2024-10-10 18:12:57 +02:00
parent f20353b727
commit f40323bfc0
2 changed files with 16 additions and 1 deletion


@@ -47,7 +47,9 @@ let
ps = python3Pkgs.makePythonPath [
ircrobots
tortoise-orm
python3Pkgs.ollama
python3Pkgs.aiohttp
python3Pkgs.loadcredential
];
in
{


@@ -1,6 +1,11 @@
#!/usr/bin/env python3
import asyncio
from ircrobots.interface import IBot
from ollama import Client as OllamaClient
from loadcredential import Credentials
import base64
from irctokens.line import build, Line
from ircrobots.bot import Bot as BaseBot
from ircrobots.server import Server as BaseServer
@@ -56,6 +61,10 @@ def bridge_stripped(possible_command: str, origin_nick: str) -> str | None:
return possible_command if possible_command.startswith(TRIGGER) else None
class Server(BaseServer):
def __init__(self, bot: IBot, name: str, llm_client: OllamaClient):
super().__init__(bot, name)
self.llm_client = llm_client
def extract_valid_command(self, line: Line) -> str | None:
me = self.nickname_lower
if line.command == "PRIVMSG" and \
@@ -106,7 +115,11 @@ class Server(BaseServer):
class Bot(BaseBot):
def create_server(self, name: str):
return Server(self, name)
credentials = Credentials()
# HTTP Basic auth base64-encodes the whole "user:password" pair, and
# b64encode returns bytes, which must be decoded before use in the header.
password = credentials["OLLAMA_PROXY_PASSWORD"]
token = base64.b64encode(b"takumi:" + password).decode()
llm_client = OllamaClient(host='https://ollama01.beta.dgnum.eu', headers={'Authorization': f'Basic {token}'})
return Server(self, name, llm_client)
async def main():
bot = Bot()
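
The Basic auth header built in `create_server` above can be sketched in isolation; the helper name and the example password here are hypothetical, only the encoding scheme is taken from the diff:

```python
import base64

def basic_auth_header(user: str, password: str) -> dict[str, str]:
    # HTTP Basic auth base64-encodes the whole "user:password" pair,
    # then prefixes the result with the "Basic " scheme keyword.
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

# basic_auth_header("takumi", "hunter2")
# → {'Authorization': 'Basic dGFrdW1pOmh1bnRlcjI='}
```

The same dictionary is what `OllamaClient(host=..., headers=...)` receives, so the proxy in front of ollama01 can authenticate the bot like any other HTTP client.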