
@miku
miku / llm-wiki.md
Created May 13, 2026 12:16 — forked from karpathy/llm-wiki.md
llm-wiki

LLM Wiki

A pattern for building personal knowledge bases using LLMs.

This is an idea file, designed to be copy-pasted into your own LLM agent (e.g. OpenAI Codex, Claude Code, OpenCode / Pi, etc.). Its goal is to communicate the high-level idea; your agent will build out the specifics in collaboration with you.

The core idea

Most people's experience with LLMs and documents looks like RAG: you upload a collection of files, the LLM retrieves relevant chunks at query time, and generates an answer. This works, but the LLM is rediscovering knowledge from scratch on every question. There's no accumulation. Ask a subtle question that requires synthesizing five documents, and the LLM has to find and piece together the relevant fragments every time. Nothing is built up. NotebookLM, ChatGPT file uploads, and most RAG systems work this way.
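The retrieve-then-generate loop described above can be sketched in a few lines. This is a toy illustration, not the wiki pattern itself: the keyword-overlap scorer stands in for a real embedding index, and `generate` is a placeholder for the actual LLM call.

```python
# Minimal sketch of the RAG pattern described above: retrieve relevant
# chunks at query time, then generate from them. The keyword-overlap
# scorer stands in for a real embedding index, and generate() is a
# placeholder for an LLM call -- both are illustrative assumptions.

def score(query: str, chunk: str) -> int:
    """Count query words that appear in the chunk (toy relevance score)."""
    return len(set(query.lower().split()) & set(chunk.lower().split()))

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Return the k chunks with the highest overlap score."""
    return sorted(chunks, key=lambda c: score(query, c), reverse=True)[:k]

def generate(query: str, context: list[str]) -> str:
    """Placeholder for an LLM call: echoes the prompt it would send."""
    return f"Answer '{query}' using: {' | '.join(context)}"

chunks = [
    "The wiki accumulates distilled notes between sessions.",
    "RAG retrieves document chunks at query time.",
    "Bananas are yellow.",
]
query = "how does RAG retrieve chunks"
print(generate(query, retrieve(query, chunks)))
```

Note that nothing persists between calls: every query starts retrieval from scratch, which is exactly the limitation the wiki pattern is meant to address.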

@miku
miku / ai-agents.md
Created April 30, 2026 07:50 — forked from devinschumacher/ai-agents.md
AI Agents: A Comprehensive List of The Best AI Agents
title: AI Agents - A Comprehensive Database of The Best AI Agents
tags:
  - ai agents
  - ai
  - artificial intelligence

AI Agents: A Comprehensive Database of The Best AI Agents

@miku
miku / psql-srv.py
Created March 30, 2026 07:39 — forked from eatonphil/psql-srv.py
postgres "server" wire protocol example (ported to Python 3)
# th30z@u1310:[Desktop]$ psql -h localhost -p 55432
# Password:
# psql (9.1.10, server 0.0.0)
# WARNING: psql version 9.1, server version 0.0.
# Some psql features might not work.
# Type "help" for help.
#
# th30z=> select foo;
# a | b
# ---+---
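For context on what such a server has to emit: backend messages in the postgres wire protocol are a one-byte type tag followed by a big-endian int32 length that counts itself plus the payload. A minimal sketch (independent of the gist's code) building the AuthenticationOk message:

```python
import struct

# Postgres backend messages: 1-byte type tag, then a big-endian int32
# length that includes itself plus the payload.
# AuthenticationOk is tag 'R' with a single int32 payload of 0.

def auth_ok() -> bytes:
    payload = struct.pack("!i", 0)  # auth code 0 = AuthenticationOk
    return b"R" + struct.pack("!i", 4 + len(payload)) + payload

print(auth_ok())  # b'R\x00\x00\x00\x08\x00\x00\x00\x00'
```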
@miku
miku / clone-all-gists.sh
Last active March 3, 2026 14:07 — forked from chenchun/clone-all-gists.sh
clone all gists and better search them with your own tools #github #gists
#!/bin/bash
set -eu -o pipefail
token="${GITHUB_TOKEN:?Set GITHUB_TOKEN before running this script}"
# fetch one page of the gist list and clone new repos or update existing ones
clone_or_pull() {
local page="$1"
local tmpfile
tmpfile=$(mktemp)
trap "rm -f '$tmpfile'" RETURN
@miku
miku / README.md
Last active January 16, 2026 15:36

Primo Search

Browser automation tool to search FU Berlin's Primo library catalog and capture results.

Setup

uv venv .venv
uv pip install playwright --python .venv/bin/python
.venv/bin/playwright install chromium
@miku
miku / radioscript.go
Created November 20, 2025 23:33
radioscript
// radioscript captures a radio stream (this is a mostly complete script, taken from a not-yet-published project)
package main
import (
"bufio"
"crypto/sha1"
"errors"
"flag"
"fmt"
"io"
@miku
miku / celluloidTV.m3u
Created November 4, 2025 09:50 — forked from Axel-Erfurt/celluloidTV.m3u
Live streams of German TV channels
#EXTM3U
#EXTINF:-1,ARD
https://daserste-live.ard-mcdn.de/daserste/live/hls/de/master.m3u8
#EXTINF:-1,ARD ONE
https://mcdn-one.ard.de/ardone/hls/master.m3u8
#EXTINF:-1,ARD Alpha
https://mcdn.br.de/br/fs/ard_alpha/hls/de/master.m3u8
#EXTINF:-1,ARD Tagesschau
https://tagesschau.akamaized.net/hls/live/2020115/tagesschau/tagesschau_1/master.m3u8
#EXTINF:-1,ZDF
@miku
miku / README.md
Created September 25, 2025 14:44 — forked from Artefact2/README.md
GGUF quantizations overview

Which GGUF is right for me? (Opinionated)

Good question! I am collecting human data on how quantization affects outputs. See here for more information: ggml-org/llama.cpp#5962

In the meantime, use the largest quantization that fully fits in your GPU. If you can comfortably fit Q4_K_S, try using a model with more parameters.
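A rough back-of-envelope check for whether a quant fits: file size is roughly parameters times bits-per-weight divided by 8. The bits-per-weight figures below are approximate assumptions for illustration, not exact llama.cpp numbers, and real usage needs extra headroom for the KV cache.

```python
# Rough size estimate for a quantized model: params * bits-per-weight / 8.
# The bits-per-weight values are approximate assumptions for illustration;
# real GGUF sizes vary by tensor mix, and the KV cache needs headroom too.

APPROX_BPW = {"Q8_0": 8.5, "Q6_K": 6.6, "Q5_K_M": 5.7, "Q4_K_S": 4.6, "Q3_K_M": 3.9}

def approx_size_gb(n_params_billion: float, quant: str) -> float:
    """Approximate file size in GB for a model with n_params_billion parameters."""
    bits = n_params_billion * 1e9 * APPROX_BPW[quant]
    return bits / 8 / 1e9

for quant in ("Q8_0", "Q4_K_S"):
    print(f"7B at {quant}: ~{approx_size_gb(7, quant):.1f} GB")
```

By this estimate a 7B model at Q4_K_S lands around 4 GB, which is why it comfortably fits where an 8-bit quant of the same model would not.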

llama.cpp feature matrix

See the wiki upstream: https://github.com/ggerganov/llama.cpp/wiki/Feature-matrix

@miku
miku / ollama_fast_speech_text_speech.py
Created September 9, 2025 09:28 — forked from lucataco/ollama_fast_speech_text_speech.py
speech to text to speech using Ollama
""" To use: install Ollama, clone OpenVoice, run this script in the OpenVoice directory
brew install portaudio
brew install git-lfs
git lfs install
git clone https://github.com/myshell-ai/OpenVoice
cd OpenVoice
git clone https://huggingface.co/myshell-ai/OpenVoice
cp -r OpenVoice/* .
@miku
miku / ML_Workflow.md
Created August 30, 2025 23:51 — forked from ZohebAbai/ML_Workflow.md
Universal Workflow of a Machine Learning Problem

Universal Workflow for Approaching a Machine Learning Problem

Define the Problem and Assemble a Dataset:

  • What is your input data?
  • What are you trying to predict?
  • What type of problem is it - Supervised? Unsupervised? Self-Supervised? Reinforcement Learning?
  • Be aware of the hypotheses that you are making at this stage: