Tokenization
const text = "hello world";
const tokens = text.split(/\s+/); // → ["hello", "world"]
Animations, code snippets, and interactive demos in one place. The project is just getting started; something bigger will appear here soon.
from transformers import pipeline

nlp = pipeline("text-generation")
nlp("Hello, ")
fetch('/api/llm', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' }, // mark the body as JSON
  body: JSON.stringify({ prompt: "Hello" })
})
context_window: 32k
params: 7.2B
latency: ~180ms
// updated: 2026-04-23