Zero dependencies. One script tag. Multi-user chat, streaming LLM integration, 11 built-in themes, markdown with syntax highlighting and math, message visibility for tool calls, history export/import, RTL support — all in a single HTML file with no build step. View source on any example to see how.
Theme switching, title/input toggles, and alternating message colors. The simplest way to use QuikChat.
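A minimal usage sketch of the pattern this demo shows. The constructor form and the `messageAddNew(content, user, align)` signature are assumptions drawn from QuikChat's vanilla-JS usage; check the library README for exact parameters:

```javascript
// Assumed browser-side setup, shown commented so the handler below
// stays self-contained:
// const chat = new quikchat('#chat-container', onSend);

// Send handler: the widget is assumed to call this when the user hits Send,
// passing the chat instance and the typed message.
function onSend(chat, message) {
  chat.messageAddNew(message, 'You', 'right');           // echo the user's message
  chat.messageAddNew(`Echo: ${message}`, 'Bot', 'left'); // a canned reply
}
```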
Using QuikChat as an ES module with import.
Two independent chat widgets that send messages to each other.
Bold, italic, code blocks, tables, lists, links. Uses the quikchat-md build with quikdown. No network needed.
Syntax highlighting, math (MathJax), diagrams (Mermaid), maps (Leaflet), SVG, CSV tables. Renderers load from CDN on demand.
Local LLM chat using Ollama. Basic request/response pattern.
Conversational context via historyGet() — the LLM remembers what was discussed.
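A sketch of the context pattern this demo uses. `historyGet()` is the accessor the demo names; the entry shape (`{ role, content }`) is an assumption here, so inspect the actual output before relying on it:

```javascript
// Hypothetical adapter: map QuikChat history entries to the
// OpenAI/Ollama-style { role, content } message list that chat APIs expect.
function historyToMessages(history) {
  return history.map(entry => ({
    role: entry.role || 'user', // fall back to 'user' when an entry has no role
    content: entry.content,
  }));
}

// Usage sketch in the browser with a live widget (endpoint shown is
// Ollama's default local chat API):
// const messages = historyToMessages(chat.historyGet());
// fetch('http://localhost:11434/api/chat', {
//   method: 'POST',
//   body: JSON.stringify({ model: 'llama3', messages, stream: false }),
// });
```

Passing the full mapped history on every request is what lets the LLM "remember" earlier turns.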
GPT-4o with token streaming. Works with any OpenAI-compatible API.
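A sketch of the token-streaming pattern. `messageAddNew` and `messageAppendContent` are assumed QuikChat methods here, with `messageAddNew` assumed to return a message id that `messageAppendContent` accepts; verify both against the library docs:

```javascript
// Stream tokens into a single growing chat bubble.
// `tokens` can be any async iterable, e.g. parsed SSE chunks from an
// OpenAI-compatible streaming endpoint.
async function streamToChat(chat, tokens) {
  const msgId = chat.messageAddNew('', 'Assistant', 'left'); // start an empty bubble
  for await (const token of tokens) {
    chat.messageAppendContent(msgId, token); // append each token as it arrives
  }
  return msgId;
}
```

Appending to one message id, rather than adding a message per token, is what produces the familiar "typing" effect.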
QuikChat is vanilla JS — it works with any framework. No framework-specific packages needed.
useRef + useEffect — live demo and integration pattern.
Template ref + onMounted — live demo and integration pattern.
Ref variable + onMount — live demo and integration pattern.
bind:this + onMount — code example for Svelte 4/5.
ViewChild + ngAfterViewInit — code example.
Pre-built React wrapper component with useRef and forwardRef.
Read-only log display with live feed, role-based color coding, and tag-based filtering (All / Errors / Warnings).
Deployment timeline with timestamps, automated events (system) and human actions (user) color-coded by role.
Hide internal tool calls with visible: false and reveal them with a toggle using messageSetVisibleByTag().
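A sketch of the toggle this demo describes. `messageSetVisibleByTag()` is the method the demo names; the `(tag, visible)` signature and the `'tool-call'` tag name below are assumptions:

```javascript
// Build a click handler that flips visibility for all messages
// carrying a given tag (e.g. tool calls added with visible: false).
function makeTagToggle(chat, tag) {
  let visible = false; // tool calls start hidden
  return function toggle() {
    visible = !visible;
    chat.messageSetVisibleByTag(tag, visible); // assumed (tag, bool) signature
    return visible;
  };
}
```

Usage sketch: `button.onclick = makeTagToggle(chat, 'tool-call');` where `'tool-call'` is whatever tag the hidden messages were added with.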
Export a full chat session as JSON with historyExport() and restore it into another widget with historyImport().
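The round trip this demo performs, as a small sketch. `historyExport()` and `historyImport()` are the methods the demo names; their data format is treated as opaque here:

```javascript
// Move a full session from one widget to another.
// The exported data could equally be persisted (localStorage, a file,
// a server) and imported later.
function transferHistory(source, target) {
  const data = source.historyExport(); // serialize the session
  target.historyImport(data);          // restore it in the other widget
  return data;
}
```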
Load 10,000 or 50,000 messages and scroll smoothly. Virtual scrolling activates automatically at 500 messages.
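A sketch of how such a stress test can be driven. The `messageAddNew(content, user, align)` signature is an assumption; the 500-message threshold for automatic virtual scrolling is stated by the demo itself:

```javascript
// Push n synthetic messages into a widget to exercise the virtual-scrolling
// path (documented to engage automatically at 500+ messages).
function loadSyntheticMessages(chat, n) {
  for (let i = 0; i < n; i++) {
    const fromUser = i % 2 === 0; // alternate speakers for a realistic feed
    chat.messageAddNew(`Message #${i}`,
                       fromUser ? 'User' : 'Bot',
                       fromUser ? 'right' : 'left');
  }
  return n;
}
```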
Side-by-side English (LTR) and Arabic (RTL) chat widgets with a button to swap directions at runtime.