Install

CDN (no build step)

<link rel="stylesheet" href="https://unpkg.com/quikchat/dist/quikchat.css">
<script src="https://unpkg.com/quikchat/dist/quikchat.umd.min.js"></script>

npm

npm install quikchat

ESM import

import quikchat from 'quikchat';

Quick Start

Create a container, include QuikChat, and initialize with a callback:

<div id="chat" style="height: 400px;"></div>
<script>
const chat = new quikchat("#chat",
  (chat, msg) => {
    chat.messageAddNew(msg, "me", "right");
    // send msg to your backend / LLM and post the response:
    chat.messageAddNew("Got it!", "bot", "left");
  },
  {
    theme: "quikchat-theme-light",
    titleArea: { title: "My Chat", align: "left", show: true }
  }
);
</script>

The callback fires when the user clicks Send or presses Shift+Enter. Messages are not auto-echoed, so you control what appears and when.
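For streaming responses, a common pattern is to add an empty bot message and append chunks to it as they arrive. The sketch below assumes messageAddNew returns the new message's id and that a messageAppendContent(id, text) method is available; verify both against your quikchat version before relying on them:

```javascript
// Sketch: stream an LLM reply into a single chat bubble.
// Assumes messageAddNew returns a message id and messageAppendContent
// appends text to that message (check your quikchat build's API).
function streamReply(chat, chunks) {
  const id = chat.messageAddNew("", "bot", "left"); // empty placeholder bubble
  for (const chunk of chunks) {
    chat.messageAppendContent(id, chunk); // grow the bubble chunk by chunk
  }
  return id;
}
```

With a real streaming backend you would call messageAppendContent from inside your fetch or SSE read loop rather than iterating an in-memory array.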

Try it (base build, ~5 KB)

This is the base build — no markdown, no extras, just a clean chat widget. Type a message and press Send. Use the buttons to toggle UI elements and switch themes.

Three Build Tiers

Same API, different levels of content rendering:

Build     Script tag                    Size (gzip)  What you get
Base      quikchat.umd.min.js           ~5 KB        Chat widget, zero dependencies
Markdown  quikchat-md.umd.min.js        ~9 KB        + basic markdown (bold, italic, code, tables)
Full      quikchat-md-full.umd.min.js   ~14 KB       + syntax highlighting, math, maps, diagrams, SVG

Each build is a single script tag with zero npm dependencies. See the landing page demo for the full build in action.

LLM Integration

QuikChat tracks full message history. Feed it directly to any OpenAI-compatible API:

const history = chat.historyGet().map(m => ({
  role: m.role,
  content: m.content
}));

const response = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Bearer " + API_KEY
  },
  body: JSON.stringify({ model: "gpt-4o", messages: history })
});

Works with OpenAI, Ollama, LM Studio, Anthropic, Mistral, or any provider with a compatible API. See the Ollama and OpenAI examples for complete working code.
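To close the loop in the Quick Start callback, parse the completion response and post the assistant's reply back into the widget. The field names below follow the OpenAI chat completions response shape; other providers may differ:

```javascript
// Pull the assistant's text out of an OpenAI-style chat completion body.
// choices[0].message.content is the OpenAI shape; adjust for your provider.
function extractReply(completion) {
  return completion?.choices?.[0]?.message?.content ?? "";
}

// In the send callback, after the fetch above:
//   const body = await response.json();
//   chat.messageAddNew(extractReply(body), "bot", "left");
```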

Next Steps