Ollama QuikChat Demo

This demo shows two chat widgets connected to a local LLM via Ollama. One widget uses Ollama's completion API, where a single request returns the full reply at once; the other uses the streaming API, where the reply arrives incrementally as it is generated.
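The two widgets differ only in how they call Ollama's `/api/generate` endpoint. Below is a minimal sketch of both call styles; the model name `llama3` is an assumption, so substitute whatever `ollama list` shows on your machine.

```ts
const OLLAMA_URL = "http://localhost:11434/api/generate";

// Completion style: one request, one JSON reply containing the full answer.
async function complete(prompt: string): Promise<string> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", prompt, stream: false }),
  });
  const data = await res.json();
  return data.response; // the full generated text
}

// Streaming style: Ollama returns newline-delimited JSON chunks; each chunk
// carries a text fragment in `response`, and the final chunk has `done: true`.
async function streamCompletion(prompt: string, onToken: (t: string) => void) {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", prompt, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffered = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? ""; // keep any partial trailing line for later
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line); // one JSON object per line
      if (chunk.response) onToken(chunk.response);
    }
  }
}
```

The streaming widget appends each `onToken` fragment to the current message as it arrives, while the completion widget waits and renders the whole reply in one step.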

Neither chat has conversational memory, so each response is independent of the previous chat messages.
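Statelessness here simply means each request body carries only the latest prompt. As a hypothetical contrast (not part of this demo), memory could be added by sending the accumulated turns to Ollama's `/api/chat` endpoint, which accepts a `messages` array:

```ts
// Sketch only: this demo does NOT do this. Each widget sends a lone prompt,
// so the model never sees earlier turns. To add memory, requests would
// instead carry the full history via /api/chat.
type Turn = { role: "user" | "assistant"; content: string };
const history: Turn[] = [];

async function sendWithMemory(userText: string): Promise<string> {
  history.push({ role: "user", content: userText });
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // "llama3" is an assumed model name; use any installed model.
    body: JSON.stringify({ model: "llama3", messages: history, stream: false }),
  });
  const data = await res.json();
  history.push(data.message); // { role: "assistant", content: "..." }
  return data.message.content;
}
```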

Note: This demo only works when run locally with Ollama listening on port 11434 (the default). It will not work on the GitHub demo pages.
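A quick way to confirm Ollama is reachable before loading the demo is to hit its `/api/tags` endpoint, which lists installed models and only responds when the server is up:

```ts
// Returns true if an Ollama server answers on the default port.
async function ollamaIsRunning(): Promise<boolean> {
  try {
    const res = await fetch("http://localhost:11434/api/tags");
    return res.ok;
  } catch {
    return false; // connection refused: Ollama is not running locally
  }
}

ollamaIsRunning().then((up) =>
  console.log(up ? "Ollama reachable on :11434" : "Start Ollama first (ollama serve)"),
);
```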