This is a demo of two chat widgets connected to a local LLM via Ollama. One widget uses the Ollama completion API, which returns the full reply in a single response, and the other uses the Ollama streaming API, which delivers the reply token by token as it is generated.
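For illustration, here is a minimal sketch of the two call styles against Ollama's `/api/generate` endpoint. The model name `llama3` and the default `localhost:11434` port are assumptions, not part of this demo's actual configuration:

```ts
const OLLAMA_URL = "http://localhost:11434/api/generate"; // assumed default port
const MODEL = "llama3"; // assumed model name; any pulled model works

// Completion-style call: with `stream: false`, Ollama returns one JSON
// object whose `response` field holds the entire reply.
async function complete(prompt: string): Promise<string> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: MODEL, prompt, stream: false }),
  });
  const data = await res.json();
  return data.response;
}

// Streaming call: with `stream: true` (Ollama's default), the body is
// newline-delimited JSON; each chunk's `response` field carries the next
// token(s), and the final chunk is marked with `done: true`.
async function streamComplete(
  prompt: string,
  onToken: (token: string) => void,
): Promise<void> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: MODEL, prompt, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Each complete line in the buffer is one JSON chunk.
    let newline: number;
    while ((newline = buffer.indexOf("\n")) >= 0) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (!line) continue;
      const chunk = JSON.parse(line);
      if (chunk.response) onToken(chunk.response);
    }
  }
}
```

The streaming variant is what lets the second widget render the reply incrementally instead of waiting for the full completion.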
Neither chat has conversational memory, so each response is independent of the previous messages.
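This follows from the request shape: each call carries only the current `prompt` and no prior turns, so consecutive calls are independent. A sketch, reusing the hypothetical `complete` helper above:

```ts
// Each request sends only the current message, so the model sees no history:
await complete("My name is Ada.");  // hypothetical first message
await complete("What is my name?"); // the model has no memory of the first call
```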