www.ai.infinus.ca (38.154.241.74) Public Scan

URL: https://www.ai.infinus.ca/
Submission: On July 16 via api from US — Scanned from CA

Form analysis: 0 forms found in the DOM

Text Content

CHAT WITH OLLAMA


MODEL CONTROLS

System Prompt
Hostname
Model:
History: Select a chat
Delete Chat Save Chat

Send

ENTER YOUR NAME


Close Save

UNABLE TO ACCESS OLLAMA SERVER

Ollama-ui was unable to communicate with Ollama due to the following error:

Failed to fetch

--------------------------------------------------------------------------------

How can I expose the Ollama server?

By default, Ollama allows cross-origin requests only from 127.0.0.1 and 0.0.0.0.

To support more origins, you can use the OLLAMA_ORIGINS environment variable:

OLLAMA_ORIGINS=https://www.ai.infinus.ca ollama serve


Or, to allow requests from any origin, use OLLAMA_ORIGINS=* instead.
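
If Ollama runs as a background service rather than from your shell, the variable has to be set where the service can see it. A minimal sketch for a systemd-managed install on Linux (the unit name ollama.service and the override contents are assumptions; adjust for your setup):

# Create a systemd override so OLLAMA_ORIGINS persists across restarts
# (assumes the service is named ollama.service)
sudo systemctl edit ollama.service
# In the editor, add:
#   [Service]
#   Environment="OLLAMA_ORIGINS=https://www.ai.infinus.ca"
# Then reload the unit files and restart the service:
sudo systemctl daemon-reload
sudo systemctl restart ollama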

Also see: https://github.com/jmorganca/ollama/blob/main/docs/faq.md
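
To check whether an origin is actually being accepted, you can send a request with an Origin header and inspect the CORS response headers. A rough sketch, assuming Ollama is listening on its default address 127.0.0.1:11434:

# Simulate the browser's cross-origin request (default Ollama port assumed)
curl -i http://127.0.0.1:11434/api/tags \
  -H "Origin: https://www.ai.infinus.ca"
# If the origin is allowed, the response should include an
# Access-Control-Allow-Origin header; if it is missing, the browser
# will block the request and the UI shows "Failed to fetch".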