Access the Inference Service Chat UI

Browse to llm.jetstream-cloud.org. At the login page, click “Continue with ACCESS single sign-on”. Log in with the “ACCESS CI (XSEDE)” identity provider, the same way that you log into other Jetstream2 interfaces like Exosphere.

screenshot of Open WebUI signup page

Once you’re signed in, there are several ways to interact.

  • You can chat with the LLM via text.
  • You can provide audio input (which it will transcribe to text), or start a “call” where you speak your prompt and it will speak a response.
  • You can upload a file and ask questions about its contents.
  • You can enable web search, and the LLM will use the web to help answer your prompt.
    • (Currently, this may result in an error or no results found, as DuckDuckGo rate-limits our web search API calls.)
  • You can generate an API token to use Open WebUI as an API proxy
  • You can set up Retrieval-Augmented Generation with your own source documents.
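For the API proxy use case, here is a minimal sketch of calling the service programmatically. It assumes Open WebUI's standard OpenAI-compatible chat completions endpoint and Bearer-token authentication; the token placeholder and model name are examples, so substitute a token generated in your Open WebUI account settings and a model actually offered by the service.

```python
import json

# Assumed values: replace API_TOKEN with a token generated in Open WebUI
# (Settings > Account) and MODEL with a model the service actually hosts.
API_BASE = "https://llm.jetstream-cloud.org/api"
API_TOKEN = "sk-your-token-here"
MODEL = "example-model"


def build_chat_request(prompt: str):
    """Build headers and body for Open WebUI's OpenAI-compatible
    /api/chat/completions endpoint."""
    headers = {
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    }
    body = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, body


headers, body = build_chat_request("What is Jetstream2?")
print(json.dumps(body, indent=2))

# To actually send the request (requires the third-party `requests`
# package and a valid token):
#   import requests
#   r = requests.post(f"{API_BASE}/chat/completions",
#                     headers=headers, json=body)
#   print(r.json()["choices"][0]["message"]["content"])
```

Because the request body follows the OpenAI chat completions format, many existing OpenAI-compatible client libraries can also be pointed at this endpoint by overriding their base URL.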

Consult the Open WebUI documentation for more detail.