Misc. bug: New web UI is lame with number of existing conversations #16347

@gnusupport

Description

Name and Version

llama-cli --version
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 3090, compute capability 8.6, VMM: yes
version: 2279 (de41f2b)
built with cc (Debian 14.2.0-19) 14.2.0 for x86_64-linux-gnu

Operating systems

Linux

Which llama.cpp modules do you know to be affected?

llama-server

Command line

/usr/local/bin/llama-server --jinja -fa on -c 32768 -ngl 999 -v --log-timestamps --host 192.168.1.68 -m /mnt/nvme0n1/LLM/quantized/openai_gpt-oss-20b-Q8_0.gguf

Problem description & steps to reproduce

The new web UI is quite different from the old one and lacks responsiveness. If I open it in a fresh browser instance, or in a private window, it looks fine and works well.

However, my existing browser storage holds many conversations — not an excessive number, but many.

And in that case the new web UI is completely unresponsive. I can see the chat area and the icons in the upper left and right, but:

  • the left icon (show/hide conversations) does not work (I guess it needs time before it starts responding, but I did not measure how long)
  • the right icon (settings) also does not work
  • when I start typing anything in the chat area, I get an empty result, ending with a completely empty conversation

The way to revive it is to start clicking on existing conversations, provided they eventually open. Then maybe I say "hello" and it starts working.

The new web UI behaves unpredictably and is missing functionality.
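For anyone trying to reproduce this without an old conversation history, here is a minimal sketch that generates a large fake history of the kind described above. This is an assumption-laden illustration: the record shape, the storage key, and the sizes are all hypothetical — the actual schema the llama-server web UI uses to persist conversations may differ.

```javascript
// Hypothetical sketch: build a large fake conversation history to stress the
// web UI's startup path. Field names and sizes below are guesses, not the
// real llama-server web UI schema.
const N = 500; // number of fake conversations (assumption)
const conversations = [];
for (let i = 0; i < N; i++) {
  conversations.push({
    id: `conv-${i}`,
    title: `Test conversation ${i}`,
    // 20 longish messages per conversation to inflate the payload
    messages: Array.from({ length: 20 }, (_, j) => ({
      role: j % 2 === 0 ? "user" : "assistant",
      content: "lorem ipsum ".repeat(200),
    })),
  });
}
const payload = JSON.stringify(conversations);
// In a browser devtools console you would then persist this wherever the UI
// reads it from, e.g. (key name is a guess):
//   localStorage.setItem("conversations", payload);
console.log(`${N} conversations, ~${(payload.length / 1e6).toFixed(1)} MB of JSON`);
```

Loading the UI after seeding storage this way should show whether startup time scales with the stored history, which would match the symptoms above (fine in a private window, unresponsive with old storage).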

First Bad Commit

No response

Relevant log output
