Ollama-run models not appearing in “Prompts” or “Review” dropdowns in Qt AI Assistant

mat7747 wrote:
    What happens

    1. I start the Ollama server and load both models with:

    ollama run codellama:7b-code
    ollama run phi3

    2. Verifying with ollama ps shows both models as running (a quick check against the Ollama HTTP API is sketched after this list).

    3. In Qt Creator → Preferences → AI Assistant → General, I see the “through Ollama” entries under Code completion, and I can successfully select and use them for code completion.

    4. However, under the Prompts and Review dropdowns, no “through Ollama” options appear at all.
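
    For reference, this is how I double-check that both models are actually exposed over Ollama's HTTP API on the default port 11434 (the same address the plugin points at); both codellama:7b-code and phi3 should appear in the returned "models" array:

    # List the models Ollama serves over HTTP; presumably the plugin discovers
    # models through this same API.
    curl -s http://localhost:11434/api/tags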


    What I’ve checked

    Model names exactly match those listed in ollama ps (including the colon in codellama:7b-code).

    Qt Creator was fully restarted after each change.

    The Advanced tab does not show a “+” button, so I manually edited aiassistant.json in my Qt settings folder:

    {
      "customProviders": [
        {
          "name": "phi3 via Ollama",
          "apiType": "chat",
          "serverUrl": "http://localhost:11434",
          "model": "phi3"
        }
      ]
    }

    But “phi3 via Ollama” still does not appear under Prompts or Review.

    I have confirmed that Prompts/Review require chat-compatible models, and phi3 should qualify.
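
    To rule out the model side, phi3 can also be queried directly on Ollama's chat endpoint; a normal JSON reply here would confirm the model itself handles chat requests, so the missing dropdown entries would not be a model limitation:

    # Send one non-streamed chat request to phi3 via Ollama's /api/chat endpoint.
    curl -s http://localhost:11434/api/chat -d '{
      "model": "phi3",
      "messages": [{ "role": "user", "content": "Say hello" }],
      "stream": false
    }'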


    Questions

    1. Is this a current limitation or bug in the Qt AI Assistant plugin?

    2. Has anyone successfully used Ollama-run models (especially non-CodeLlama ones) in the Prompts or Review features? If so, what extra steps are needed?

    3. Are there any hidden settings or log files I should inspect to troubleshoot model discovery for chat features?

    Thank you in advance for any insights or suggestions!
