Unclear format/naming convention to integrate my local Mistral server

Infos:

  • Used Zammad version: 7.0.0-1773065885.71cc9d5f.noble
  • Used Zammad installation type: package
  • Operating system: Ubuntu 24.04.4 LTS
  • Browser + version: latest Firefox

Expected behavior:

  • The expected format/naming convention for the LLM model name is listed or documented somewhere.

Actual behavior:

  • It is just an empty free-text field, and I could not find any documentation on the required format or naming convention.

Steps to reproduce the behavior:

  • Go to AI → Provider
  • Choose Custom (OpenAI Compatible)
  • Enter any model name: there is no indication of how the LLM model should be written, whether it is supported, or whether I used the wrong format/naming convention.
  • When I activate the AI Assistant, I get the following output (I have already tried multiple formats):
server: nginx/1.24.0 (Ubuntu)
date: Thu, 12 Mar 2026 13:28:49 GMT
content-type: application/json
content-length: 123
connection: close

{
  "object": "error",
  "message": "The model `Mistral 3.1 small` does not exist.",
  "type": "NotFoundError",
  "param": null,
  "code": 404
}
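For reference, the error body above follows the usual OpenAI-compatible error shape, so it can be inspected programmatically. A minimal sketch (the JSON string is the response shown above; treating a 404 with "does not exist" as a wrong-model-name indicator, rather than an authentication or connectivity problem, is an assumption about this error format):

```python
import json

# Error body returned by the OpenAI-compatible endpoint (copied from above).
body = """
{
  "object": "error",
  "message": "The model `Mistral 3.1 small` does not exist.",
  "type": "NotFoundError",
  "param": null,
  "code": 404
}
"""

err = json.loads(body)

# A 404 with "does not exist" in the message points at a rejected model
# name; auth failures would typically surface as 401/403 instead.
if err.get("code") == 404 and "does not exist" in err.get("message", ""):
    print("Model name rejected by the server:", err["message"])
```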

It would be nice if I could get a list, or a link to the format/naming convention, or an overview of what is implemented so far, so that I can check what is wrong.

You should write the LLM name exactly as the tool you are using expects it :slight_smile: We cannot say anything definitive about it on our end.

Normally it’s something in this direction: mistral-small3.2


Thanks for your input, I solved it.

FYI:
In my case I could list the models with this curl command:

curl https://MY_LOKAL_AI/v1/models \
  -H "Authorization: Bearer $API_KEY"

The required format in my case was:

/models/Mistral-Small-3.1-24B-Instruct-2503-HF-FP8-dynamic

and not:
Mistral-Small-3.1-24B-Instruct-2503-HF-FP8-dynamic
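To double-check which string the server expects, the `id` field of each entry in the `/v1/models` response is the name to paste into Zammad verbatim, including any path-like prefix. A minimal sketch of extracting those IDs, assuming the standard OpenAI-compatible response shape (the sample response below is illustrative, built around the model name from this thread, not real server output):

```python
import json

# Illustrative /v1/models response in the standard OpenAI-compatible shape.
sample = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "/models/Mistral-Small-3.1-24B-Instruct-2503-HF-FP8-dynamic",
     "object": "model"}
  ]
}
""")

# The "id" field is exactly what must be entered as the model name,
# prefix and all.
model_ids = [m["id"] for m in sample["data"]]
for model_id in model_ids:
    print(model_id)
```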

Have a nice day.

Nice that you found it :slight_smile:

In some future iteration, the idea is to also have a dropdown selection, but for a start we decided on a simpler solution, because we may change this section a bit more, so that you can have multiple providers and also select a provider/LLM per feature. Let’s see what the future brings :smiley:


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.