Seed and temperature settings don't work as expected #394
Comments
Hi, thanks for your report.

Commit 5532d9a
Cool, thanks. Btw, try it out with the 0 seed. From what I'm seeing, when you regenerate the bot's response, the first regen gives a different response, but then every new regen is the same, as if the random seed gets stuck. If I close Alpaca and regenerate the response again, it is different once, and then the same again. I'm only using an external Ollama instance at the moment, so maybe it's something in the API, but when I restart Alpaca, the server and model stay running, so I'm not sure. Not a big deal, I'm just not sure if that's exactly the intent. I'd expect a random seed to either generate something new every time or keep the seed for the entire session/chat; this is kind of in between. Just curious.
I think I misunderstood the API, or it might have changed at some point: to make the seed random I shouldn't send 0, I should just not specify any seed. I'm working on fixing it so that when it detects that the user selected 0, no seed is sent.
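For reference, a minimal sketch of that approach against Ollama's /api/generate endpoint. This is an illustration rather than Alpaca's actual code; the build_options helper and the model name are placeholders.

```python
import requests

def build_options(temperature: float, seed: int) -> dict:
    # Treat seed == 0 as "random": omit the key entirely rather than
    # sending 0, since any seed value makes Ollama's sampling deterministic.
    options = {"temperature": temperature}
    if seed != 0:
        options["seed"] = seed
    return options

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # placeholder model name
        "prompt": "Hello",
        "stream": False,
        "options": build_options(temperature=0.8, seed=0),
    },
)
print(response.json()["response"])
```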
Alright, I tested this fix and it seems to be working as expected (4 regenerations with the same model, all different results).
Alpaca 2.8 Flatpak, Fedora 41
Alpaca has settings for temperature and seed, which I assume should mirror the equivalent settings in Ollama.
I don't think these work as they should.
First off, Ollama supports temperature from 0 to 2.0, while in Alpaca it only goes up to 1.0.
And lowering the temperature doesn't seem to have much effect: between 0 and 1 there's barely any difference, and if anything the variations may just be random, while changing it in Ollama by as little as 0.1 can be significant.
As for the seed, setting a specific seed (other than 0) seems to work, but when it's at 0 I often get the exact same response to the same prompt (though not always, which is weird). This shouldn't happen when the seed is random, and I haven't seen it with plain Ollama.
Or do these settings do something other than what I think they do?
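A quick way to check the seed behavior directly against a local Ollama instance, outside Alpaca; a sketch, with the model and prompt as placeholders:

```python
import requests

def generate(seed=None):
    # Omit the seed for random sampling; pass an int for reproducible output.
    options = {"temperature": 0.8}
    if seed is not None:
        options["seed"] = seed
    r = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3", "prompt": "Tell me a joke.",
              "stream": False, "options": options},
    )
    return r.json()["response"]

print(generate(seed=42) == generate(seed=42))  # fixed seed: expect True
print(generate() == generate())                # no seed: usually False
```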
Also, a request: would you consider adding more options? Especially context length, which would be super useful, as the default seems very small (AFAIK Ollama defaults to 2k tokens unless the model specifies otherwise). And perhaps the system prompt, which changes the "chat" experience the most and would make it easier to compare different models.
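For what it's worth, both requests map onto existing fields in Ollama's /api/chat payload. A rough sketch, assuming a local instance; the model name and values are placeholders:

```python
import requests

payload = {
    "model": "llama3",  # placeholder model name
    "messages": [
        # A "system" role message sets the system prompt for the chat.
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Hello"},
    ],
    "stream": False,
    "options": {
        "num_ctx": 8192,  # context length; Ollama defaults to 2048
        "temperature": 0.8,
    },
}
reply = requests.post("http://localhost:11434/api/chat", json=payload)
print(reply.json()["message"]["content"])
```

Since both are per-request fields rather than server-side state, a client could expose them as ordinary per-chat settings.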