
Seed and temperature settings don't work as expected #394

Open
ihatemakinganaccount opened this issue Dec 7, 2024 · 5 comments

Comments

@ihatemakinganaccount

Alpaca 2.8 Flatpak, fedora 41

Alpaca has settings for temperature and seed, which I assume should mirror these settings in Ollama.

I don't think these work as they should.

First off, Ollama supports temperature from 0 to 2.0, while in Alpaca the slider only goes up to 1.0.

And lowering the temperature doesn't seem to have much effect. Between 0 and 1 there's barely any difference; if anything, the variations may just be random. Meanwhile, changing it by just 0.1 in Ollama itself can be significant.

As for the seed: setting a specific seed (other than 0) seems to work, but when it's at 0 I often get the exact same response for the same prompt (though not always, which is weird). That shouldn't happen when the seed is random, and I've not seen it with plain Ollama.

Or do these settings do something else than what I think they should?

Also, a request: would you consider adding more options? Context length especially would be super useful, since the default seems very small (AFAIK Ollama defaults to 2k unless the model specifies otherwise). And perhaps the system prompt, which changes the "chat" experience the most and would make it easier to compare different models.
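For context, the settings discussed above map onto per-request `options` in Ollama's REST API. Below is a minimal sketch of how such a request body could be assembled; the option names (`temperature`, `seed`, `num_ctx`) come from Ollama's API, while the model name and helper function are placeholders, not Alpaca's actual code.

```python
# Sketch of an Ollama /api/generate request body with sampling options.
# Option names follow Ollama's API; everything else here is hypothetical.
import json

def build_generate_payload(prompt, temperature=0.8, seed=None, num_ctx=2048):
    """Assemble a request body; omit "seed" entirely for random sampling."""
    options = {
        "temperature": temperature,  # Ollama accepts 0.0 up to 2.0
        "num_ctx": num_ctx,          # context window; Ollama defaults to 2048
    }
    if seed is not None:
        options["seed"] = seed       # a fixed seed makes output reproducible
    return {"model": "llama3", "prompt": prompt, "options": options}

payload = build_generate_payload("Hello", temperature=1.5, seed=42)
print(json.dumps(payload["options"], sort_keys=True))
# -> {"num_ctx": 2048, "seed": 42, "temperature": 1.5}
```

Note that leaving `seed` out of `options` is what lets the server pick a fresh seed per request, which becomes relevant later in this thread.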

@Jeffser
Owner

Jeffser commented Dec 9, 2024

Hi, thanks for your report

  • Thanks for telling me about the actual temperature limits; I'm going to change that right now
  • Setting the seed to 0 is supposed to act as a "random seed"; that part is working as intended
  • I'll see if I can add more options, thanks for the suggestions

@Jeffser
Owner

Jeffser commented Dec 9, 2024

5532d9a
Changed the temperature limit

@ihatemakinganaccount
Author

ihatemakinganaccount commented Dec 10, 2024

@Jeffser

Cool, thanks.

Btw, about the 0 seed, try it out. From what I'm seeing, when you regenerate the bot's response, the first regeneration gives a different response, but every regeneration after that is identical, as if the random seed gets stuck. If I close Alpaca and regenerate again, the response is different once more, and then identical again.

I'm only using an external Ollama instance at the moment, so maybe it's something in the API, but when I restart Alpaca the server and model stay running, so... I don't know.

Not a big deal, I'm just not sure if that's the intent. I'd expect a random seed to either generate something new every time, or keep the same seed for the entire session/chat; this is kind of in between. Just curious.

@Jeffser
Owner

Jeffser commented Dec 15, 2024

I think I misunderstood the API, or it might have changed at some point. To make it random I shouldn't send 0; I should just not specify any seed at all. I'm working on a fix so that when the user selects 0, no seed is sent.
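The fix described here boils down to a small conditional when building the request options. This is a minimal sketch of that idea, not Alpaca's actual code:

```python
# Sketch of the fix: treat a seed of 0 in the UI as "random" by leaving
# the "seed" key out of the options dict entirely, so Ollama picks its
# own seed for each request instead of reusing seed=0.
def build_options(temperature, seed):
    options = {"temperature": temperature}
    if seed != 0:               # 0 in the UI means "random seed"
        options["seed"] = seed  # only send a seed the user actually chose
    return options

print(build_options(0.7, 0))    # -> {'temperature': 0.7}
print(build_options(0.7, 42))   # -> {'temperature': 0.7, 'seed': 42}
```

With the key omitted, each regeneration samples with a fresh seed, which matches the behavior tested in the next comment.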

@Jeffser
Owner

Jeffser commented Dec 15, 2024

Alright, I tested this fix and it seems to be working as expected (4 regenerations with the same model, all different results):

0d4a09f
