Conversation

Mihaiii
Contributor

Mihaiii commented Oct 31, 2023

Update server code after #3841 was merged.

@Green-Sky
Collaborator

Green-Sky commented Nov 1, 2023

I think it works? While testing I realized that the "show probabilities" feature is totally bonkers. Not only does it show just the split among the configured number of probabilities (so a setting of 1 always shows 100%), it also forces at least N candidates to remain at sampling time -> if the 2nd token has a 1% probability, it can still be picked, even with a "you need to be at least 90%" min-p...
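
To make the interaction concrete, here is a minimal standalone sketch of min-p filtering with a `min_keep` floor (my own illustration, not the actual llama.cpp code; the `min_p_filter` helper and the assumption that "show probabilities" = N ends up as `min_keep = N` are hypothetical):

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

struct candidate {
    int   id;
    float p; // probability after softmax
};

// Keep tokens whose probability is >= min_p * max_prob, but never fewer
// than min_keep tokens. The min_keep floor is what lets low-probability
// tokens survive when N probabilities are requested.
static void min_p_filter(std::vector<candidate> & cands, float min_p, size_t min_keep) {
    std::sort(cands.begin(), cands.end(),
              [](const candidate & a, const candidate & b) { return a.p > b.p; });

    const float cutoff = min_p * cands.front().p;

    size_t keep = cands.size();
    for (size_t i = 0; i < cands.size(); ++i) {
        if (cands[i].p < cutoff) {
            keep = i;
            break;
        }
    }

    // The floor: never drop below min_keep candidates.
    keep = std::max(keep, std::min(min_keep, cands.size()));
    cands.resize(keep);
}

int main() {
    std::vector<candidate> cands = { {0, 0.90f}, {1, 0.09f}, {2, 0.01f} };

    // min_p = 0.9 would keep only token 0, but min_keep = 3
    // (e.g. "show probabilities" set to 3) keeps all three.
    min_p_filter(cands, 0.90f, 3);

    for (const candidate & c : cands) {
        std::printf("token %d  p = %.2f\n", c.id, c.p);
    }
    return 0;
}
```

With min_p = 0.9 alone only the 90% token would survive, but the floor keeps all three candidates, so the 1% token stays sampleable, matching the behavior described above.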

@ivanstepanovftw
Contributor

ivanstepanovftw commented Nov 1, 2023

> you need to be at least 90% min-p...

... after the first 4 samplers and the penalties that are enabled by default

@ggerganov
Member

@Green-Sky Can you show an example? It seems the "show probabilities" feature works fine (testing on master):

[screenshot: "show probabilities" output on master]

@Green-Sky
Collaborator

I disabled everything else: top-k set to 0, top-p to 1.0, and since temp is applied after min-p and the other samplers, I set it to 1.0 as well.
Set "Show Probabilities" to a larger number to exacerbate the effect.

[screenshots: reproduction with the settings above]
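
For context on why temp = 1.0 is neutral in this setup: temperature divides the logits before the softmax, so t = 1.0 reproduces the plain distribution and the displayed probabilities reflect min-p alone. A small sketch (my own illustration; `softmax_temp` is a hypothetical helper, not server code):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Softmax with temperature: p_i = exp(l_i / t) / sum_j exp(l_j / t).
// Subtracting the max logit first is a standard numerical-stability
// trick and cancels out in the normalization.
static std::vector<float> softmax_temp(const std::vector<float> & logits, float t) {
    std::vector<float> probs(logits.size());
    float max_l = *std::max_element(logits.begin(), logits.end());
    float sum = 0.0f;
    for (size_t i = 0; i < logits.size(); ++i) {
        probs[i] = std::exp((logits[i] - max_l) / t);
        sum += probs[i];
    }
    for (float & p : probs) {
        p /= sum;
    }
    return probs;
}

int main() {
    const std::vector<float> logits = {4.0f, 1.0f, 0.5f};
    // t = 1.0 leaves the distribution unchanged; t = 0.5 sharpens it.
    for (float t : {1.0f, 0.5f}) {
        std::vector<float> probs = softmax_temp(logits, t);
        std::printf("t = %.1f:", t);
        for (float p : probs) {
            std::printf("  %.3f", p);
        }
        std::printf("\n");
    }
    return 0;
}
```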

jhen0409 merged commit 57ad015 into ggml-org:master Nov 9, 2023
olexiyb pushed a commit to Sanctum-AI/llama.cpp that referenced this pull request Nov 23, 2023
* Update server.cpp with min_p after it was introduced in ggml-org#3841

* Use spaces instead of tabs

* Update index.html.hpp after running deps.sh

* Fix test - fix line ending