SkillDistinct4940@alien.top to LocalLLaMA • Your settings are (probably) hurting your model - Why sampler settings matter
1 year ago
I'm using OpenAI GPT models. I'm struggling to get a consistent, identical response every time for an app I'm building, which requires the LLM to respond deterministically to the prompt we feed it. I'm getting mixed results with the defaults.
What top_p and temperature settings should I use?
Would setting just temperature to 0 be enough?
Do I need to set top_p too?
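For reference, here is a minimal sketch of how these parameters are typically pinned with the official `openai` Python SDK. The model name and prompt are placeholders; `temperature=0` makes decoding near-greedy, `top_p` is usually left at its default of 1 (setting both restrictively is discouraged), and the `seed` parameter gives best-effort reproducibility on supported models, though OpenAI does not guarantee bit-identical outputs:

```python
# Sketch: sampling parameters for (near-)deterministic output with the
# OpenAI Chat Completions API. Model name and prompt are placeholders.
params = {
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [{"role": "user", "content": "Classify the sentiment: ..."}],
    "temperature": 0,  # always pick the highest-probability token
    "top_p": 1,        # leave at default; don't restrict both temperature and top_p
    "seed": 42,        # best-effort reproducibility on supported models
}

# Uncomment to actually call the API (requires OPENAI_API_KEY):
# from openai import OpenAI
# client = OpenAI()
# resp = client.chat.completions.create(**params)
# print(resp.choices[0].message.content)
```

Even with these settings, outputs can still vary slightly across runs due to backend nondeterminism, which is why the `seed` parameter is documented as best-effort.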