2 ideas
- use deepseek-coder-1.3b-instruct, not the base model
- check that you use the correct prompting template for the model
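On the second point, the instruct variants of deepseek-coder expect an Alpaca-style instruction/response wrapper around your request. Here's a rough sketch of what that template looks like (the exact system line and spacing are an assumption on my part; double-check the model card or the chat template shipped in the tokenizer config for the official string):

```python
# Sketch of the Alpaca-style prompt wrapper the deepseek-coder instruct
# models are trained on. The exact wording is an assumption -- verify it
# against the model card before relying on it.
def build_prompt(user_request: str) -> str:
    return (
        "You are an AI programming assistant.\n"
        "### Instruction:\n"
        f"{user_request}\n"
        "### Response:\n"
    )

print(build_prompt("Write a Python function that reverses a string."))
```

If your frontend lets you set a custom template, plugging in something like this (with the user message in the Instruction slot) usually makes a big difference versus sending the raw prompt.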
It is the instruct model. You can see underneath the prompt box that it's the deepseek-coder-1.3b-instruct_Q5_K_s model. I used the prompting template for the model, and it slightly improved the answers.
But if I ask it to write some code, it almost never does and just outputs gibberish.
Does GPU/CPU quality affect the AI's output? My device is a potato.