Interesting service, I’m definitely going to try it. I’d like to fine-tune a 7B model for function calling and, if possible, mimic OpenAI’s function description template so I can share the same definitions between model calls. I’ve experimented with injecting the function descriptions as a preamble to the user’s prompt and it works OK (with Mistral 7B Instruct), but there are many edge cases. I suspect I need to fine-tune to improve it. How would I go about structuring the user prompts in the training dataset? Would something like this work?:
{"messages": [{"role": "system", "content": "You are a helpful navigation assistant that calls the appropriate function base on a user's input."}, {"role": "user", "content": "Go to Paris, France"}, {"role": "assistant", "content": "{"lat": 48.856667, "lng":2.352222}]}