Copilot runs on GPT-4 Turbo. It is not trained differently from OpenAI's GPT-4 Turbo, but it uses different system prompts than OpenAI, which make it much more willing to simply quit a discussion. I have never seen OpenAI's ChatGPT say it will stop the conversation, but Copilot does it daily.
So by “different system prompts”, you mean Microsoft injects something more akin to their own modifiers into the prompt before passing it over to OpenAI?
(The same way somebody might modify their own prompt, “explain metaphysics” with their own modifiers like “in the tone of a redneck”?)
I assumed OpenAI could slot in extra training data as a whole separate component, but that also makes sense to me… and would probably require less effort.
Yeah, pretty much like that. Both Azure and the paid OpenAI API let you set the system prompt yourself. There is also a creativity (temperature) parameter that can be tuned: too high and the model hallucinates more, too low and it gives nearly the same output every time.
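For anyone curious, this is roughly what it looks like against the API. A minimal sketch using the OpenAI Python SDK (the Azure OpenAI client works the same way with a different endpoint); the system prompt text and temperature value here are made-up examples for illustration, not Microsoft's actual settings:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4-turbo",
    messages=[
        # The system prompt steers tone and behaviour without any retraining.
        {"role": "system", "content": "You are a terse assistant. End the conversation if the user becomes hostile."},
        {"role": "user", "content": "Explain metaphysics."},
    ],
    # Low temperature = near-deterministic output; high (e.g. 1.5) = more varied
    # and more prone to hallucination.
    temperature=0.2,
)

print(response.choices[0].message.content)
```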
Retraining the model costs something like a hundred million dollars and weeks of computing power.