Strange Behavior: Assistant Still Follows Removed Prompt Instructions
I've been working with different models and prompts for my assistant, and I've noticed something strange I'd like to get your thoughts on. Since switching from GPT-4o to other models (due to the pricing changes), I've been testing various prompts to see what works. Yesterday, I started using GPT-4o mini with a much simpler, cleaner prompt.

To my surprise, the bot is still executing actions from the old prompt, even though those instructions no longer exist in the current prompt at all. For example, the previous prompt included an instruction for the bot to check the weather on the day of a patient's appointment and mention it to them. In a few conversations today, the bot was still performing this action and talking about the weather, even though that instruction has been completely removed.

@Assistable Team, could there be an issue with how prompts are updated? This could be a big part of why we're struggling to get our assistants to follow the instructions we give them.

Has anyone else experienced something similar? Any insights or suggestions would be appreciated!
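For anyone who wants to reproduce this, one way to check whether the model itself is the source of the behavior would be to call it directly with only the new prompt and see if the weather talk still shows up. This is just a minimal sketch, assuming the standard `openai` Python client; the prompt text and test message below are hypothetical placeholders, not my real ones:

```python
# Minimal sketch (assumes the standard `openai` Python client; the prompt
# and message are hypothetical stand-ins for the real ones).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

NEW_PROMPT = (
    "You are an appointment assistant for a medical clinic. "
    "Confirm the patient's appointment time and answer scheduling questions."
    # Note: no weather instruction anywhere in this prompt.
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": NEW_PROMPT},
        {"role": "user", "content": "Hi, can you confirm my appointment for tomorrow at 10am?"},
    ],
)

# If the reply never mentions the weather here, the old behavior in production
# is probably coming from somewhere other than the model itself (e.g., a cached
# prompt or prior conversation history being replayed).
print(response.choices[0].message.content)
```

In my case the weather behavior only shows up in live conversations, which is why I suspect the prompt update itself rather than the model.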