Hi, I’m getting an issue with the free OpenAI plugin - “Error: This model’s maximum context length is 4097 tokens. However, your messages resulted in 5355 tokens. Please reduce the length of the messages”. It seems to be a limit of the gpt-3.5-turbo model - can I use one of the gpt-4 models with the free plugin, and if so, which one? Thank you!
Hello @debbie,
Thank you for your message, and sorry for the delayed reply over the weekend.
Yes, the message you get is indeed due to a limitation of the GPT-3.5-turbo model: its context window is capped at 4,097 tokens, which covers both the messages you send and the model’s reply. Regarding your question “can I use one of the gpt-4 models with the free plugin, and if so, which one?”: the free OpenAI plugin by Zeroqode was configured to work with the GPT-3 and GPT-3.5-turbo models only, so it cannot be switched to a GPT-4 model.
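If you want to stay on the free plugin, the usual workaround is to shorten what you send, for example by dropping the oldest messages from the conversation until the prompt fits under the limit. Below is a minimal sketch of that idea in Python (this is not the plugin’s internal code); the RESPONSE_BUDGET value, the per-message overhead, and the message structure are assumptions for illustration:

```python
# Sketch: trim a chat history so it fits gpt-3.5-turbo's 4,097-token context window.
import tiktoken

MAX_CONTEXT_TOKENS = 4097   # gpt-3.5-turbo context window (from the error message)
RESPONSE_BUDGET = 500       # tokens reserved for the model's reply (assumed value)

def count_tokens(messages, model="gpt-3.5-turbo"):
    """Rough token count for a list of {"role": ..., "content": ...} messages."""
    enc = tiktoken.encoding_for_model(model)
    # ~4 extra tokens of per-message overhead is a common approximation
    return sum(len(enc.encode(m["content"])) + 4 for m in messages)

def trim_history(messages):
    """Drop the oldest non-system messages until the prompt fits the budget."""
    budget = MAX_CONTEXT_TOKENS - RESPONSE_BUDGET
    trimmed = list(messages)
    while count_tokens(trimmed) > budget and len(trimmed) > 1:
        # index 0 is assumed to be the system prompt, so remove the next-oldest message
        trimmed.pop(1)
    return trimmed
```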
If you’re specifically looking to work with larger contexts or additional functionality, those are offered by the OpenAI ChatGPT-4 plugin by Zeroqode.
Hope this was helpful. For any other plugin-related questions, feel free to reach out to us at any time.
Best regards.