Hey there, very nice work! I just noticed something I ran into today; maybe it will help others.
```python
import openai

messages = [
    {"role": "system", "content": "You are a kind helpful assistant."},
]

message = input("User : ")
if message:
    messages.append(
        {"role": "user", "content": message},
    )
    chat = openai.ChatCompletion.create(
        model="gpt-3.5-turbo", messages=messages
    )
    # the model's reply is then appended to messages as well
    reply = chat.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
```
In the last line of the code, the reply is appended to messages, so the conversation keeps growing with every turn. The model's context limit is 4097 tokens, so as replies keep getting appended the request eventually crosses that limit and the API throws a token-limit error.

Since messages is a list, all you have to add is:

del messages[1]

This deletes the oldest user message (index 0 is the system prompt, so it stays intact), and the messages list keeps roughly the same size. If anyone hits this issue, this is how you can solve it.
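A minimal sketch of how the del messages[1] trim can sit inside the chat loop (the loop itself, the MAX_MESSAGES threshold, and the print are my own assumptions, not part of the snippet above):

```python
import openai  # assumes openai<1.0 with OPENAI_API_KEY set in the environment

messages = [
    {"role": "system", "content": "You are a kind helpful assistant."},
]

MAX_MESSAGES = 10  # assumed cap on stored turns; tune it to your prompt sizes

while True:
    message = input("User : ")
    if not message:
        break
    messages.append({"role": "user", "content": message})

    # Drop the oldest turns (index 1 onward) so the request stays under the
    # 4097-token context limit; index 0 keeps the system prompt in place.
    while len(messages) > MAX_MESSAGES:
        del messages[1]

    chat = openai.ChatCompletion.create(
        model="gpt-3.5-turbo", messages=messages
    )
    reply = chat.choices[0].message.content
    print(f"ChatGPT : {reply}")
    messages.append({"role": "assistant", "content": reply})
```

Trimming by message count is only a rough guard; if your individual messages are long you may still need to count tokens before sending.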