Token Limit #1

Open
106AbdulBasit opened this issue Mar 14, 2023 · 0 comments
Hey there, very nice work! I just noticed something I ran into today; maybe it will help others.

```python
import openai

messages = [
    {"role": "system", "content": "You are a kind helpful assistant."},
]

message = input("User : ")
if message:
    messages.append(
        {"role": "user", "content": message},
    )
    chat = openai.ChatCompletion.create(
        model="gpt-3.5-turbo", messages=messages
    )

    reply = chat.choices[0].message.content
    print(f"ChatGPT: {reply}")
    messages.append({"role": "assistant", "content": reply})
```

In the last line of code, the reply is appended to `messages`, so the conversation grows with every turn. The maximum context length is 4097 tokens; once the accumulated messages cross that limit, the API throws a token-limit error.

Since `messages` is a list, all you have to add is:

```python
del messages[1]
```

This deletes the oldest non-system message (index 0 holds the system prompt, so it is preserved), and the conversation keeps the same form while staying under the limit.

If anyone runs into this issue, this is one way to solve it.
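The idea above can be sketched as a small reusable helper: keep deleting the oldest turn (index 1) until the conversation fits a size budget. This is only an illustrative sketch, not part of the original snippet — the function names, the budget parameter, and the rough 4-characters-per-token estimate are all assumptions (for real token counts you would use a tokenizer such as `tiktoken`).

```python
def estimate_tokens(messages):
    # Very rough heuristic: roughly 4 characters per token on average.
    return sum(len(m["content"]) for m in messages) // 4

def trim_messages(messages, max_tokens=4097):
    # Index 0 holds the system prompt, so it is never deleted.
    # Repeatedly delete the oldest non-system message (index 1)
    # until the estimated size fits within the budget.
    while len(messages) > 1 and estimate_tokens(messages) > max_tokens:
        del messages[1]
    return messages
```

Called right before each `openai.ChatCompletion.create(...)`, this keeps the request under the context limit while always preserving the system prompt.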
