Fix ChatBot Cutting Off Responses
Currently, I have limited the number of tokens the API can use per response. This sometimes results in the response being cut off mid-sentence, which confuses users, as found in the summative user research.
I need to solve this.
This could be done by increasing the token limit in the settings, or by telling the chatbot to keep its responses under 300 tokens.
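A minimal sketch of combining both fixes, assuming a request payload shaped like the OpenAI Chat Completions API (the model name, the raised cap of 600, and the helper function are illustrative assumptions, not settings from this project):

```python
# Sketch of two mitigations: raise the token cap so replies have room
# to finish, and add a system prompt asking the model to stay under
# 300 tokens so it ends naturally before hitting the cap.

MAX_RESPONSE_TOKENS = 600  # assumed value, raised from the old lower cap

def build_request(user_message: str) -> dict:
    """Build a chat-completion request payload (hypothetical helper)."""
    return {
        "model": "gpt-4o-mini",  # placeholder model name
        "max_tokens": MAX_RESPONSE_TOKENS,
        "messages": [
            # Instruct brevity so the response completes within the cap.
            {"role": "system",
             "content": "Keep every response under 300 tokens and end "
                        "with a complete sentence."},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_request("How do I reset my password?")
```

The key idea is that the hard `max_tokens` cap should sit comfortably above the soft 300-token instruction, so truncation only happens as a last resort rather than on every long answer.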