GPT-4 is the flavour of the season when it comes to text-based Artificial Intelligence chatbots. As more users try their hand at GPT-4, they are revealing small chinks in its armour, and, what's more, the chatbot seems to agree.
Reportedly, while answering a question about pet shop recording concerns, GPT-4 spelled 'infringing' as 'infrishing', much to the user's confusion. A screenshot of the exchange was posted on Reddit.
When the user asked what 'infrishing' meant, GPT-4 apologised and admitted it was a 'typographical error'. It went on to say that the correct word was 'infringing' and explained its meaning.
Soon, Reddit users began weighing in on the reasons for GPT-4's typo. One said 'ChatGPT is just human', while another likened the chatbot to an overworked employee. A third user reasoned that GPT-4's training data may be 'riddled' with typos.