The personal data of several ChatGPT users was briefly visible to other users of the text-based artificial intelligence platform.
The issue was disclosed by OpenAI, which said the bug has since been fixed, but that it had allowed some users of the platform to see other users' exchanges with the chatbot.
Sam Altman, CEO of OpenAI, indicated in a series of tweets that the flaw, although significant, affected only a small number of users. The problem has reportedly been resolved; however, the history of exchanges with the AI will no longer be accessible to users affected by the bug.
we had a significant issue in ChatGPT due to a bug in an open source library, for which a fix has now been released and we have just finished validating.
a small percentage of users were able to see the titles of other users’ conversation history.
we feel awful about this.
— Sam Altman (@sama) March 22, 2023
This has prompted many Internet users to question the severity of such flaws, especially since ChatGPT relies on collecting information from its exchanges with users in order to generate relevant responses. The risk of the AI exposing sensitive information cannot be ruled out, as the technology is still at an early stage of its development.