ChatGPT has grabbed the attention of millions of internet users in just a few months. With over 100 million active users within two months of launch, it became the fastest-growing consumer application to date.
From research and simple question answering to improving customer service, a great deal can be done with ChatGPT, and it is gradually becoming a go-to resource for individuals and businesses. However, all the shiny aspects of ChatGPT are also raising concerns about data privacy. So, let’s explore the privacy risks of ChatGPT in detail and address whether it can disclose our personal information and passwords.
ChatGPT is powered by the GPT-3 (Generative Pre-trained Transformer 3) model, one of the largest and most powerful language-processing AI models, trained to produce human-like text. OpenAI, the developer of ChatGPT, trained the GPT-3 model on text databases from the internet: an astonishing 570 GB of data gathered from books, articles, posts, Wikipedia, web texts, and plenty of other sources.
To be more specific, OpenAI fed roughly 300 billion words into the system to train the GPT-3 model. So, if you have ever written a product review, an article, or even a comment online, it is likely that that text was also fed into the GPT-3 model.
The way OpenAI trained ChatGPT raises privacy concerns for several reasons. For instance, when OpenAI collected that 570 GB of data, it didn’t ask any of us whether it could use our data. This amounts to a privacy violation, as the gathered data can be sensitive and could be used to infer our location and other personal details. Even if OpenAI used only publicly available data, a contextual-integrity breach still exists: personal information should not be revealed outside the context in which it was originally shared.
Moreover, OpenAI didn’t pay any website owners or companies for the data it scraped from the internet. With Microsoft’s $10 billion investment and a soaring user base, OpenAI’s valuation could reach $29 billion. Its ChatGPT Plus subscription plan is also expected to generate $1 billion in revenue by 2024. All of this is possible because of the massive amount of data OpenAI gathered from the internet, including our personal information.
To address whether ChatGPT can disclose our personal information and passwords, it is important to list the major privacy concerns with ChatGPT. So, below are a few concerning privacy issues with ChatGPT worth knowing:
ChatGPT offers no procedure that allows users to check whether the company is storing their personal information. This is one of the rights that the European GDPR (General Data Protection Regulation) grants, but OpenAI does not appear to have addressed it properly so far. In fact, there is an ongoing debate over whether the company adheres to GDPR requirements at all. OpenAI claims to comply with the GDPR, but many doubt that claim.
The GPT-3 model is trained on massive amounts of internet data, which also includes copyrighted content. This is evident from ChatGPT itself. For example, if you ask ChatGPT to write a few sentences about a copyrighted book, it can reproduce passages from that book. So, it is likely that the model was trained on plenty of copyrighted content without consent.
Whatever you type into ChatGPT can become part of its database; OpenAI uses conversations to further train the model and improve its responses. But this has a concerning side. For example, if a developer asks ChatGPT to review some code, that code is stored in the database. If someone else later asks ChatGPT to write similar code, it could draw on elements of that developer’s private code and share them with the other person. In short, this “always remembering” behavior poses a risk of unintentionally exposing sensitive information.
The privacy policy of OpenAI also raises serious concerns. If you read it, you will see that OpenAI clearly states it may share your personal information with third parties in certain circumstances without further notice. OpenAI also automatically collects information such as your IP address, content preferences, browser type, and device name. This makes clear that OpenAI gathers a great deal of user data and can use or share it for various purposes.
Recently, a Microsoft employee asked in an internal forum whether ChatGPT was allowed in the workplace. Microsoft’s CTO office responded that employees may use ChatGPT or any other OpenAI offering as long as they do not share sensitive data. Amazon has issued similar warnings to its employees. In fact, ChatGPT itself advises users to avoid sharing sensitive information.
When tech giants don’t trust ChatGPT to protect sensitive data, how can we believe that our personal information and passwords are secure in it? Since ChatGPT retains conversations to improve its model and responses, and the company states that it may disclose personal data to third parties without further notice, it is fair to say that ChatGPT could disclose our personal information and passwords, intentionally or unintentionally.
If you ask ChatGPT directly, it will present a detailed answer about the security measures OpenAI takes to protect users’ data. However, the privacy policy and a deeper look at ChatGPT’s overall framework tell a different story. To sum up, it is not clear how secure our personal data is in ChatGPT, so it is highly recommended to avoid sharing any sensitive data with it.