
Data security company Cyberhaven recently released a report stating that sensitive data makes up 11% of what employees paste into ChatGPT, and that the average company leaks sensitive data to ChatGPT hundreds of times each week. In one case, an executive copied the firm’s 2023 strategy document into ChatGPT and asked it to create a PowerPoint presentation. In another, a doctor entered a patient’s name and details of their condition and asked ChatGPT to craft a letter to the patient’s insurance company. The security concern is that sensitive data ingested into ChatGPT could resurface later in response to the right queries. Some companies have responded by restricting the use of ChatGPT or warning employees to take care when using generative AI services.

https://www.darkreading.com/risk/employees-feeding-sensitive-business-data-chatgpt-raising-security-fears

https://www.cyberhaven.com/blog/4-2-of-workers-have-pasted-company-data-into-chatgpt/


This segment was created for the It’s 5:05 podcast

https://505updates.com/april-5-2023/