While most people are now turning to AI tools like ChatGPT to make their work easier, many companies remain wary of the risks such tools pose. For instance, Samsung recently banned its employees from using ChatGPT-like AI tools after a sensitive code leak.
It turns out that a Samsung employee uploaded sensitive code to ChatGPT, forcing the company to ban the use of such tools, Bloomberg reported. Samsung believes that data transmitted to AI platforms such as Google Bard, Bing, and ChatGPT is stored on external servers, making it extremely hard to retrieve and delete.
Owing to this, the data could end up being disclosed to others, as we've seen in the ChatGPT bug that showed random users' chat histories to each other. In addition, companies like OpenAI use data fed in by users to train their systems. AI tools require large data sets to become smarter, more accurate, and more responsive. But there are data risks at play too, which is why Italy banned ChatGPT, only easing its stance recently.
Last month, Samsung conducted a survey to assess the use of AI tools, with 65% of respondents saying that such services pose a security risk. Earlier in April, Samsung engineers accidentally leaked internal source code by uploading it to ChatGPT. While it's unclear what the exact content of this information was, it's suggestive of the "security risks presented by generative AI."
Besides Samsung, many Wall Street banks have enforced a partial or complete ban on ChatGPT, including JPMorgan Chase, Bank of America, and Citigroup. Now, generative AI systems cannot be used on Samsung's company computers, tablets, and phones. This doesn't mean you cannot use ChatGPT and other generative AI tools on Samsung devices sold in the market.
The company warned its employees that using ChatGPT and feeding intellectual property into the tool could result in termination. Even so, ChatGPT has recently added an "incognito" mode that deletes chat data after 30 days.
What do you think about this move? Let us know in the comments below. For more in the world of technology and science, keep reading Indiatimes.com.