Security News > 2023 > July > How Google Keeps Company Data Safe While Using Generative AI Chatbots
Find out how Google Cloud approaches AI data, what privacy measures your business should keep in mind when it comes to generative AI, and how to make a machine learning application "unlearn" someone's data.
Google Cloud approaches using personal data in AI products by covering such data under the existing Google Cloud Platform Agreement.
Google Cloud makes three generative AI products: the contact center tool CCAI Platform, the Generative AI App Builder and the Vertex AI portfolio, a suite of tools for building and deploying machine learning models.
Behzadi pointed out that Google Cloud works to make sure its AI products' "responses are grounded in factuality and aligned to company brand, and that generative AI is tightly integrated into existing business logic, data management and entitlements regimes."
Businesses using public AI chatbots "must be mindful of keeping customers as the top priority, and ensuring that their AI strategy, including chatbots, is built on top of and integrated with a well-defined data governance strategy," Behzadi said.
In late June 2023, Google announced a competition for something a bit different: machine unlearning, or making sure sensitive data can be removed from AI training sets to comply with global data regulation standards such as the GDPR. This can be challenging because it involves tracing whether a certain person's data was used to train a machine learning model.
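The simplest form of machine unlearning is "exact" unlearning: track which user contributed each training record, then retrain the model from scratch without that user's data when an erasure request arrives. The sketch below illustrates this idea with a toy per-label mean "model"; all names and data are hypothetical, and production systems typically avoid full retraining by sharding the training set (as in SISA-style training) so only the affected shard is retrained.

```python
# Hypothetical sketch of exact machine unlearning: retrain without a
# user's records. The "model" here is a toy per-label mean of one feature;
# all names and data are illustrative, not Google's implementation.
from statistics import mean

def train(records):
    # Build a toy model: the mean feature value per label.
    model = {}
    for label in {r["label"] for r in records}:
        vals = [r["x"] for r in records if r["label"] == label]
        model[label] = mean(vals)
    return model

def unlearn(records, user_id):
    # Exact unlearning: drop every record attributed to the user,
    # then retrain from scratch on what remains.
    kept = [r for r in records if r["user"] != user_id]
    return train(kept), kept

data = [
    {"user": "alice", "label": "a", "x": 1.0},
    {"user": "bob",   "label": "a", "x": 3.0},
    {"user": "bob",   "label": "b", "x": 5.0},
]

model = train(data)                          # trained on everyone
model_after, remaining = unlearn(data, "bob")  # honor bob's erasure request
```

The key enabler is the per-record `"user"` attribution: without tracing whose data went into training, there is nothing to delete, which is exactly the challenge the competition highlights.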
News URL
https://www.techrepublic.com/article/google-generative-ai-chatbots-company-data/
Related news
- Ireland's Watchdog Launches Inquiry into Google's AI Data Practices in Europe
- EU kicks off an inquiry into Google's AI model
- Google Cloud Document AI flaw (still) allows data theft despite bounty payout
- Google's AI Tool Big Sleep Finds Zero-Day Vulnerability in SQLite Database Engine
- Google claims Big Sleep 'first' AI to spot freshly committed security bug that fuzzing missed