Why 40% of privacy compliance tech will rely on AI by 2023
With privacy laws and data breaches coming into focus in 2019, security leaders are looking for new ways to keep personal information safe.
The heightened conversation around data security has resulted in mounting pressure on privacy professionals, who are ultimately responsible for keeping an organization's data secure.
"Another reason would be the data in scope, personal data. The identifiability of a record depends on the context and meaning. Personal data is much more than just names, addresses and SSNs. AI technology is capable of recognizing patterns and contextualized or identifiable data, discovering that data faster than conventional, systems," Willemsen added.
SEE: 5 things developers should know about data privacy and security.
"Especially when organizations operate a complex architecture where personal data is hard to manage across systems, if they are insufficiently able to control the personal data lifecycle, or if they have a high privacy risk exposure-e.g., when expecting to receive large SRR volumes unable to be handled today-organizations may want to investigate AI-based solutions," he added.
Related news
- Strong privacy laws boost confidence in sharing information with AI
- Securing AI’s new frontier: Visibility, governance, and mitigating compliance risks
- How companies can address bias and privacy challenges in AI models
- Data Governance in DevOps: Ensuring Compliance in the AI Era