Why 40% of privacy compliance tech will rely on AI by 2023

With privacy laws and data breaches coming into focus in 2019, security leaders are looking for new ways to keep personal information safe.
The heightened conversation around data security has resulted in mounting pressure on privacy professionals, who are ultimately responsible for keeping an organization's data secure.
"Another reason would be the data in scope, personal data. The identifiability of a record depends on the context and meaning. Personal data is much more than just names, addresses and SSNs. AI technology is capable of recognizing patterns and contextualized or identifiable data, discovering that data faster than conventional, systems," Willemsen added.
SEE: 5 things developers should know about data privacy and security.
"Especially when organizations operate a complex architecture where personal data is hard to manage across systems, if they are insufficiently able to control the personal data lifecycle, or if they have a high privacy risk exposure-e.g., when expecting to receive large SRR volumes unable to be handled today-organizations may want to investigate AI-based solutions," he added.