NUS team develops tool that can assess vulnerability of AI systems to attacks

Assistant Professor Reza Shokri (standing in middle) with members of his NUS research team that developed the Machine Learning Privacy Meter, (from far left) master's student Mihir Khandekar, 24, doctoral student Chang Hongyan, 24, research assistant Aadyaa Maddi, 22, and doctoral student Rishav Chourasia, 24. ST PHOTO: TIMOTHY DAVID

National University of Singapore (NUS) researchers have developed a tool to safeguard against a new form of cyber attack that can recreate the data sets of personal information used to train artificial intelligence (AI) models.

The tool, called the Machine Learning (ML) Privacy Meter, has been incorporated into the developer toolkit that Google uses to test the privacy protection features of AI algorithms.
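To illustrate the kind of risk such a tool measures, below is a minimal sketch of a membership inference test: checking whether an attacker can tell which records were used to train a model. This is not the ML Privacy Meter's actual code or API; the synthetic data, the logistic-regression model and the loss-based scoring are assumptions chosen only to keep the example self-contained.

```python
"""Minimal membership inference sketch (illustrative only, not the ML Privacy Meter API)."""

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic records: "members" are used to train the model, "non-members" are held out.
X = rng.normal(size=(2000, 20))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
X_members, y_members = X[:1000], y[:1000]   # training (member) records
X_non, y_non = X[1000:], y[1000:]           # held-out (non-member) records

model = LogisticRegression(max_iter=1000).fit(X_members, y_members)

def per_example_loss(model, X, y):
    # Cross-entropy loss on each record; models tend to give lower loss to records they were trained on.
    probs = model.predict_proba(X)[np.arange(len(y)), y]
    return -np.log(np.clip(probs, 1e-12, None))

loss_members = per_example_loss(model, X_members, y_members)
loss_non = per_example_loss(model, X_non, y_non)

# Attack score: lower loss -> more likely the record was in the training set.
scores = np.concatenate([-loss_members, -loss_non])
labels = np.concatenate([np.ones(len(loss_members)), np.zeros(len(loss_non))])

# AUC near 0.5: the attacker cannot tell members from non-members (low leakage).
# AUC near 1.0: training records are easy to identify (high leakage).
print(f"Membership inference AUC: {roc_auc_score(labels, scores):.3f}")
```

An AUC close to 0.5 means member and non-member records look the same to the attacker, which is the outcome a privacy-preserving training pipeline aims for; scores well above 0.5 signal that the model is leaking information about its training data.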


A version of this article appeared in the print edition of The Straits Times on November 10, 2020, with the headline 'NUS team develops tool that can assess vulnerability of AI systems to attacks'.