TY - JOUR
T1 - Vibrational Spectroscopy Can Be Vulnerable to Adversarial Attacks
AU - Liu, Jinchao
AU - Osadchy, Margarita
AU - Wang, Yan
AU - Wu, Yingying
AU - Li, Enyi
AU - Hu, Xiaolin
AU - Fang, Yongchun
N1 - Publisher Copyright:
© 2024 American Chemical Society.
PY - 2024/10/22
Y1 - 2024/10/22
N2 - Nondestructive detection methods based on vibrational spectroscopy are widely used in critical applications across fields such as the chemical industry, pharmacy, national defense, and security. As these applications rely on machine learning models for data analysis, studying the threats posed by adversarial examples in vibrational spectroscopy, and defenses against them, is of great importance. In this paper, we propose a novel adversarial method for attacking vibrational spectroscopy, named SynPat, in which synthetic peaks produced by a physical model are placed at key locations to form adversarial perturbations. Our attack generates perturbations that successfully deceive machine learning models for Raman and infrared spectrum analysis while blending far better into the spectra than existing state-of-the-art adversarial attacks developed for other domains (e.g., images and audio), making them unnoticeable to human operators. We verified the superiority of the proposed SynPat through an imperceptibility test conducted by human experts and through defense experiments with an AI detector. To the best of our knowledge, this is the first thorough study of the robustness of vibrational spectroscopic techniques against adversarial samples and of the corresponding defense mechanisms. Our extensive experiments show that machine learning models for vibrational spectroscopy, including conventional and deep models for Raman and infrared classification and regression, are all vulnerable to adversarial perturbations, which may pose serious security threats to society.
UR - http://www.scopus.com/inward/record.url?scp=85206473437&partnerID=8YFLogxK
M3 - Article
C2 - 39392227
AN - SCOPUS:85206473437
SN - 0003-2700
JO - Analytical Chemistry
JF - Analytical Chemistry
ER -