Spectral imaging and a one-class classifier for detecting elastane in cotton fabrics

Ella Mahlamäki*, Inge Schlapp-Hackl, Tharindu Koralage, Michael Hummel, Mikko Mäkelä

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

Elastane detection is important for textile recycling as elastane fibers can hamper mechanical and chemical fiber recycling. Here, we report the use of near-infrared imaging spectroscopy and class modelling to detect 2-6% elastane in consumer cotton fabrics to provide alternatives to current detection methods, which are invasive and time-consuming. Our method automatically identified outlier fabrics and measurements with class-specific clustering and showed higher classification accuracies by averaging across individual pixel spectra to reduce sampling uncertainty. The final classification results showed median test set true positive and true negative rates of 89-97% based on randomized resampling. Class modelling offers clear benefits compared to commonly used discriminant classifiers as it allows modelling new classes using only a set of target samples without requiring representative training objects from all the other classes. Overall, these results open the possibility for fast non-invasive detection of small amounts of elastane in cotton, taking us a step closer to a circular economy of textiles.
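The one-class (class-modelling) approach described above fits a model to the target class only, here pure cotton, and flags anything that deviates from it. A minimal sketch of this idea is a SIMCA-style classifier: fit a PCA subspace to target-class spectra and reject new spectra whose reconstruction residual (Q statistic) exceeds a threshold set from the target class itself. The code below is an illustrative sketch with synthetic stand-in "spectra", not the authors' implementation; the dimensions, noise levels, and 95th-percentile threshold are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for NIR pixel spectra (hypothetical data):
# pure-cotton spectra lie near a low-dimensional subspace; spectra from
# elastane-containing fabric deviate from it.
n_bands = 50
basis = rng.normal(size=(3, n_bands))  # 3 latent spectral components
cotton = rng.normal(size=(200, 3)) @ basis + 0.05 * rng.normal(size=(200, n_bands))
blend = rng.normal(size=(50, 3)) @ basis + 0.6 * rng.normal(size=(50, n_bands))

# One-class model: PCA fitted on the target class (cotton) only.
mean = cotton.mean(axis=0)
_, _, Vt = np.linalg.svd(cotton - mean, full_matrices=False)
P = Vt[:3].T  # retained loadings

def q_residual(X):
    """Squared reconstruction error (Q statistic) per spectrum."""
    Xc = X - mean
    resid = Xc - Xc @ P @ P.T
    return (resid ** 2).sum(axis=1)

# Acceptance threshold from the target class itself (95th percentile, assumed).
threshold = np.percentile(q_residual(cotton), 95)

def is_cotton(X):
    """True where a spectrum is accepted as belonging to the cotton class."""
    return q_residual(X) <= threshold

tpr = is_cotton(cotton).mean()     # fraction of cotton accepted
tnr = (~is_cotton(blend)).mean()   # fraction of blends rejected
print(tpr, tnr)
```

Note that, as the abstract points out, no blend spectra are needed to *train* this model; the blends appear only at prediction time. Averaging many pixel spectra per fabric before classification, as the paper does, would further reduce the per-measurement noise term.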

Original language: English
Pages (from-to): 2295-2301
Number of pages: 7
Journal: Analyst
Volume: 150
Issue number: 11
DOIs
Publication status: Published - 22 Apr 2025
MoE publication type: A1 Journal article-refereed

Funding

This work was financially supported by the Strategic Research Council established within the Research Council of Finland under grant agreement no. 352698 and by the Research Council of Finland Flagship Programme: Finnish Center for Artificial Intelligence FCAI.

