Abstract
Self-absorption seriously affects the accuracy and stability of quantitative analysis in laser-induced breakdown spectroscopy (LIBS). To reduce its influence, we investigated the temporal evolution of the self-absorption effect by establishing exponential calibration curves, and we also examined the mechanism behind this temporal evolution. The results indicated that self-absorption is weak at the early stage of plasma expansion. Taking the determination of manganese (Mn) in steel as an example, the concentration at the upper bound of linearity (C_int) was 2.000 wt. % at the early stage of plasma expansion (time window of 0.2–0.4 μs), much higher than the 0.363 wt. % obtained at a traditionally optimized time window (2–3 μs). The accuracy and stability of quantitative analysis at the 0.2–0.4 μs time window were also much better than at the 2–3 μs window. This work provides a simple method for improving quantitative analysis performance and avoiding the self-absorption effect in LIBS.
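The abstract describes the method only at a high level. As a hedged illustration of the exponential-calibration-curve idea, the sketch below fits a saturation model of the form I = a(1 − e^(−bC)) to synthetic intensity-versus-concentration data and estimates an upper bound of linearity, here taken as the concentration where the curve falls 5 % below its low-concentration linear slope. The model form, the 5 % threshold, and all data values are assumptions for illustration only; the paper's exact calibration function is not given in this abstract.

```python
import numpy as np
from scipy.optimize import brentq, curve_fit

def exp_calibration(c, a, b):
    """Assumed exponential calibration curve I = a * (1 - exp(-b * c)).

    Saturates at high concentration, mimicking self-absorption;
    near c = 0 it is approximately linear with slope a * b.
    """
    return a * (1.0 - np.exp(-b * c))

# Synthetic Mn calibration data in wt. % -- illustrative only,
# not the measurements reported in the paper.
conc = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 1.5, 2.0, 3.0])
rng = np.random.default_rng(0)
intensity = exp_calibration(conc, 1000.0, 0.8) * (
    1 + 0.02 * rng.standard_normal(conc.size)
)

# Fit the assumed model to the calibration points.
(a_fit, b_fit), _ = curve_fit(exp_calibration, conc, intensity, p0=(1000.0, 1.0))

# Upper bound of linearity (assumed definition): the concentration
# where the fitted curve deviates by 5 % from the linear extrapolation
# a * b * c, i.e. solve (1 - exp(-b c)) / (b c) = 0.95 numerically.
c_int = brentq(
    lambda c: (1.0 - np.exp(-b_fit * c)) / (b_fit * c) - 0.95, 1e-6, 10.0
)
print(f"fitted a = {a_fit:.1f}, b = {b_fit:.3f}, C_int ≈ {c_int:.3f} wt. %")
```

With a stronger self-absorption coefficient b (e.g. from spectra taken at a later delay), the curve bends earlier and the estimated C_int drops, which is the qualitative trend the abstract reports between the 0.2–0.4 μs and 2–3 μs windows.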
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 4261-4270 |
| Number of pages | 10 |
| Journal | Optics Express |
| Volume | 27 |
| Issue number | 4 |
| DOIs | |
| State | Published - 2019 |
ASJC Scopus subject areas
- Atomic and Molecular Physics, and Optics