Real-time Identification of Respiratory Movements through a Microphone

Abstract

This work presents a software application that identifies respiratory movements (inspiration and expiration) in real time through a microphone. The application, developed in Matlab and named ASBSLAB in its GUI version and ASBSLABCONSOLE in its command-line version, is the result of a research and experimentation process. A total of 48 minutes of breathing movements from four subjects was recorded, and 18 acoustic features were extracted to build the data model. A first level of identification, based on the classification of short audio segments, was designed using the supervised k-nearest-neighbour (kNN) method. A second level of identification implements a state machine that takes the time-ordered kNN results as input and identifies the whole respiratory movement, achieving a rate of positive identifications above 95%. Since computation time is a constraint, the application lets the user easily choose the sample rate, the audio segment size and the set of acoustic features used in the identification process. In addition, based on the number of features selected, this work suggests those that achieve the best results.
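As an illustration of the two-level scheme just described, the following MATLAB sketch classifies short audio frames with kNN and then runs a simple state machine over the time-ordered frame labels to report whole inspiration/expiration movements. It is a minimal sketch under stated assumptions, not the published ASBSLAB code: the function name detect_breath_movements, the 100 ms frame length, k = 5, the minRun persistence threshold and the two placeholder features (short-time energy and zero-crossing count) are assumptions made for the example, whereas the application itself uses 18 acoustic features and user-configurable segment sizes.

```matlab
function movements = detect_breath_movements(x, fs, trainX, trainY)
% Illustrative two-level breath-movement detector (not the published ASBSLAB code).
% x      : mono audio signal (column vector)
% fs     : sample rate in Hz
% trainX : training feature matrix (same two placeholder features, one row per labelled frame)
% trainY : training labels as a cell array of char, e.g. {'insp'; 'exp'; 'silence'}

    frameLen = round(0.1 * fs);             % 100 ms analysis frames (assumed)
    nFrames  = floor(length(x) / frameLen);

    % ----- Level 1: frame-wise kNN classification ------------------------
    feats = zeros(nFrames, 2);
    for k = 1:nFrames
        seg = x((k-1)*frameLen + 1 : k*frameLen);
        feats(k, 1) = sum(seg.^2);                       % short-time energy
        feats(k, 2) = sum(abs(diff(sign(seg)))) / 2;     % zero-crossing count
    end
    mdl    = fitcknn(trainX, trainY, 'NumNeighbors', 5); % k = 5 is an assumption
    labels = predict(mdl, feats);                        % time-ordered frame labels

    % ----- Level 2: state machine over the label sequence ----------------
    % A movement is reported once the same label has persisted for at least
    % minRun consecutive frames, which filters out isolated kNN errors.
    minRun    = 3;
    movements = {};
    current   = 'silence';
    runLen    = 0;
    for k = 1:nFrames
        if strcmp(labels{k}, current)
            runLen = runLen + 1;
        else
            current = labels{k};
            runLen  = 1;
        end
        if runLen == minRun && ~strcmp(current, 'silence')
            movements{end+1} = struct('type', current, ...
                                      'startFrame', k - minRun + 1);
        end
    end
end
```

The persistence threshold in the second level plays the same role as the state machine in the paper: individual frame labels may be noisy, but a respiratory movement spans many consecutive frames, so requiring a short run of identical labels before declaring a movement suppresses spurious transitions at little cost in latency.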
Castro, J., & Marti-Puig, P. (2015). Real-time Identification of Respiratory Movements through a Microphone. ADCAIJ: Advances in Distributed Computing and Artificial Intelligence Journal, 3(3), 64–75. https://doi.org/10.14201/ADCAIJ2012116475
