International Journal of Applied Science and Engineering
Published by Chaoyang University of Technology

Ricardo Catanghal Jr*

College of Computer Studies, University of Antique, Antique, Philippines


 



ABSTRACT


In the field of sound recognition, research on sound event detection remains active, with the vast majority of papers focusing on the domains of speech and music. This paper presents and discusses a framework for a study room event detection system. Feature extraction techniques are utilized and discussed to obtain a parametric representation of sound for analysis in smart-home machine listening systems, specifically for the study room. In the sound analysis, the least accurately detected category was the door knock; even so, its accuracy of 95.00% is currently acknowledged as good in the field, making the parameters fit for detecting surrounding sounds. The performance of the CNN in detecting environmental sounds was analyzed using the defined parameters, with an overall accuracy of 96.8%. These results are promising for machine learning-based sound detection that can be applied as technology for an innovative learning environment.


Keywords: Smart home study room, Innovative learning technologies, Machine learning, Artificial neural network (ANN).
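The parametric representation mentioned in the abstract can be illustrated with a minimal, numpy-only sketch of log-mel feature extraction, the kind of spectro-temporal input commonly fed to a CNN for environmental sound classification. This is not the paper's implementation; all function names, parameters (sample rate, FFT size, hop length, number of mel bands), and the synthetic audio are illustrative assumptions.

```python
import numpy as np

def mel_filterbank(n_filters, n_fft, sr):
    # Build triangular filters spaced evenly on the mel scale.
    def hz_to_mel(f): return 2595.0 * np.log10(1.0 + f / 700.0)
    def mel_to_hz(m): return 700.0 * (10.0 ** (m / 2595.0) - 1.0)
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        left, center, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, center):
            fb[i - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fb[i - 1, k] = (right - k) / max(right - center, 1)
    return fb

def log_mel_features(signal, sr=16000, n_fft=512, hop=256, n_mels=40):
    # Frame the signal, window it, take the magnitude spectrum,
    # project onto the mel filterbank, and log-compress.
    frames = []
    window = np.hanning(n_fft)
    for start in range(0, len(signal) - n_fft + 1, hop):
        mag = np.abs(np.fft.rfft(signal[start:start + n_fft] * window))
        frames.append(mag)
    spec = np.array(frames).T                 # (n_fft//2 + 1, n_frames)
    fb = mel_filterbank(n_mels, n_fft, sr)
    return np.log(fb @ spec + 1e-8)           # (n_mels, n_frames)

# One second of synthetic "room sound": background noise plus a 440 Hz tone.
sr = 16000
t = np.arange(sr) / sr
audio = 0.1 * np.random.randn(sr) + 0.5 * np.sin(2 * np.pi * 440.0 * t)
feats = log_mel_features(audio, sr=sr)
print(feats.shape)
```

The resulting 2-D feature matrix (mel bands x time frames) is treated like an image, which is what makes convolutional architectures a natural fit for sound event detection.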






ARTICLE INFORMATION


Received: 2021-02-01
Revised: 2021-04-11
Accepted: 2021-05-11
Publication Date: 2021-06-21


Cite this article:

Catanghal Jr, R. 2021. Sound detection for study room monitoring and evaluation. International Journal of Applied Science and Engineering, 18, 2021040. https://doi.org/10.6703/IJASE.202106_18(4).005

  Copyright The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are cited.

