Intelligent drummer module based on beat-tracking


Juan Camilo Gómez Villamil

Article Details

Section: Research Articles

Abstract

Most beat-tracking research focuses on exploring theoretical strategies rather than on developing automatic devices that can function in real musical environments. As a consequence, electronic musical-backing devices based on beat tracking are scarce. We therefore developed an automatic musical backing device based on beat tracking that operates in real time.
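For readers unfamiliar with the core technique, the sketch below shows how beat tracking recovers tempo and beat locations from an audio recording. It is a minimal, illustrative example using the open-source librosa library and a hypothetical input file "song.wav"; it is an assumption for illustration only, not the real-time implementation described in the article.

    # Minimal offline beat-tracking sketch (illustrative only; not the
    # article's real-time module). Assumes librosa is installed and a
    # file named "song.wav" is available.
    import librosa
    import numpy as np

    # Load the audio; librosa resamples to 22.05 kHz by default.
    y, sr = librosa.load("song.wav")

    # Estimate the global tempo (BPM) and beat positions (frame indices).
    tempo, beat_frames = librosa.beat.beat_track(y=y, sr=sr)

    # Convert beat frame indices to timestamps in seconds.
    beat_times = librosa.frames_to_time(beat_frames, sr=sr)

    print(f"Estimated tempo: {float(np.atleast_1d(tempo)[0]):.1f} BPM")
    print("First beat times (s):", np.round(beat_times[:8], 2))

A real-time device would instead analyze the incoming audio stream in short blocks and update its beat estimate as new onsets arrive, but the offline call above illustrates the basic idea of extracting tempo and beat positions that a backing module can then follow.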

