This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
As the author of the article, I declare that it is an original, unpublished work created exclusively by me, that it has not been submitted for simultaneous evaluation to another publication, and that there is no impediment of any kind to the transfer of the rights provided for in this contract.
In this sense, I commit to awaiting the result of the evaluation by the journal Ingeniería Solidaria before considering its submission to another medium; should that publication's response be positive, I further commit to answering for any action involving claims of plagiarism or any other kind of claim that could be made by third parties.
Likewise, as the author or co-author, I declare that I am in full agreement with the conditions set out in this work, and that I cede all economic rights, in other words, the rights of reproduction, public communication, distribution, dissemination, transformation, making available, and every other form of exploitation of the work by any medium or procedure, for the full term of the work's legal protection and in every country in the world, to the Universidad Cooperativa de Colombia Press.
Most beat-tracking research focuses on exploring theoretical strategies rather than on developing automatic devices that can function in real musical environments. As a consequence, electronic devices for musical backing based on beat tracking are scarce. We have therefore developed an automatic musical backing device, based on beat tracking, that operates in real time.
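To illustrate the beat-tracking idea underlying such a device, the sketch below estimates tempo from an onset-strength envelope by autocorrelation: the lag at which the envelope best matches a shifted copy of itself corresponds to the beat period. This is a hypothetical minimal example, not the authors' implementation; the frame rate (`fps`) and the synthetic envelope are assumptions made for the demonstration.

```python
def estimate_period(envelope, min_lag=2):
    """Return the lag (in frames) whose autocorrelation is largest."""
    n = len(envelope)
    best_lag, best_score = min_lag, float("-inf")
    for lag in range(min_lag, n // 2):
        # Autocorrelation at this lag: how well the envelope matches
        # a copy of itself shifted by `lag` frames.
        score = sum(envelope[i] * envelope[i - lag] for i in range(lag, n))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Synthetic onset-strength envelope: one onset every 50 frames.
# At an assumed analysis rate of 100 frames/s this is one beat per 0.5 s.
fps = 100
frames_per_beat = 50
envelope = [1.0 if i % frames_per_beat == 0 else 0.0 for i in range(400)]

lag = estimate_period(envelope)   # expected: 50 frames
bpm = 60.0 * fps / lag            # 100 fps / 50 frames -> 120 BPM
print(f"beat period: {lag} frames, tempo: {bpm:.1f} BPM")
```

In a real-time system, the envelope would be computed incrementally from an onset-detection function over the incoming audio, and the period estimate would be updated as new frames arrive.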