AcousticBrainz aims, in partnership with MusicBrainz, to automatically analyze the world’s music and “provide music technology researchers and open source hackers with a massive database of information about music.” The effort is crowdsourced, which means people from all over the world contribute data by having their computers crunch through their MusicBrainz-IDed music libraries and automatically upload all the low-level features they extract.
I constructed today’s review from low- and high-level data that AcousticBrainz recently extracted from a particular music track. Can you guess what it is? What characteristics it has? (“Probabilities” are in parentheses.) The answer is revealed below.
This female-gendered (0.81) vocal (0.78) track is not likely danceable (0.87), but it has a high probability of being electronic (1.0) and/or ambient (0.57) and/or classical (0.45) and/or jazz (0.31). It is in C major, with a tempo of about 148 bpm, and has a Tango rhythm (0.91). It has a bright timbre, is probably atonal (0.83), and is labeled probably happy (0.63) but most likely not relaxed (0.96).
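Characteristics like these come from AcousticBrainz’s high-level classifiers, which report a winning label and its probability for each model. Here is a minimal sketch of pulling those out of a response; the sample dictionary is hypothetical, shaped after the track described above, and the exact field names are an assumption based on typical AcousticBrainz high-level output.

```python
# Hypothetical excerpt of an AcousticBrainz-style high-level response,
# mirroring a few of the characteristics quoted in the review above.
sample = {
    "highlevel": {
        "danceability": {"value": "not_danceable", "probability": 0.87},
        "gender": {"value": "female", "probability": 0.81},
        "mood_happy": {"value": "happy", "probability": 0.63},
        "mood_relaxed": {"value": "not_relaxed", "probability": 0.96},
        "tonal_atonal": {"value": "atonal", "probability": 0.83},
    }
}

def top_labels(response):
    """Return {classifier: (label, probability)} for each high-level model."""
    return {
        name: (result["value"], result["probability"])
        for name, result in response["highlevel"].items()
    }

# Print each classifier's winning label with its probability.
for name, (label, prob) in sorted(top_labels(sample).items()):
    print(f"{name}: {label} ({prob:.2f})")
```

The real service exposes this data per recording via its MusicBrainz ID, so a review like this one amounts to reading off each classifier’s `value` and `probability` pair.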
The track is “With God on Our Side (feat. Joan Baez)” by Bob Dylan.