Paper of the Day (Po’D): Revisiting Inter-Genre Similarity Edition

Hello, and welcome to the Paper of the Day (Po’D): Revisiting Inter-Genre Similarity Edition. Some work from my visits to Portugal earlier this year has finally been given the green light: B. L. Sturm and F. Gouyon, “Revisiting Inter-Genre Similarity”, IEEE Signal Processing Letters, 2013 (accepted). My one-line description of this work is:

Be wary of an idea that merely sounds good and intuitive, until analysis shows it actually is good.

This paper addresses a former Po’D: Automatic classification of musical genres using inter-genre similarity edition. Our attempts at reproducing the results in that work are here, here, and here. After finding that our results were nowhere near those published, we sought answers through analysis. That is where this paper begins.

In short, we show that while the idea proposed in the original publication sounds good and intuitive, it is plainly not a good idea. (This is, I think, a great example of how intuition can seriously lead one astray.) Once we put the inter-genre similarity approach in the context of naive Bayesian classification, it becomes clear why it cannot outperform plain naive Bayesian classification, a much simpler approach. We add some empirical experiments to drive home this point, and we make available the code to reproduce all figures in our paper exactly.
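For readers who want a concrete picture of the baseline we compare against, here is a minimal sketch of Gaussian naive Bayesian classification on toy data. This is not the paper's code and has nothing to do with the inter-genre similarity features themselves; it just illustrates the classification rule (pick the class maximizing the log-posterior under per-feature Gaussian models), with made-up class names and toy feature vectors.

```python
import math

def fit(data):
    """data: {label: list of feature vectors} -> {label: (means, variances)}."""
    model = {}
    for label, vecs in data.items():
        n, dims = len(vecs), len(vecs[0])
        means = [sum(v[d] for v in vecs) / n for d in range(dims)]
        # Small floor on the variance avoids division by zero for degenerate data.
        variances = [sum((v[d] - means[d]) ** 2 for v in vecs) / n + 1e-9
                     for d in range(dims)]
        model[label] = (means, variances)
    return model

def log_likelihood(x, means, variances):
    # Sum of per-feature Gaussian log-densities: the "naive" independence assumption.
    return sum(-0.5 * math.log(2 * math.pi * var) - (xi - mu) ** 2 / (2 * var)
               for xi, mu, var in zip(x, means, variances))

def classify(model, x):
    # Uniform class priors for simplicity; argmax over class log-likelihoods.
    return max(model, key=lambda label: log_likelihood(x, *model[label]))

# Toy two-class, two-feature example (labels and numbers are illustrative only).
train = {
    "blues": [[1.0, 2.0], [1.2, 1.8], [0.9, 2.1]],
    "metal": [[4.0, 5.0], [4.2, 5.1], [3.9, 4.8]],
}
model = fit(train)
print(classify(model, [1.1, 2.0]))  # lands near the "blues" cluster
print(classify(model, [4.1, 5.0]))  # lands near the "metal" cluster
```

The paper's argument, roughly, is that once the inter-genre similarity scheme is rewritten in these terms, it reduces to a constrained version of this rule, so it has no room to do better.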

In fact, it appears that the reviewers put a lot of weight on the pains we took to make our paper reproducible. A few of the reviewers actually dug into some of it to experiment with different parameters. Here are a few comments from reviewers that speak to this point.

Some of the previous reviews [the first version was rejected, and the comment is about our revision and comments to the previous reviews] expressed surprise at the big discrepancy between the results obtained in this paper and the original results and believe that there might be some issue with the implementation. In a case like that I think the reproducible implementation is the one that should be taken seriously.

Overall I think there is a big emphasis on novelty and new results in engineering, but reproducibility and repetition of experiments are a central foundation of good science and engineering, and papers like this should be encouraged rather than discouraged.

A big pro of the paper at hand is that the authors foster reproducibility and even make available their source code. This has already been mentioned by the reviewers, but I would like to highlight and appreciate it again. This is really great practice of good science, but unfortunately not always seen in the signal processing and music-IR domains, unlike in other domains.
