
The Method To The Madness: How We Arrived At Our AOTY List

Everybody loves talking about their year-end lists, but no one talks about perhaps the most important part: how they arrived at said lists! The bigger a site’s staff gets, the harder it gets to aggregate their year-end lists. One option is to just get everyone together and have them argue it out, let editors yell louder than everyone else, and end up with some sort of list, but that gets complicated and frustrating way too fast, no one ends up happy, and it wastes way too much time. We did something like that for our two-part year-end list on the podcast, but since our staff roster has 27 people, it’s quite intractable. As such, I resorted to science. No, seriously. Let me tell you how I computed our AOTY list.

Last year, Nick painstakingly created our list by taking everyone’s top 50 lists, assigning a point score to each position in the list (more on that later), and then ranking albums by their total summed score. That’s a pretty good approach, but it requires way too much busywork, and the hand-picked weights he gave each position are kind of arbitrary and not necessarily optimal for achieving consensus. So I did some digging, read about 40 pages’ worth of papers on aggregating ranked preferences into a consensus, and narrowed it down to a set of methods. There are basically two big approaches to aggregation of this sort: you either bias towards things a few people really, really liked, or towards things most people liked a decent bit. In other words, you highlight peaks of very high ranks, or a more even distribution of relatively high ranks. There isn’t really an objectively better way to do things, but we’ll see how each plays out. For the record, Nick’s approach was more geared towards the former.

Either way, we first needed to get a list of albums people liked. Nick just had everyone submit their top 50-or-whatever and hand-entered scores for all of them. The goal this year was to reduce busywork, so I decided to automate as much of the process as possible. I first asked everyone to create a joint list of all albums that they might consider putting anywhere near their top 50. We used Google Docs to arrive at this list, and the end result had a bit over 450 albums in it. That’s a lot!

I then created a SurveyMonkey poll where people would rank their top 50 from those items. Despite many people asking for the ability to make lists as long as 50 items (I had suggested capping it at 25, as I feel like really long lists let people get sloppy with their listing instead of highlighting things they REALLY like, but that’s just me), only a handful of our 27 staffers actually submitted 50-item lists. Most people submitted lists ranging anywhere from 10 to 25 items (I mandated a minimum of 10). I took the results of that poll, imported them into MATLAB to play around with, and implemented a bunch of models, mostly from “Distance-based and ad hoc consensus models in ordinal preference ranking” by Cook et al., plus Nick’s previous model.
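
To make the later sections concrete, it helps to picture what the ballot data looks like. The actual number-crunching happened in MATLAB, but here is a minimal Python sketch of the format the rest of this post assumes; the names and structure are illustrative, not our real data.

```python
# Illustrative only: each ballot is an ordered list, favorite album first.
# Ballots can have different lengths (between 10 and 50 items in our case).
ballots = {
    "staffer_1": ["Album A", "Album B", "Album C"],
    "staffer_2": ["Album B", "Album D", "Album A", "Album E"],
    # ... one entry per staff member, 27 in total
}

# The candidate pool is every album that shows up on at least one ballot.
candidates = sorted({album for ranking in ballots.values() for album in ranking})
```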

Distance Minimization

While I tried a variety of approaches, I want to highlight the results of a few. First, the hilariously bad one. This approach, which I call “distance minimization”, places each album at the rank that minimizes the total distance between its spot on the final list and the ranks everyone gave it. So, for example, if I rate an album #15 and Nick rates it #7, its final rank would be somewhere around #11, because that’s the position that minimizes the distance to both of our lists. This was hilariously bad, because while it’s a great method for closed-list voting, it falls apart when people can each vote on a different 50-item subset of 450 albums. When someone excludes an album, it means they don’t like it, but this method doesn’t take that into account. If I rate an album #1 and Nick rates it #50, it means we both like it; I just like it more than Nick does. The end score for that album should probably be pretty high. Instead, this method would put it around #25, because it treats Nick’s #50 as meaning he likes the album less than everything else, not that he still liked it enough to list. Worse, if someone rates an album #1 and no one else cares about it at all, that album ends up #1 in the aggregate list, because no other score pulls its position down. So, essentially, this approach doesn’t work when we want our list to be a list of things that we all like; it only works when everyone is voting on the same 50 albums. Anyway, here’s what our list would have looked like if I had done this (a rough code sketch follows the list):

  1. ‘Yndi Halda – Under Summer’
  2. ‘The Body – No One Deserves Happiness’
  3. ‘Corima – Amaterasu’
  4. ‘Kanye West – The Life of Pablo’
  5. ‘Whispered – Metsutan: Songs of the Void’
  6. ‘Orphx – Pitch Black Mirror’
  7. ‘Car Seat Headrest – Teens of Denial’
  8. ‘Aenaon – Hypnosophy’
  9. ‘Axon-Neuron – Metamorphosis’
  10. ‘Chance the Rapper – Coloring Book’
  11. ‘Com Truise – Silicon Tare’
  12. ‘Dangers – The Bend In The Break’
  13. ‘Alcest – Kodama’
  14. ‘Slice the Cake – Odyssey to the Gallows/West’
  15. ‘Katatonia – The Fall of Hearts’
  16. ‘Deftones – Gore’
  17. ‘Öz ürügülü – Fashion and Welfare’
  18. ‘Dance Gavin Dance – Mothership’
  19. ‘Kashiwa Daisuke – Program Music II’
  20. ‘Bon Iver – 22, A Million’
  21. ‘Sturgill Simpson – A Sailor’s Guide to Earth’
  22. ‘John Zorn – The Classic Guide to Strategy, Vol. 4’
  23. ‘Meshuggah – The Violent Sleep of Reason’
  24. ‘Winterhorde – Maestro’
  25. ‘Thank You Scientist – Stranger Heads Prevail’
  26. ‘Childish Gambino – Awaken, My Love!’
  27. ‘The Avalanches – Wildflower’
  28. ‘Roly Porter – Third Law’
  29. ‘Agoraphobic Nosebleed – Arc’
  30. ‘Spirit Adrift – Chained to Oblivion’
  31. ‘How To Dress Well – Care’
  32. ‘Dark Tranquillity – Atoma’
  33. ‘Coma Cluster Void – Mind Cemeteries’
  34. ‘Clipping – Splendor and Misery’
  35. ‘Frank Ocean – Blonde’
  36. ‘Blazon Stone – War of the Roses’
  37. ‘Skee Mask – Shred’
  38. ‘Anohni – Hopelessness’
  39. ‘Maeth – Shrouded Mountain’
  40. ‘Thrice – To Be Everywhere Is to Be Nowhere’
  41. ‘Amygdala – Population Control’
  42. ‘Vektor – Terminal Redux’
  43. ‘Esperanza Spalding – Emily’s D+Evolution’
  44. ‘Gorguts – Pleiades Dust’
  45. ‘Protest the Hero – Pacific Myth’
  46. ‘Lady Gaga – Joanne’
  47. ‘Leon Vynehall – Rojus (Designed to Dance)’
  48. ‘The 1975 – I Like It When You Sleep…’
  49. ‘Black Tusk – Pillars of Ash’
  50. ‘Cobalt – Slow Forever’

This list would basically make a few people very happy and everyone else very unhappy.
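
For the curious, here’s a rough Python sketch of the idea. It’s a simplification of the distance-based formulations in the Cook et al. paper (which set this up as an assignment problem): I just take the median of the ranks an album actually received, since the median minimizes the summed absolute distance to those ranks. Notice how ballots that leave an album off contribute nothing, which is exactly why a lone #1 vote can float an album to the top.

```python
from statistics import median

def distance_minimization_order(ballots):
    """Order albums by the median rank they received from the people
    who listed them, ignoring everyone who left them off entirely."""
    received = {}  # album -> list of ranks it got (1 = best)
    for ranking in ballots.values():
        for position, album in enumerate(ranking, start=1):
            received.setdefault(album, []).append(position)
    # The median minimizes the total absolute distance to the received
    # ranks, but an album one person put at #1 and nobody else voted for
    # has a median of 1 and sorts straight to the top.
    return sorted(received, key=lambda album: median(received[album]))
```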

Pairwise Comparison

Then, let’s look at what we finally went with: pairwise comparison. Essentially, for every pair of albums, I look at whether each person ranked album A above album B, award a point for each comparison won, add those points up, and sort the results. This ended up being pretty reasonable, because it takes each person’s internal taste into account and stacks it up against everyone else’s. There was one problem with this approach, though: the varying list lengths. Say my list is 10 items long. My #1 album would “win” comparisons against 9 other albums, earning 9 points. Nick’s list is 50 items, so his #1 would “win” 49 comparisons and get 49 points. That would make his #1 worth more than my #1, which is pretty unfair, so I scaled a point’s worth by the length of a person’s list: winning a comparison on a 10-album list earns 1 point, and winning a comparison on a 50-album list earns 0.2 points. The end result was pretty representative of our aggregate tastes, and it’s what we eventually went with. You can see the final results here. This falls into the latter category of list I described earlier: it makes the fewest people the least upset.
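
Here’s a minimal sketch of that scoring, assuming comparisons only happen between albums that appear together on the same ballot (which is how the 9-wins vs. 49-wins arithmetic above works out), and using 10 divided by the ballot length as the per-win weight so that the 1-point and 0.2-point figures fall out; only the relative scaling matters.

```python
from collections import defaultdict

def pairwise_scores(ballots):
    """Within each ballot, an album 'wins' a comparison against every album
    the same person ranked below it.  Wins are scaled by ballot length so a
    #1 pick is worth roughly the same regardless of how long the list is."""
    scores = defaultdict(float)
    for ranking in ballots.values():
        weight = 10.0 / len(ranking)  # 1 point per win on a 10-item list, 0.2 on a 50-item one
        for i, album in enumerate(ranking):
            wins = len(ranking) - i - 1  # it beats everything ranked below it
            scores[album] += weight * wins
    return sorted(scores, key=scores.get, reverse=True)
```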

Nick’s Approach / Static Assignment

Finally, I tried Nick’s approach for comparison. He assigned a static number of points to each rank and added that number to an album’s score for its position in each person’s list. The numbers are, from rank 1 to 50: [150, 145, 140, 135, 130, 125, 120, 115, 110, 105, 101, 97, 93, 89, 85, 81, 77, 73, 69, 65, 62, 59, 56, 53, 50, 47, 44, 41, 38, 35, 33, 31, 29, 27, 25, 23, 21, 19, 17, 15, 14, 13, 12, 11, 10, 9, 8, 7, 6, 5]. This very heavily weights a person’s top 10. Thankfully, the list resulting from this approach wasn’t too different from the final list we went with; the top 10 in particular is quite similar. Props to Nick for coming up with a method that works pretty well! The difference comes into play deeper into the 50, with pairwise comparison highlighting albums that more people liked. A short sketch of this scoring follows the list. Here’s what the list could have looked like if we had followed Nick’s method:

  1. ‘Car Bomb – Meta’
  2. ‘Vektor – Terminal Redux’
  3. ‘Oathbreaker – Rheia’
  4. ‘The Dillinger Escape Plan – Dissociation’
  5. ‘Obscura – Akroasis’
  6. ‘Cult of Luna & Julie Christmas – Mariner’
  7. ‘David Bowie – Blackstar’
  8. ‘Meshuggah – The Violent Sleep of Reason’
  9. ‘Alcest – Kodama’
  10. ‘Gorguts – Pleiades Dust’
  11. ‘Thank You Scientist – Stranger Heads Prevail’
  12. ‘Ihsahn – Arktis’
  13. ‘Astronoid – Air’
  14. ‘Clipping – Splendor and Misery’
  15. ‘Fallujah – Dreamless’
  16. ‘Wormrot – Voices’
  17. ‘Aesop Rock – The Impossible Kid’
  18. ‘Periphery – Periphery III’
  19. ‘Plini – Handmade Cities’
  20. ‘Danny Brown – Atrocity Exhibition’
  21. ‘Insomnium – Winter’s Gate’
  22. ‘Nails – You Will Never Be One of Us’
  23. ‘Haken – Affinity’
  24. ‘Devin Townsend Project – Transcendence’
  25. ‘Aenaon – Hypnosophy’
  26. ‘Dark Tranquillity – Atoma’
  27. ‘Slice the Cake – Odyssey to the Gallows/West’
  28. ‘Virvum – Illuminance’
  29. ‘Swans – The Glowing Man’
  30. ‘Inter Arma – Paradise Gallows’
  31. ‘Trap Them – Crown Feral’
  32. ‘Wormed – Krighsu’
  33. ‘Every Time I Die – Low Teens’
  34. ‘Gojira – Magma’
  35. ‘Ulcerate – Shrines of Paralysis’
  36. ‘Textures – Phenotype’
  37. ‘Saor – Guardians’
  38. ‘Black Queen – Fever Daydream’
  39. ‘Neurosis – Fires Within Fires’
  40. ‘Deathspell Omega – The Synarchy of Molten Bones’
  41. ‘Anciients – Voice of the Void’
  42. ‘Protest the Hero – Pacific Myth’
  43. ‘Radiohead – A Moon Shaped Pool’
  44. ‘A Sense Of Gravity – Atrament’
  45. ‘Cyborg Octopus – Learning to Breathe’
  46. ‘65daysofstatic – No Man’s Sky: Music For An Infinite Universe’
  47. ‘Shokran – Exodus’
  48. ‘O’Brother – Endless Light’
  49. ‘First Fragment – Dasein’
  50. ‘Deftones – Gore’
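
For completeness, here is what Nick’s scoring looks like in the same sketch form, using the point table from above and the same ballots-as-ordered-lists format assumed in the earlier snippets.

```python
# Nick's fixed point table: index 0 is the value of a #1 pick, index 49 a #50 pick.
POINTS = [150, 145, 140, 135, 130, 125, 120, 115, 110, 105,
          101,  97,  93,  89,  85,  81,  77,  73,  69,  65,
           62,  59,  56,  53,  50,  47,  44,  41,  38,  35,
           33,  31,  29,  27,  25,  23,  21,  19,  17,  15,
           14,  13,  12,  11,  10,   9,   8,   7,   6,   5]

def static_assignment_scores(ballots):
    """Sum the fixed point value of every position an album occupies
    across everyone's lists, then order albums by their totals."""
    scores = {}
    for ranking in ballots.values():
        for position, album in enumerate(ranking):
            scores[album] = scores.get(album, 0) + POINTS[position]
    return sorted(scores, key=scores.get, reverse=True)
```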

In the end, there is no ultimately correct way to do this, and it was an interesting experiment for me. The end result has pretty much everyone happy (except for a few members whose taste is completely divorced from the rest of the blog), as evidenced by fellow staffers commenting on how many of their top 50 (or more like top 20) albums actually made it into the final list. It seems our staff is generally less upset with the list this year, and it took less busywork to compile and compute the final list, so that’s a big plus too!

Conclusion

I think transparency in how these lists are decided on is important, and I wish other sites would talk about how they made their lists as well. It would go a long way towards demystifying the process and grounding their lists. Plus, I’d be interested in knowing what they do. I hope this post clarified our approach a little bit! I’d also love to talk about this in more depth and share more statistics, so let me know in the comments if you want to know things like the highest-placing album that received only a single vote, or the lowest final position that someone’s #1 pick ended up at, or whatever. If there’s demand, I can make a separate post containing bizarre statistics about our AOTY list.

Noyan

Published 7 years ago