What would a perfect beer awards process look like?

If people dislike CAMRA’s Champion Beer of Britain model, what other approaches might there be to judging the best beers?

Off the back of our post earlier this week, various people elaborated on their objections to the CAMRA approach:

“…from a consumer perspective, which is where CAMRA should be coming from, it’s misleading for the brewery to market a beer based on winning CBOB if that’s not what you get in your glass.” (James)

“Festival tasting isn’t the same as in a pub. A few mouthfuls of a beer aren’t enough, especially if you’ve drunk something completely different previously.” (Steve)

“As a GBBF staffer I was able to get myself a taster of Abbot from the actual cask that was judged… the idea that it’s the second best beer in the land is just ludicrous.” (Ben)

“…it’s unsustainable to get volunteers to judge these awards throughout the industry.” (Anonymous beer writer)

To summarise these objections, and others:

  • the beer at judging isn’t the same beer consumers encounter
  • the process is opaque, leaving space for suspicion
  • the results are ‘wrong’, proving that the process must be, too
  • the judges aren’t qualified

What other models are there?

When we asked about this via Patreon, John ‘The Beer Nut’ Duffy said:

“I’ve been running one for several years on a drinkers’ choice basis: get everyone who’s interested to name their top three beers of the last year and award points accordingly.”
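The tallying John describes is essentially a Borda-style count. Here's a minimal sketch in Python, assuming (since the quote doesn't specify a weighting) three points for a first choice, two for a second and one for a third, with made-up beer names:

```python
from collections import Counter

# Hypothetical point values: the description doesn't specify the
# weighting, so we assume 3/2/1 for first/second/third choices.
POINTS = [3, 2, 1]

def tally(ballots):
    """Each ballot is an ordered list of up to three beer names."""
    scores = Counter()
    for ballot in ballots:
        for beer, pts in zip(ballot, POINTS):
            scores[beer] += pts
    return scores.most_common()

ballots = [
    ["Beer A", "Beer B", "Beer C"],
    ["Beer B", "Beer A", "Beer D"],
    ["Beer A", "Beer D", "Beer B"],
]
print(tally(ballots))  # Beer A wins with 8 points
```

The exact weighting is an assumption; any positional scheme tallies the same way.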

We can see pros and cons here. On the upside, you’re likely to get a more interesting list of candidates, based on people’s actual experiences in the real world, throughout the year.

One downside might be, again, a tendency to reward more mainstream, readily available beers from breweries with better distribution.

Also, “everyone who’s interested” rings alarm bells for us. Whose opinions do you miss? And to whose views do you end up giving undue weight?

As it happens, this is more or less how the first round of voting in CBOB works: branches send out emails inviting people to pick their favourite beers from a big old list of local candidates.

You might also take Ratebeer or Untappd awards or lists as an extreme example of “everyone who’s interested” and, as we know, those are not uncontroversial either. You might summarise the industry response to Untappd as “Get back in your box, plebs!”

Critics’ choice

Another related approach might be the Sight & Sound model.

For the most recent round of its best-films-of-all-time poll, the BFI’s film magazine asked around 1,600 professional film critics, filmmakers and other industry types to nominate 10 films each.

The results were hugely controversial.

Some felt too many ‘obscure’ films made the top 100, or that there were too many political choices. Others complained about its continued bias towards films by white men from Britain and America.

And people also asked: “Who chooses the choosers?”

A poll like this feels somewhat objective but, at some point, someone has to pick the people who do the picking. Is this where bias creeps in?

This is more-or-less how most trad beer awards work, and the same criticism applies.

The difference is that, because most beer judging requires you to be on site rather than sending a quick email, the pool of critics is further reduced.

Many discerning palates are cut out of the process because they can’t afford to cover travel costs, or work for free. So the choosers are chosen by… chance? Personal circumstances? Less than ideal.

The Eurovision model

What if we combine a popular vote with the view of expert judges? That’s how it works at the Eurovision song contest these days.

A jury in each country, made up of singers, musicians and songwriters, awards points to each song, based on the semi-finals earlier in the week.

That is then combined with a public vote on the big night.

In theory, this smooths out both the elitist tendencies of the jury (you can’t trust these so-called ‘experts’!) and the mischievous tendencies of the public, who often vote based on national allegiances and/or novelty value.
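The combination step can be sketched as a simple points sum. This is an illustration rather than the real scoring code: Eurovision actually aggregates many national juries and televotes, converting each to the familiar 12, 10, 8 down to 1 scale before summing; here we collapse each side to a single ranking, with made-up song names:

```python
# Eurovision-style points awarded per rank position (top ten score).
EUROVISION_POINTS = [12, 10, 8, 7, 6, 5, 4, 3, 2, 1]

def to_points(ranking):
    """Convert an ordered list of entries (best first) to a points dict."""
    return {entry: pts for entry, pts in zip(ranking, EUROVISION_POINTS)}

def combine(jury_ranking, public_ranking):
    """Sum jury and televote points; highest total wins."""
    jury = to_points(jury_ranking)
    public = to_points(public_ranking)
    entries = set(jury) | set(public)
    totals = {e: jury.get(e, 0) + public.get(e, 0) for e in entries}
    return sorted(totals.items(), key=lambda kv: -kv[1])

jury = ["Song A", "Song B", "Song C"]
public = ["Song C", "Song A", "Song B"]
print(combine(jury, public))  # Song A tops the combined table
```

Note how the jury's favourite wins here even though the public ranked it second: the averaging can hand victory to nobody's first choice, which is exactly the "worst of both worlds" complaint.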

But guess what? People still get angry about the results of Eurovision. They still feel the wrong country won, and that their country was robbed.

If anything, this system is the worst of both worlds: “We had the right result until they added the numbers from those amateurs/snobs, at which point it veered off course!”

You can’t please everyone

We cannot imagine a system for judging the best beer that won’t cause controversy.

That is half the point of awards, though – to generate conversation and make people think about beer.

Who is seriously using these announcements to decide which beer to drink, or not?

It’s possible, we suppose, that if you had a choice between two similar beers, you might pick the one with a little CBOB medal on its pumpclip.

Or that the first time you see Abbot in a pub after it’s won an award you’re tempted to give it a try.

But, really, they’re just a bit of fun.

Transparency helps

Having said all of that, being really clear about the process is one way to earn people’s trust.

Pete Brown addressed this in response to criticism of the British Guild of Beer Writers’ Awards last year, which we thought was a smart move.

Exposing your process also allows people to highlight areas for improvement, if you really want to hear those suggestions.

The best you can hope for is that people say, “I don’t like the result, but I don’t doubt it was fair.”

CAMRA’s process is not secret, even if it feels a bit obscure. It’s outlined here, in a PDF, described as an internal memo.

How would you improve that process, in practice? Where are its points of weakness?

9 replies on “What would a perfect beer awards process look like?”

Don’t forget that CAMRA also operates the National Beer Scoring System (NBSS) which is a factor in the selection of pubs for the Good Beer Guide.

Any CAMRA member can submit beer quality scores, although in practice it’s mainly the more active members that do (and by no means all of them).

It is subject to the pros and cons of the less selective methodologies that are mentioned above. One characteristic is self-reinforcement – CAMRA members will tend to visit pubs with scores already high enough to feature in the Good Beer Guide, which means more scores get submitted for those pubs, and so on. It’s difficult for new or improved pubs to generate critical mass. Also, some members are far more critical than others, possibly subjectively so, as it’s not blind tasting. Maybe Abbot Ale will see a reactionary decline in its scores this month?

Ten or fifteen years ago one of the American craft beer sites/aggregators ran a “Best Beer In The World” poll; IIRC the top ten included eight imperial stouts from US breweries, including three different barrel-aged versions of one beer. Which I guess is the Jeanne Dielman problem: your audience of experts/enthusiasts may be experts, but they’re also a social group with its own self-reinforcing preferences and prejudices. I suspect this problem is actually worse with enthusiasts than with experts, ironically – the Sight and Sound 100 isn’t all Jeanne Dielmans, after all – so if you’re going to open something up to the public, make sure you open it right up.

I think there’s some substance to the claim that “it’s misleading for the brewery to market a beer based on winning CBOB if that’s not what you get in your glass”; the trouble is, this boils down to saying that GK pulled a fast one by looking after the beer properly. It’d be better to turn the criticism round – OK, maybe Abbot is an outstanding beer when it’s on form, so why the hell is it so rarely on form?

Lastly, we’re talking real ale, not craft beer. If we were talking ‘craft’, and if we had a US-style definition of ‘craft brewery’ based on barrelage (appropriately scaled down for a small island), it’s quite possible GK wouldn’t have been eligible to enter, let alone win a medal. The thing about real ale is that anyone can do it, including big corporates. In many ways this is a good thing – it certainly means availability is much less of a problem. Availability of the good stuff in good nick – there’s the rub (see previous point).

The CBoB is the competition I have the most respect for. Any beer can be entered (by drinkers, not brand owners!), and it has to get through rounds of blind tasting by experienced panels. Has anyone suggested an alternative beer competition they think is better? Most seem to be profit-making commercial enterprises, which makes me more than a little dubious about them.

Re. “Who chooses the choosers” and “the judges aren’t qualified”

I didn’t judge CBoB this year, but I have done in several previous years, so I can answer those from my own experience and perspective. In fact, I already did so in a Twitter – sorry, X! – thread which I’m rewriting here…

Each CBoB judging table usually has 3 or 4 trained judges, some CAMRA-trained* and some independent, most with experience from other beer competitions too, some with international experience.

Usually each table will also have a brewer or someone else experienced in the trade, and a guest, e.g. a mainstream journalist, MP, celeb, etc. The guest is normally the only one who wouldn’t count as “qualified”.

One of the experienced judges is the Chair and has the job of making sure everyone has a voice and that no one is drowned out. After a round-table chat, everyone independently scores the beers, after which scrutineers collect the scoresheets and tally up, so even the Chair doesn’t know the result for sure until later.

*(CAMRA runs a number of formal courses in beer judging and the like.)

Didn’t Kenneth Arrow prove there’s no such thing as a perfect voting system?

I’m not sure the process of selecting the final category finalists can be improved. Changes, certainly, but for the better?

The only real flaw I can see is how easy it is for brewers, large and small, to provide a cask of specially produced beer for sale and judging, and given that this is a prestigious competition, they’d be mad not to in the circumstances. My suggestion is that the brewery or depot should be visited and a cask or casks selected at random from general distribution. My point that the Abbot at GBBF wasn’t from general distribution and was likely to have been produced on their pilot brewery was denied strongly by several senior Camraistas, but my request for a photo of the cask label to be made available strangely fell on deaf ears.
