This Tweet triggered a conversation on which we eavesdropped with interest:
is it me, or when you read a brewer saying that they don't care for ratings websites, oft means beers have been flamed on there?
— Phil Lowry (@PhilLowry) October 20, 2014
Follow-up responses seemed to suggest that Ratebeerians know enough to be dangerous without offering a useful opinion:
@PeteBrissenden @kempicus @PhilLowry exactly what Pete says. Obv not true of all, but so many raters clearly haven’t got a clue.
— BeerBirraBier (@beerbirrabier) October 20, 2014
Now, we don’t rate beer ourselves, but we find it odd that people in the business of selling beer react so badly to those who do.
As with TripAdvisor, mischief aside (one-review wonders, attempts at extortion), you’re getting a direct output from the minds of the kind of people who are inclined to buy your beer. People pay a fortune for market research and you’re getting it for free!
They might use incorrect terminology, or misunderstand the style you were aiming for, or even judge a beer without having given it a fair tasting… which makes them just like normal people.
On that basis, if a substantial number say they don’t like your beer, however clumsy their efforts to explain why, maybe you should look into it.
Otherwise, the odd one-off bad review really doesn’t matter — it’s a drop in the ocean, a rogue result.
And brewers who find themselves crying “But it’s not fair — we score low because we don’t brew the kind of beer that Ratebeerians like!” should probably take a moment to ask, in that case, why they care what Ratebeerians think. Self-doubt, perhaps?
35 replies on “Those Who Rate”
Every product or service is rated, even schools and hospitals. It’s entirely understandable that people want to score beers, and hopefully the majority will add context with a short written review, providing a more nuanced critique. I don’t have the best palate in the world, but if I (constructively) critique a beer I educate myself about the brewer’s intentions beforehand, rather than just present a knee-jerk reaction into a vacuum.
Yes! I cannot agree more. Every time I see a brewer complain about ratebeer and ‘not caring about ratings’ I immediately assume it’s because they have received lower marks than they feel they deserved.
It often shows how much a brewer really is ‘brewing the beer we like to drink’ vs brewing a beer they want others to like – as the two are very different propositions in my opinion.
For me, it’s irrelevant if someone rates a beer after only a few mouthfuls, or if they confuse their off-flavours and describe something incorrectly. They are still passing judgement on a beer as most other people would in a pub or bar.
It’s a public-facing, user-generated website – the reviewers do not need to have any kind of qualification other than their own judgement – and so I find the sentiment that ratebeer reviews are full of ‘people who don’t have a clue’ confusing and concerning. These are their customers…
Regarding breweries and their feelings towards ratebeer ratings, I have often had the suspicion that some breweries are perhaps playing the ratebeer system as well – to receive higher marks. I have seen numerous beers that I would describe as typical ‘American Pale Ales’ listed as a ‘Golden Ale’ – which can drastically alter the ‘style’ score on ratebeer. A beer with a score of 70 overall will score 94 for style under ‘Golden Ale’; if this same beer scored 70 in the American Pale Ale category, however, the style score would be closer to 60. Maybe this is just me making assumptions where there are none, but part of me wonders if this is breweries (or their marketing departments) trying to play the ratebeer system.
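[Purely to illustrate the mechanism the commenter describes – not RateBeer’s actual formula, which isn’t published here – this is a rough sketch of how a percentile-within-style score can jump when the same overall score is filed under a weaker category. Every score in the pools below is invented.]

```python
# Rough illustration (not RateBeer's actual formula) of how filing the same
# overall score under a weaker style pool lifts its style percentile.
# Every number below is invented for the example.

def style_percentile(score, style_scores):
    """Percentage of beers in the style pool that this score beats."""
    beaten = sum(1 for s in style_scores if s < score)
    return round(100 * beaten / len(style_scores))

# Hypothetical pools of overall scores for two styles.
golden_ales = [42, 48, 50, 52, 55, 57, 58, 60, 61, 63, 65, 66, 68, 75]
american_pale_ales = [50, 55, 58, 62, 64, 66, 68, 69, 72, 74, 78, 82, 86, 90]

beer_overall = 70
print("Filed as a Golden Ale:        ", style_percentile(beer_overall, golden_ales))         # 93
print("Filed as an American Pale Ale:", style_percentile(beer_overall, american_pale_ales))  # 57
```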
I think it’s unlikely that anyone could get any really useful information from these rating sites. It’s a (I guess) small (certainly) self-selected sample, so can’t have much statistical validity.
For me personally (a viewer of ratebeer, not a rater), my typical uses of the site are:
1. To view the top beers for a style. If I happen to see a beer in the list that I know is in stock nearby or that I recognise, I am more likely to go out of my way to try it next time. (Sadly there are too many beers just to try them all – but darn it I will try!)
Likewise, if I look at the best American Pale Ale list on ratebeer, there are 9 beers by Hill Farmstead alone – which doesn’t take much of a leap of faith to make me think Hill Farmstead are probably pretty good…
And so in this sense, ratebeer serves as positive marketing for beers/breweries.
2. The other use might be if I am in a well-stocked bar that has a beer list. If I have time to kill and it’s not rude, I might quickly check the beer I was considering ordering next to see how it rates. If it has an average to good rating I will go ahead as planned. But if the beer is scored very poorly, it might prompt me to pass on it this time.
In this sense, I guess it could be said that ratebeer stops me from ordering some beers.
I’ve never really cared too much about the comments though, I must admit.
or you could roll a dice.
Some days I do. Luke Rhinehart would be proud 🙂
or even ask for a taster first…?
There are more raters than bloggers, and the Bayesian system removes the outliers for the aggregate score. The system is not designed to be objective, rating beer to style; it’s very much down to raters’ preferences, so in that way it’s actually better than formal competitions, which only rate a beer against a set of prescribed guidelines.
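[For what it’s worth, RateBeer’s exact maths isn’t given here, but the ‘Bayesian system’ mentioned above usually refers to a weighted average that shrinks a beer’s raw mean towards the site-wide mean until it has enough ratings – which damps the effect of a handful of extreme scores rather than literally deleting them. A minimal sketch, with the constant m and all the figures below assumed purely for illustration:]

```python
# A minimal sketch of a Bayesian weighted average of the kind many rating
# sites use. RateBeer's actual formula isn't reproduced here; the constant m
# and the sample numbers are assumptions for illustration only.

def bayesian_average(ratings, site_mean, m=10):
    """Shrink a beer's raw mean toward the site-wide mean.

    m acts like a number of 'phantom' ratings at the site mean: a beer with
    only a couple of scores stays close to site_mean, while a beer with many
    ratings converges on its own raw mean.
    """
    n = len(ratings)
    if n == 0:
        return site_mean
    raw_mean = sum(ratings) / n
    return (n * raw_mean + m * site_mean) / (n + m)

site_mean = 3.2  # assumed site-wide average on a 5-point scale

print(bayesian_average([5.0, 4.8], site_mean))   # two gushing reviews barely move it: ~3.48
print(bayesian_average([4.2] * 200, site_mean))  # many consistent ratings dominate: ~4.15
```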
Got mixed feelings about Ratebeer myself. I know Martyn Cornell got barbecued the other year for pointing out that over half of the Ratebeer world top 50 for one year had the word “imperial” in the name. I mention this because it kind of illustrates how the ratings might be skewed.
At Borefts this year (and you can’t really call yourself a beer geek until you’ve been to Borefts – waiting for you two to show one year…) there seemed to be a high Ratebeerian presence, and they brought along lots of bottles to swap and rate. However, many of them seemed to have what looked like small metal thimbles, which they duly filled and rated the beer accordingly. Can you rate a beer from a thimbleful?
Just a couple of random thoughts really.
As a punter I tend to find that a high rating for the style will normally mean a good beer in some sense, but not all good beers get high ratings. The by-style scores are good for avoiding the “imperial” bias a bit, although it still seems to be the case that you can get a high rating in a traditionally understated style by bumping up the hops and alcohol – half of their top ten milds are over 5%, for instance! It’s also probably a bit harsh on cask ale since it’s likely that people will be rating stuff when it isn’t perfectly kept.
I can understand how the whole thing might rankle a bit with brewers, though, since people often seem to skip over the subjective aspect of it. How many times have you seen articles saying that Westy 12 or Pliny the Younger or whatever is “the best beer in the world” rather than “the most popular beer with Ratebeer users”? Or people who are generally in line with Ratebeerian tastes treating it as objective evidence that traditional bitters are “boring”.
Ratebeer is really useful. I have used it so much over the years and still do. It would be churlish to criticise such a valuable resource for all of us in this business. You just need to give appropriate weight to different reviews.
Those written by the clearly inexperienced have their own value – not all customers are seasoned beer lovers, indeed only a few are – but you take the lack of expertise into account.
Then there are those who have all the experience but don’t see the wood for the trees. Overly formulaic review formats are always a dead giveaway – these people are just the trainspotters of the beer world.
Then there are those reviews written by people who are clearly hooked on visiting pubs and festivals, drinking beers and reviewing them on Ratebeer, but who clearly have a great time doing so. A few names spring to mind – often people I’ve met because they’re omnipresent at beer fests! I give a lot of weight to what they write on the site. I’d always want to try a beer first myself if it seemed interesting to me, but often they save me the trouble. There’s a lot out there, after all.
I suppose we’d understand the upset more if there was evidence that people made buying decisions on the basis of RateBeer (or other forums) alone. Does anyone check scores before deciding what to buy in the pub? Or before putting in an order online?
I use TripAdvisor much more, but even that I use with a combination of other factors – usually word of mouth or whether I liked the look of a place when I went past. And I will always look at the substance of comments rather than a rating.
I have made buying decisions based on ratebeer, either deciding not to go for a beer, or as a validation to buy a beer that is more expensive (‘is it worth me paying this extra?’).
Like you, it’s all done with a pinch of salt, as I have loved low/average-scored beers and been indifferent to very highly rated ones.
But even if ratebeer does affect the buying decisions of others, I don’t see how this is a bad thing (or even a ‘thing’). It puts beer on the same playing field as anything available on Amazon – and what are the alternatives? Trusting professional reviews only?
Maybe I am being hard-nosed about it, I don’t know – it just seems that if a brewer really doesn’t care about ratebeer, then they should carry on doing what they do, trusting their customers’ feedback and watching their revenue. But if they do care, then by all means – why shouldn’t they try and develop a recipe for the ratebeer audience? Albeit this may be a small subset of their target audience, it is an easy way for brewers to get instant access to what these people think of their beers.
The issue for me seems to be those brewers who are in the middle, and can’t really decide whether they care what ratebeer thinks or not.
Yes, I regularly use ratebeer to determine which beers to avoid, especially for unknown breweries, in much the same way as I use TripAdvisor to pick decent places. I don’t just use the overall score; I have some trusted raters I look out for, to see if their opinions are good or bad.
I’ll generally be more likely to pick something up if I know it’s highly rated – which generally means that I’ve remembered from some point when I was browsing a brewer that I’m interested in or looking at a “top N by style” type list or something.
I’ll also use RB as part of the due diligence procedure before splashing out on something that I expect to be really special.
On the other hand, I don’t go out of my way to look up most normal beers before buying them. Doubly so if they aren’t in a particularly Ratebeer-friendly style.
All that said, I don’t think it’s surprising if people get slightly defensive about bad reviews, whether they’re warranted or not. After all, didn’t you guys yourselves just tweet about not wanting to look at the “critical” part of Phil’s Brew Britannia review?
Yes, it’s perfectly natural. But…
Didn’t we, by Tweeting the link, actually support Phil’s right to be critical of us, even if we chickened out of reading it for the sake of our own fragile egos? I think the best analogy here would be us raging about how people who use Goodreads are idiots who haven’t read enough books to judge our prose style, while continuing to monitor and Tweet angrily about our reviews on Goodreads.
I’m not saying it’s a perfectly analogous situation or anything, and definitely not criticizing you guys. Just pointing out that people’s responses to criticism aren’t always particularly rational!
I hope you do read it! (It’s not critical critical.)
Very interesting topic, and some great discussion to read. We’re a brewery in Tallinn, Estonia, and we also import some of our favourite beers into the country (Buxton, Magic Rock, etc).
We’ll very frequently find bars here where the managers are perhaps not so knowledgeable – as a result they’ll generally only buy beers based on ratebeer scores.
That makes it a problem for us to get great beer into the hands of consumers, because whilst we’re happy to supply them with their (very excellent) core range, they’re often unwilling to take on board new beers, or anything that isn’t 90+ – when it inevitably takes time to reach these scores. So beers like the new Nth Cloud DIPA, or Wolfscote (both stunningly good!) become a really hard sell for some of the bars – which then means drinkers don’t even get the chance to make a decision on them.
In general ratebeer is a great tool, and it’s fascinating to occasionally see our beers flying around the world, but our situation is that in a new market for craft beer there’s really an over-reliance on ratings and being risk-averse rather than focussing on the beers themselves.
Even though I do rate beers on RateBeer from time to time, I find it quite a frustrating experience because there seems to be an imbalance toward the super-strong, hoppy, barrel-aged, weird stuff. One thing I find particularly galling is when people rate a pilsner, ordinary bitter, or similar, and all the adjectives would suggest that the beer in question hits all the marks considered essential to the style, and then they give an overall score of 2.5-3 rather than something closer to 4-5. Too much credence is given to the personal impression of the drink rather than the technicalities of the style. Also, I know far too many people who have ‘rated’ thousands of beers, but only ever sampled a couple of ounces of most of them. You can’t know a beer from a splash in a glass.
I don’t find the imbalance towards big / strong / complex / weird stuff that much of a problem. I think that if you gave me a damn-near perfect amber lager and a damn-near perfect barley wine and asked me to put a single numerical score on the experience of drinking each then I’d probably rate the barley wine a lot higher too – it’d just be that much bigger and more memorable an experience.
What’s a problem is when people start thinking that that means that the amber lager is “bad” or “boring” or that people who brew great barley wines are inherently more skilled than people who brew great amber lagers, or that there aren’t times when you’d trade the best barley wine in the world for a decent cold lager. The by-style percentiles and top tens do a bit to counteract that problem, although a) for more understated styles, even these tend to be skewed towards the unrepresentative examples that ignore the understatement and turn everything up to 11 and b) quite a lot of people seem to miss the point anyway.
My understanding is it is far easier to brew the big, barrelled and bold than a delicate ordinary bitter and make it memorable. It’s not just that the RB and BA expression of what is noteworthy is skewed, but sometimes I suspect it’s skewed by being addled. A constant diet of 14% big bombs sooner or later must kill off perception of the delicate. As a result, I presume it all speaks to a constituency that I do not belong to.
Do people actually drink “a constant diet of 14% big bombs”, though?
I don’t really know the subculture, but I’d always assumed that even people who are into tasting and rating loads of thimblefuls of big weird monster beers will tend to actually get through a much higher volume of more workaday stuff when they’re out with mates or watching the telly. Maybe I’m wrong, though…
Between Untappd and Ratebeer, we could probably come up with some stats on that.
I suspect you’re right though — some of the biggest ‘big beer’ lovers out there are also huge fans of, say, Camden Hells.
I’ve had a bit of a look myself. There’s certainly no shortage of ticks for things like Anchor Steam, Sam Adams Boston Lager, Pilsner Urquell, most of the major hefeweizens etc.
Although there’s the obvious limitation that people (presumably) don’t rate everything they drink. I’d guess that they’re more likely to actually put a rating for a beer if it’s something big and unusual that they’re excited about rather than something fairly standard that they’re drinking while doing something else, and they might not bother rating every glass or bottle they have of the same thing. All of which suggests that you’d end up undercounting the less “big” beers, but it’s hard to be sure.
Sorry, data geek at play…
(As an aside, I wonder if anyone’s tried mining ratings for the same beers at different pubs on untappd to see if it shows up which pubs consistently keep stuff better or worse…)
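[For anyone tempted by that aside: assuming you could get hold of check-in data as (beer, venue, rating) triples – which neither site necessarily exposes in this form – one simple approach would be to compare each rating against that beer’s overall average and then average the deviations per venue. A sketch with entirely made-up data:]

```python
# A rough sketch of the idea in the aside above: given check-ins as
# (beer, venue, rating), compare each rating with that beer's overall
# average, then average the deviations per venue. Venues that sit
# consistently below zero may be keeping their beer badly. All data here
# is invented, and the data format is an assumption.

from collections import defaultdict
from statistics import mean

checkins = [
    # (beer, venue, rating out of 5)
    ("Best Bitter", "The Crown",  3.8),
    ("Best Bitter", "The Anchor", 2.9),
    ("Golden Ale",  "The Crown",  3.6),
    ("Golden Ale",  "The Anchor", 3.0),
    ("Golden Ale",  "The Ship",   3.5),
    ("Best Bitter", "The Ship",   3.7),
]

# Overall average per beer, across all venues.
by_beer = defaultdict(list)
for beer, venue, rating in checkins:
    by_beer[beer].append(rating)
beer_avg = {beer: mean(rs) for beer, rs in by_beer.items()}

# Average deviation from the beer's own mean, per venue.
by_venue = defaultdict(list)
for beer, venue, rating in checkins:
    by_venue[venue].append(rating - beer_avg[beer])

for venue, devs in sorted(by_venue.items(), key=lambda kv: mean(kv[1])):
    print(f"{venue}: {mean(devs):+.2f} vs each beer's overall average")
```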
I’d probably rate the barley wine a lot higher too – it’d just be that much bigger and more memorable an experience.
Perhaps the problem is just the fact of using a single numerical rating for everything. I mean, if (made-up examples alert) a really brilliant 12% double imperial bourbon barrel-aged stout scores 9.6 out of 10 (because just wow, OK, I mean you have to taste this, I mean like, wow), and a really brilliant 3.8% mild scores 6 (because yeah, it’s a good example of the style, probably the best example of the style I’ve tasted, yeah, it’s a good beer, I’d have it again), then you can’t really blame anyone who goes away with the impression that the DIBBAS is objectively a Better Beer than the mild.
I very rarely look at RB ratings, as I feel they’re coming from a different world. The point about measures is relevant here. “You wouldn’t want a pint of it” is often (not always) a criticism for me; a really great beer is (generally) one where you’d be happy to drink a pint and order another.
The only point I’d differ from here is using the “want a pint of it” test. Some beers you just couldn’t drink a pint of, however good they may be (like your 12% double imperial etc etc). So (nitpicking pedant alert) I’d slightly amend your closing words to “…drink a pint and/or order another”.
What I will say is, Ratebeer can be useful for getting an impression of what a beer might actually be like in terms of taste, body, texture etc. I’ve used it to whittle down the list of possibles for a beer festival bottled beer bar I run (I’ve certainly written off the odd one when Ratebeer has revealed a few too many infected gushers).
“I’ve certainly written off the odd one when Ratebeer has revealed a few too many infected gushers.”
If multiple Ratebeerians are reporting specific quality/technical issues, brewers *definitely* ought to be listening.
I guess the problem brewers have with ratebeer is that when criticised it’s difficult not to take it personally. Yes, you can argue that if the person rating doesn’t have a clue then you should ignore them, but that’s easier said than done. I can’t really see it making much of a difference to sales in the UK (unless you are lucky enough to make it onto a ‘Best of’ list that gets picked up by the media).
And ratebeer can be really annoying with the tiny measure thing, and obsession with certain brewers and extreme beers. All too often it seems that things are rated on reputation. Admittedly, it is very difficult to be impartial. If you’ve paid 20 quid for a beer you’ve already ‘bought in’ and will naturally want to think it’s amazing (otherwise you’re a dupe for paying 20 quid for an average beer).
“I guess the problem brewers have with ratebeer is that when criticised it’s difficult not to take it personally.”
I guess that Ratebeer probably makes people particularly prickly because the detailed tasting notes and averaged scores and generally rarefied atmosphere encourage both ratebeer users and external commentators to buy into the idea that it’s in some sense objective and definitive.
I’ve come to view Ratebeer in much the same way I use, say, Goodreads – a useful resource to get a feel for a beer if I’m thinking of buying a bottle and there’s no real Twitter/blog discussion about it at the time. I find that I put more weight on the reviews than on the scores, as there is a leaning towards the stronger, hoppier beers, in much the same way I think Goodreads tends to lean towards YA/fantasy fiction. Perhaps this is where the analogy breaks down: remember that the RB reviews may be based on just sips or small samples, so take them with a considered pinch of salt.
The same principle holds true regarding bad reviews of beer as it does for bad book reviews in my opinion, they’re for the consumer/reader, not the brewer/author.
If you hired a market research firm to gauge consumer opinion of your product and they A) used a skewed sample and a simplistic methodology and B) published the results to your customers and your competitors, I’m not sure you’d count yourself lucky even if they waived the fee.
The truth about online rating systems sits somewhere between ‘amazing peer to peer democracy’ and ‘utter misleading chaos’. It is reasonable for brewers to continue to lob grains of salt into the process, to prevent the conventional wisdom leaning too much towards the former.
When I was in production I did look at the ratings sites every now and then, but I didn’t find the information particularly helpful. For obvious reasons you get the opinion of beer geeks, so the highest ratings go to the most beer-geek-friendly beers, and the ratebeer score was in no way related to the much more important figure, which I always kept my eyes on: the amount of beer sold.
I agree with a lot of the comments above. The ratings are totally skewed towards strong beers – especially stouts, it seems. An imperial stout decently brewed seems to always get a great score. One of my favourite beers from a time living in Bavaria is Augustiner Helles – an awesome (but quite simple) beer. It gets 3.05 on RB, barely above the 2-point-something which is where I think a ‘bad beer’ should be. I find that shocking.
That said, I’ve reviewed 100-ish beers on RB – more for my own record than anything else. I also use it to try and develop my own palate – reading the comments and seeing if I can taste those things myself. Some of the comments do make me laugh though…
It’s not really market research (and I say this as someone in the MR profession). OK, it sort of is, if you wanted to understand the opinions of a particular group. Everyone above seems to have jumped on RateBeer, but it equally applies to Untappd, Beer in the Evening (for pubs) and all those other third-party review sites.
These are all useful as filters for things that are terribly wrong, or if you seek adulation, but the reviews and comments of the ‘long tail’ of reviewers tend to cluster at these two ends of the satisfaction scale.
I say long tail because in any community of this sort a large percentage of the reviews are submitted by a tiny minority of people. Which means that if you pay heed to RateBeer, for example, you are listening to perhaps the 5% most influential people within a community that might represent only a couple of percent of beer drinkers.
If I were a brewer, while I wouldn’t ignore these types of comments, I would use sales figures to understand what is popular, but then get out to the pub, or find another way of engaging with customers (survey at brewery shop or chatting to as many drinkers as possible, for example) to get a good range of feedback.
Ratings / review sites are good for a quick scan, but for real feedback and customer understanding they should be avoided.