Medallurgy: wine competition gold is as good as chance
About half of the wines entered into at least three wine competitions bring home a gold medal. But of those winning a gold, 84 percent win no further medal at another competition. Thus, “winning gold medals may be more a matter of chance than a predictor of quality.”
Such are the findings from a paper published in the current issue of the Journal of Wine Economics. Robert Hodgson (pictured), the paper’s author, is a professor emeritus of oceanography at Humboldt State University. He also co-owns Fieldbrook Winery in Humboldt County, which “normally produces about 1000 cases per year. Though small, the winery has earned distinction by winning many awards in state and national competitions.”
In fact, it was his personal experience of winning medals and then coming up empty-handed that led to his quantitative analysis of 13 wine competitions, as he told Reuters. The paper says that there are about 29 wine competitions in the United States; for the 13 that he studied, entry fees exceeded $1 million.
Other research has shown that consumers’ buying decisions are slightly but positively influenced by medals, which placed sixth out of thirteen variables (ahead of front labels and shelf talkers).
What do you think explains this disparity: something inherent to wine competitions, the nature of blind tasting, or a lack of consensus about wine quality?
Links to the abstract and full paper in PDF
On September 1st, 2009 at 3:25 pm, Dylan wrote:
Regardless of the outcome, I don’t think you’ll find anyone wanting to stop pursuing the gold medal. And if we did, then we’d see an entire industry shift its definition of success to be based on consumer satisfaction. All of a sudden the unfettered feedback from winery mailing lists could become the greatest source of future sales.
On September 1st, 2009 at 3:28 pm, Steve Linne wrote:
Very interesting. Look at the whole paper; it’s very enlightening. I will think twice before spending the money.
On September 1st, 2009 at 3:38 pm, Grant wrote:
Quality of judges is the big x-factor. It is a mostly unpaid gig, so finding competent judges willing to give up their time FOC for the ever-growing number of wine competitions is a real challenge. That is one area of inconsistency. The others are blind tasting, where and when a wine is tasted in the line-up (fatigue, tannin build-up on the palate in huge classes), and the personal style preferences of judges on specific panels.
On September 1st, 2009 at 4:24 pm, Benito wrote:
I got a phone call from a friend who was trying to choose between two wines. One had won a gold medal, the other two silvers and a bronze. He asked which was better. My reply?
“Pabst Blue Ribbon.” And then I told him to talk to the shop owner and ignore the labels.
I will admit with full honesty that when I was just starting to purchase wine 12 years ago, I was often drawn towards wines with medals (which also happened to be fairly inexpensive), and would even point out the award to friends while pouring. “This got a silver at the Pocatello County Fair in 1993!”
There’s a lot of fear and nervousness among novice wine buyers–What if it’s awful? Did I just waste $15? If I hate it, do I have to pretend that I like it? At the time I saw the medals as a sort of validation that the wine had passed some test of quality. Certainly that strategy worked well for PBR from 1893 to the 1950s.
On September 1st, 2009 at 4:34 pm, Dr. Vino wrote:
Benito – props to you for the Pabst reference! One of my faves too
http://www.drvino.com/2006/09/06/st-emilion-revised-edition/
On September 1st, 2009 at 5:00 pm, Benito wrote:
I checked the source paper to make sure they didn’t make the same point about PBR, but I neglected to check the prior posts of the good professor. 😉
It’s true though, and most of your big macrobrews (that are flavorless and boring) bear titles like The King of Beers, The Champagne of Beers, The Banquet Beer, etc. Same thing happens with whiskey–there are some real rotgut whiskeys out there with labels covered in medals.
On September 1st, 2009 at 6:16 pm, Betty wrote:
When I lead in-home wine tastings, I encourage the guests to trust what they like, not what the judges like.
While judges might be in a better position to evaluate wines than many of the rest of us, they are not gods. And when you’re trying upwards of 50 wines at a time, even a god might have some challenges distinguishing the great from the good.
On September 1st, 2009 at 10:08 pm, The Wine Mule wrote:
As a retail salesperson, I would say that the presence of a Medaille D’Or sticker on a bottle may sway a customer considering two similar bottles. To put it charitably, a medal on a wine might be a tertiary influence. Certainly nowhere near as influential as a big point score from The Spectator, the Advocate, or Decanter.
On September 2nd, 2009 at 5:37 am, Steve Raye wrote:
I just posted a comment on Alder’s blog regarding the same study. The point that this study and your and Alder’s posts make doesn’t take into account the practical aspects of selling wine at retail.
As a wine marketing guy, my responsibility is to sell wine. And in order to do that we have to get it on the shelf. Which means the retailer and the distributor are the real gatekeepers and arbiters of what options the consumer has in the first place.
So while medals might have little consistency or quantifiable accuracy, they do play a very simple, and very important role, of getting the wine on the shelf. This is especially true for wines from smaller producers that don’t have the leverage at the distributor or retail level. More often than not, a medal on a sell sheet is what makes the difference as to whether a wine makes it to the shelf.
So sure, medals may not be a reliable quality or value metric. But they do play a very important role on the commercial side. It may not be right, or fair, and may even be misleading to the consumer. But it is the way the system works.
And that being the case, wineries are pretty much required to enter as many competitions as they can, because a gold medal, even if it’s from the local county fair, can be the difference between getting the wine in front of the consumer or not.
On September 2nd, 2009 at 9:19 am, 1WineDude wrote:
Tyler, the report is bunk.
The conclusions may be right, but not based on analysis of that data! I’m posting a piece on this tomorrow on my blog.
On September 2nd, 2009 at 10:30 am, Cathy wrote:
Your question: What do you think explains this disparity: something inherent to wine competitions, the nature of blind tasting, or a lack of consensus about wine quality?
Yes, yes and yes, I think.
I see references to medals in tasting rooms most often (Michigan, Canada). They’re of interest, but don’t affect my buying because, well, I get to taste and decide what I like. At retail, I buy wine I already know something about in the grocery store, and I buy on recommendations in wine stores, where I’m usually looking for something I don’t know about.
I do most years, though, buy the best in class wines in the Michigan wine competition just to see if I agree with the judges.
On September 2nd, 2009 at 10:25 pm, Barry wrote:
I learned long ago not to trust medal-winning claims for any domestic wine. Call it prejudice, but I will avoid any wine that claims to have won a medal at the Los Angeles County Fair. Blecchhh!
On September 3rd, 2009 at 4:37 pm, Tynan Szvetecz wrote:
I wonder if it’s worth asking: what do wine judging competitions have in common with more reputable, even “legitimate” forms of wine review, such as those employed by magazines like Spectator, the Advocate, or Decanter?
Do they not suffer from the same challenges – “where and when a wine is tasted in the line-up (fatigue, tannin build-up on the palate in huge classes) and the personal style preferences of judges on specific panels”?
And does it therefore all just come down to a vehicle for marketing – getting your foot in the door, etc.?
On September 3rd, 2009 at 5:08 pm, Claude Robbins wrote:
This is one of the greatest challenges we face at the International Wine Guild, where we certify wine judges. To properly train judges to maintain consistency over time (and in particular during one long judging event) is a formidable challenge. But it is possible, and some judges are really quite skilled. Unfortunately, 95% of the wine judges at most competitions aren’t properly trained, and therein lies a sizable margin for error.
One note here – we suffer from some cultural shortcomings in the United States that give our citizens less opportunity for proper training than people sometimes grow up with in Europe. I see a trend towards education and consistency that will build over time, as more and more people in the US engage with wine actively and get proper training in dealing with the rigors of judging many wines in one sitting.
For now though, taking judging medals with a grain of salt is certainly wise (unless you won the medal and it will help you sell your wine better, of course!).
On September 10th, 2009 at 10:25 am, Stevie wrote:
I think that this article about wine competitions is marvelous and long overdue. Like I’ve written on my blog, http://weirdcombinations.com/2009/09/taste-in-wine-is-subjective/ , taste in wine has always been a subjective thing. It’s akin to comparing a Picasso to a Michelangelo. Both works can be great for completely different reasons yet might not appeal to all of us equally. There are no universal wines that everyone would think great!
On September 21st, 2009 at 9:05 am, Big oaky monsters, imports, burcak, medals – sipped and spit | Dr Vino's wine blog wrote:
[…] Hodgson! A Fresno State student wine wins “record” 49 medals. [Collegian] Permalink | […]
On November 16th, 2009 at 11:06 am, WSJ: wine-rating system is badly flawed | Dr Vino's wine blog wrote:
[…] Hodgson’s research on the randomness of gold medals in wine competitions. In case you missed our discussion here and many others on them there internets, you can check out the WSJ article for a recap. The story […]
On November 16th, 2009 at 1:28 pm, Doug Goodwillie wrote:
As a pretty heavy consumer of wine, what really surprises me here is that I have never paid any attention to the results of any of these competitions and very rarely even see them referenced.
That said, I am not at all surprised at the results of this research and the conclusions drawn.
If I’m interested (and I often am) in the consensus of a committee of tasters, I’ll visit CT. Beyond that, I find that WA, IWA & BH are reliable sources once you are familiar with the preferences and idiosyncrasies of the various individual critics providing the TN and the resultant ratings.
On March 23rd, 2012 at 1:39 pm, The experts strike back? | Dr Vino's wine blog wrote:
[…] Princeton wrote an hilarious essay entitled “On Wine Bullshit.” Bob Hodgson had his two devastating papers about wine competitions. The Wine Trails books suggest high-volume, low-priced wines are all that you’ll ever need. […]
On March 24th, 2012 at 1:48 pm, Gerald wrote:
Unreliable judges explain the results.
An unreliable judge does not give the same score to the same wine when the wine is judged repeatedly.
Very few judges are reliable. This was demonstrated.
We suspect that many wine critics are unreliable. It would be easy and inexpensive for critics to test reliability.
Scores and medals are marketing tools. Most of us drink the score.
On June 22nd, 2012 at 6:22 am, Awards for English Wines 2012 – consolidated | wrote:
[…] 2012 wine award results are out, we can decide which wines are best, right? Well, no. Awards can be inconsistent between competitions. Formal studies have been written which conclude that luck plays a large part. However, as a […]