Cellar Tracker wine scores

Perfectly put on all points.

Agree with Marcus Stanley above, very well noted on a wine drinker's psyche.

CT comes into play in almost all my purchases, but I don't just look at averages anymore (I know, I was young and dumb). It is extremely useful to read the notes and check the wine descriptions and the consistency among them. I also compare a given vintage's score with other vintages, again to check for consistency.

Good advice on checking out users with a similar palate and becoming a fan. I added Keith Levenberg as well and I'll dive further into that :wink:

Great post.

Like some of the above, I use CT to see if a wine is approaching, at or past its peak. I also use it similarly for buying older wines. The ratings for me are an unimportant guide, not to be taken too seriously.

When it comes to affordable new releases of wines I have not tried, I still rely almost purely on WA ratings. There is so much wine out there that it's hard to find the hidden gems without tasting 10k+ wines per year, or at least leveraging someplace that does, and you are unlikely to find the most credible reviews of those lesser-known affordable wines on CT, if you can find any at all.

After that, I just try to learn my desired region/varietal and its producers well enough, along with the vintages. I'm pretty confident I will be buying Brunellos from Cortonesi, Talenti, Il Poggione, Uccelliera, Casanova di Neri, and likely a few others when the 2016s are released.

My favorite wine from Christmas Eve and Christmas Day was not rated on CT or WA: 2015 Tommaso Bussola Recioto della Valpolicella Classico. Even when you do find a CT review for a lesser-known region or varietal, it is likely from someone who has no point of reference and is trying it for the first time or has only had a handful, so the review usually describes whether the reviewer likes that type of wine rather than the quality of the wine itself.

Let us say I am interested in buying a young wine that needs 20 years of age. I look on CellarTracker and see scores. Are the scores based on how good the wine tastes now (probably primary and tannic), or are they an estimate of how the wine will taste in 20 years? The likely answer: it depends on the poster. So the average is often a mix.

This one is a big thing I've noticed. Cali cabs are almost always scored higher on average than amazing Burgundy. A Cali cab scored 94 is typically meh to me, whereas a Burgundy at 92+ is gonna knock your socks off. I wonder if the Cali part has partly to do with the fruit you mentioned, and partly with the fact that a lot of those drinkers pay attention to the score as mentioned earlier, whereas Burg drinkers probably don't as much.

This is one of the frustrating things about ever using the average. There's that inevitable "pretty wine… 83" or "needs more time, hard to tell now… 97" throwing things off. If I'm really interested, I will typically do a kind of running average on my own in my head and throw out any obvious outliers. But that's again where reading the note and knowing the reviewer help!

I’ve noticed this too and it seems to be particularly pronounced among the Paso aficionados.

A case study on a Keith note leading to success: Earlier this year I was browsing my LWS and saw a good price on 2016 Durfort Vivens, which I only knew as a second growth that received little attention. My first stop was here (WB), where there were few comments posted about Durfort, and none on recent vintages. Second stop, CT, where the average score was a 91.x, with a bunch of high-80s from random people. Pretty low and/or undistinguished for classified 2016 Bordeaux.

However, Keith had scored it a 96, which signaled a lot more than the average CT rating. (Second hint was Jeff Leve scored it a 92, which also meant that it wasn’t likely to be a modern extraction bomb!). I picked up a couple bottles to try… and sure enough, it turned out to be great to me (old school structure, beautiful aromatics and fruit, and not unapproachably austere). I thought it better than Calon Segur at less than half the price, and went back to clean out the bin.

In this case, Keith's detailed description was helpful for sure, but his score was the eye-catching bit that got me to spend more than 5 seconds on the CT page before moving on.

And even better, that led to you posting it here and reminding me to buy some! I loved the 2015 and for some reason never grabbed any of the 2016.

Exactly how I feel. While laypeople may be knowledgeable, I'd go with a pro's opinion; the same holds for a medical question as for a wine one.

I do not post scores on CT. For me, it's more about what the descriptors say relative to my palate preferences, so I want to know about color, aromas, taste, feel, and finish, plus any other characteristics the reviewer chooses to include, especially flaws.

I would say it would be “Drank with Cheryl and Chris”, for example, with no commentary on the wine or score.

Got one better. A taster today had all tasting notes with:

To (insert name)
To (insert name) Christmas 2020

That second note adds even more detail!

I have this argument with a friend all the time, but my view is that aggregate CT scores are completely useless; individual reviews from people whose palates I trust (even if I don't agree with those palates) are very useful. If someone I know who likes very oaky, big northern Rhones gives a wine a huge score, I'd stay away, for example. My usual example for this is 2003 northern Rhones, like Chave and Sorrel, which consistently get superlative CT rankings but which I find downright undrinkable.

I think it’s a combination of the “big fruit bomb” effect and the culture of the drinkers. You can tell it’s related to fruit because within California more structured and elegant producers also get lower scores. E.g. Pride Mountain Reserve Cab consistently gets Cellartracker scores that are higher than Ridge Monte Bello. Until recently Caymus Special Selection got scores right around Ridge MB, slightly higher in many years actually. I don’t think most here would share that quality judgement.

And the semi-annual reminder that there is a report button on every tasting note page. We remove these and help the user use CT in a more productive fashion.

I am disappointed that I have not gotten a complaint about one of Keith’s scores ruining the community average in a while. Step it up, man!

Do you think that has anything to do with the fact that the people writing notes on newer Caymus have bought it because they "know" Caymus is a huge name they should enjoy but not much else, so they convince themselves they enjoy it, whereas Monte Bello buyers know it's one they should enjoy but are also experienced enough to be more discerning? The people I know who have opened Caymus recently wouldn't know if it was actually better than their everyday wine or not.

With restaurants in Europe, my favorite source of information is this board. I have gotten great recommendations from the Travel and Restaurant Forum.

I literally don’t even look at scores. Plenty of others have explained all the reasons I find them completely and utterly useless. I’d actually turn off viewing the scores at all if that was an option on CT.

A proper tasting note helps tell me whether I want to drink or hold a wine, or whether I might want to buy it. You read enough notes for a wine and even given the variability of people’s palates you’re still able to get a sense of its style, which is the most important facet for me.