CanadianGolfer.com

The great course ratings debate: Tom Vanderlip wades in

Tom Vanderlip is the director of golf at Peninsula Lakes, a pretty solid public golf course in the Niagara Falls region owned by the same family that runs Copper Creek in Toronto. For whatever reason, he also, on occasion, writes a column called “As seen by TV” (Tom Vanderlip, get it?). Anyway, a fellow golf scribe sent me his latest, which amounts to an interesting rant on golf ratings and writing. Since he waded into this mess, I’ve decided to take the time to reply.

AS SEEN BY TV: just my thoughts…
1. If you play a golf course you’re ranking, you have to PAY! No exceptions. If you’re trying to help the average golfer, you need to know where they are coming from.

This is great in concept, but rather than opening up ratings to “the average golfer,” as Vanderlip says, it actually limits the process to the sort of person who can afford to pay thousands of dollars a year to play places like the one Vanderlip runs. The other issue, of course, is that none of this works for private golf courses. Those places have to offer an invite, or one must be a guest of a member, or the rater will never actually get to play.

2. Somewhere in your article you must mention what you shot and what tees you played. In addition to this, you cannot pick up anywhere! If you fire triple digits, so be it! The difficulty of a course is important to people. If you state that you are a ten handicap and you shoot 95, we can determine that either you had a rough day, or the course is tough.

Funny that Vanderlip assumes the ability to play a golf course and the ability to assess whether it is any good are one and the same. Tom Doak, for example, isn’t a great golfer, but he is one of the best golf course critics in history. Do I care what he shot when he played Cypress Point? Of course not. Golfing ability and score have nothing to do with one’s ability to determine what is, and what is not, a good golf course. This is actually where Golf Digest and I differ, despite my being on its ratings panel. Golf Digest insists that only low handicappers sit on its panels and that they play the back tees when ranking a course. This seems one-dimensional to me. Of course, there are people who determine the merits of a golf course based on what they shot that day. Most of those people, in my experience, are golf pros.

3. You or anyone in your immediate family cannot be associated with the publication in question. It seems somewhat ironic the courses with the most ad space always get tons of votes. You never see a golf course with a full page ad finish off the charts.

I assume what Tom is trying to say is that the administrators of a publication who also act as raters can be influenced by advertising sales. For the most part, this is just bullshit. I don’t know of any instance of this happening at a respectable golf publication and would like Vanderlip to prove his point with some evidence that ratings can be bought in Score or Ontario Golf or Golf Digest.

4. The ranking is done exclusively by the “average everyday Joe recreational golfer”. Why do we need course architects, golf industry advisors, golf business leaders, golf pro’s, golf writers and such? Not one of these guys has paid for a round since Persimmon Woods, yet they are supposed to determine what “good value” is?!? Can you get better than free? Puh-leez! Let the guy that works all week and shells out $50 to $150 for a round tell us what good value is.

I’m a big fan of having the “average golfer” on panels. However, the average golfer who has actually played everything from Highlands Links to Capilano usually turns out not to be the average golfer. Often they are affluent private club members who have sought out the best Canada has to offer. Vanderlip can chastise the “golf pro’s (no possessive needed, Tom), golf writers and such” who sit on these panels, but they are often the only ones to see a broad enough spectrum of courses across the country. And since when were golf course ratings about “good value”? Maybe this is a pointed remark about Ontario Golf’s recent “Best Value in Ontario” cover.

5. Have a category that states “Did you have a good time”. Who really cares about shot values, or the drive into the course, the colour of the ball washers, the starter or the front gate? Just ask them if they had a good time and would they recommend to a friend. Isn’t that really enough?

In fact, Score Golf weighed “fun” as a big factor in its latest rankings.

6. Private and public courses should never appear in the same ranking. You think that a high-end, ultra private course that produces 20,000 rounds of golf should be compared to a small public access golf course with a limited budget, but 45,000 rounds played annually?

I’m wondering where, with the exception of Don Valley, the public courses doing 45,000 rounds are. I think, in the era of high-end public golf courses like Copper Creek, the one owned by his bosses, it is clear that some public courses stack up neatly next to the private tracks. I really don’t think this is that much of an issue.

7. The use of adjectives when writing about a golf course should be done so very carefully. Can a layout really be sophisticated or inspired and can bunkering be magical and robust? Really?

I love this bit. Sure, a golf design, like a great painting or a song, can be “inspired” and “sophisticated.” I don’t know about “robust” bunkering (I think of a computer network as “robust,” not the bunkers at St. George’s), but I’ve seen magical bunkering. Maybe Vanderlip needs to leave the Niagara region a bit more. I guess Tom is a big fan of the Hemingway school of writing: the fewer adjectives the better. Now that would be a lot of fun to read, wouldn’t it?

That said, far too often golf writing, especially when it comes to courses, is prosaic and overdone. So maybe he’s got a point here…

8. The ranker or panelist or whatever we choose to call them, must indicate the time of day and the time of year that they played. Trust me, if a guy is playing in a toque and sweater, he’s not going to have the same experience as a guy playing at 7:30 AM on a Tuesday in June. I know for certain that some rankers have lost track of time and rushed around at the end of the season to play a course.

This is an interesting point, especially given how much a rating (particularly on Score’s part) is determined by course conditions. When a rater plays a course will unduly influence this. I actually tend to think most major golf courses, public or private, are going to be in respectable shape the vast majority of the time. I, for one, think conditioning is an overrated factor in determining the merits of a course.

9. Rankers should play completely anonymously. Golf. Have a beer. Leave.

See my remarks on point four.

10. If the Ryder Cup and Presidents Cup are held every two years, the Olympics and World Cup every four years, do we really need a new ranking every year? Can we not just get a ranking every five years or so? Wouldn’t it make it that much more special and give new courses a chance to properly mature and develop, instead of getting branded with a bad ranking because there was a particularly bad growing season?

New rankings aren’t done every year by any publication I know of, so I’m not sure where Tom is coming from here. In Ontario, OG does its rating every other year (next summer is the next one), while Score’s was earlier this year. These are separate publications. As for “Best New,” Vanderlip is of course aware that courses participate in these sorts of ratings for exposure in the hope they can win and use it for their marketing.

Just my thoughts,
Tom Vanderlip
[email protected]

Of course, all of this is interesting, Tom, given that Pen Lakes’ website has a section called “Reviews and Awards,” where the club notes all of its individual accolades. That includes its Golf Digest 4 1/2 star rating, which the club seems quite proud of.

Apparently ratings are only bad when your club isn’t included in them or when they can’t be used in your marketing. By the way, what in the world are the “Niagara Golf Awards,” and who determines them?

Overall, I think Vanderlip has a point. Both of Canada’s major golf course ratings (Ontario Golf and Score) are, in my opinion, flawed. There are too many people involved with clear conflicts of interest or who simply don’t take the process seriously. I’ve spent most of the past decade seeking out interesting golf courses across Canada, and I know lots of others who do the same. I think these people really try to dig into what makes one golf course great and another average. That’s why golf architects Jeff Mingay and Ian Andrew, golf entrepreneur Ben Cowan-Dewar and I compiled our own Best of Canada list for Golfclubatlas. You can find it here.

You can read Tom Vanderlip’s “As seen by TV” here.


About the author

Robert Thompson

A bestselling author and award-winning columnist, Robert Thompson has been writing about business and sports, and particularly golf, for almost two decades. His reporting and commentary on golf has appeared in Golf Magazine, the Globe and Mail, T&L Golf and many other media outlets. Currently Robert is a columnist with Global Golf Post, golf analyst for Global News and Shaw Communications, and Senior Writer to ScoreGolf. The Going for the Green blog was launched in 2004.

15 Comments

  • Can we call TV’s concept the “Rodney Dangerfield Rankings”?
    If he sends his idea to the Toronto Sun they may make him editor of their annual golf guide.
    Then he can read his Average Guy reviews and try to make sense of them. Challenging! Fair!

  • Tom sounds like someone who thinks he knows exactly how the rankings are done, but it is clear he doesn’t. Furthermore, he sounds like he wants a top 100 “public” ranking, which I don’t fault him for and which would probably benefit the tier-two public courses that do not compete in the Canada top 100. We do not need another ranking in Canada, and there are plenty of courses represented in Score’s top 100 that are public. Rankings matter more to those in the industry than anyone else. If Pen Lakes is as busy as people say, why does he care where it is ranked?

    His article comes off as sour grapes for seemingly being left off the list and his arguments seem to be focused on creating a list that would benefit Pen Lakes.

    Lastly, he underestimates raters’ ability to separate their game and the conditions from their rating. This is exactly what the average golfer does not do, but in my experience raters do a good job.

    Aside from Pen Lakes, it would have been interesting to hear his top 5.

  • Thanks so much for this Robert. I read Tom’s article and laughed my ass off. Who is this guy and how out of touch is he?

  • WOW!
    Seems I hit a nerve with some people. Let’s get one thing straight: this wasn’t to boost the ranking of Peninsula Lakes. The golf course stands on its own. It wasn’t aimed at one specific magazine, website or newspaper. I just wondered if we could have a list that consisted of those things. Why is it that everything is so holy when it comes to course rankings? Sorry if I had an opinion on something.

    In addition to this, I was offended that you indicated I only wanted low handicap players. I just asked that you be honest about what you shot and go ahead and play whatever tee you want. I don’t care! Just be honest. Are you going to tell me that some courses aren’t extremely difficult and not suited for some players?

    It appears to me that we can all agree on one thing: the system isn’t perfect, but who or what is? I just gave my personal thoughts. As bad as you think they might be, I can assure you that I have received a number of e-mails from the “Everyday Joe Golfer” agreeing with me. I had just finished reading an article on new courses, and it came to me how unfair most ranking systems are, and I thought, why doesn’t someone do a list from the average guy? It could be a great promotion. Think of it: pay for him or her to fly to Capilano or wherever. Have a contest. I am sure you would get a million people signed up for a chance to be a rater. I am asking these pubs to be creative and come up with something new that really allows the average golfer a chance to have a voice. Every day I hear people coming through here telling me about courses they enjoyed or disliked. A completely unbiased opinion! Interesting.

    TV

  • TV,
    That is simply a popularity contest. The Score rankings are not meant to be a popularity contest but the result of several informed opinions. Not perfect, mind you, but at least the opinion of someone who has seen many courses. That often limits the panel to people who work in the golf industry. But which “Common Joe” has seen enough great courses to rank them properly? How does one rank courses in Ontario without having seen at least the majority of good tracks in the province?
    Having interacted regularly with the opinions of the Average Joe golfer on golf courses, design and value, I have discovered that they don’t have an informed opinion. Difficulty is usually overrated, esthetics are overappreciated and strategic elements are underappreciated or misunderstood.
    Value is a moving target depending on disposable income.
    Score publishes its Golfers Choice Awards, a popularity contest among Score Golf readers. Sounds like what you’re suggesting is already being done.
    See link
    http://www.scoregolf.com/rankings/gca/2005/winners.cfm
    So what’s the problem?

  • Kerry,

    You are going to tell the paying public that they don’t have an informed opinion? Let me tell you, they are much more savvy and informed than you think. Give them some credit. I talk to these people every day and I think they have a pretty good idea of what a nice golf course is. Would it kill one of these publications or websites or newspapers to have a contest (by region) to win a chance to be a course rater? How many zillion people do you think would love this opportunity? I just suggested that we give a voice to the people that pay the way for the golf industry: the average member, green-fee player or league player. Although you don’t seem to recognize it, they are the most important element in the grand scheme of things. Value is not a moving target… what on earth are you talking about? If you buy something and you think it is worth what you paid for it, that is good value. Simple as that. It could be a 150 dollar green fee or a 45 dollar green fee, and if you feel that it was worth the price… be happy!

    TV

  • Relax, TV is not out to get anyone. He is a huge supporter of golf and by no means was his intent to offend anyone. The fact is that if they read it weekly, they would see the humour behind it all.

  • TV,
    We disagree on the public’s savvy. I say they are less informed. There is more than one Toronto-area golf forum where the average Joes post course reviews, and they are pretty universal in their praise of “challenge” and “fairness.” What does that mean? What is a “nice” course?
    Your point seems to demonstrate that value is a moving target. For each individual, value is different. Those with higher disposable income see Copper Creek as good value. For the average Joe, it’s often overpriced. So “value” is often related to the income of the individual. It moves. That is what I am talking about.
    I understand that the golf industry survives on the average Joe and that they vote every day with their dollar, but it does not mean they understand what makes a golf course great.
    Score Magazine already does what you want, so what is the problem? As for getting an average Joe onto the panel, if one is available who has seen most of the courses, I am sure Bob Weeks is willing to entertain the idea.
    But if your whole rant boils down to one guy winning a contest to get on the Score panel, there’s not much meat on that bone.

  • Hey Rob. Regarding point #2 and your response.

    As a long time reader of your work, I can’t forget the articles you wrote for the NP where you would join a business exec on his home course and detail how handily you beat him. We readers were often given a hole by hole account of the day’s events along with the final tally. I’m disappointed you no longer feel it’s important to include such detail. It gave us guys that really know your game something to snicker about. (Did he really say he shot 70 something?). Keep up the good work.

  • Kerry,

    What the hell are we arguing about? You are confusing the points. Let me put them in point form, so maybe it is easier for you to follow along.
    -The average golfer means the average golfer. You seem to picture this person as some trunk slammer from the local muni. This is not the case. The average guy is a member at a private club, a regular green-fee player, a league player, a beginner, etc. It has absolutely nothing to do with income. I am only referring to everyone other than writers, pros, sales reps, golf association leaders, etc.

    -Trust me… if you think that there aren’t people out there who have played a great number of these courses, you are kidding yourself, because I hear every day where people have played and what they say, and believe me, they know good or great or average golf. Even if a course does only 30,000 rounds a year, I assure you not all of them are panelists. So somebody is playing all these great courses.

    -I never wanted a guy to win a contest. I just suggested a change to the current format, and judging by the e-mails I have been receiving they would welcome the opportunity.

    You should have a little more faith in the golfing public, of which you are a part; maybe then you would get invited to play some of these courses.

    TV

  • TV,
    Golf Digest has a ranking which is done solely by the public, with no panelists, and T&L Golf only has those rankings. Surely you are aware of these, as you have the 4 1/2 star GD rating on your website. So do you simply wish there were a Canadian-only version, or that this be the only form of rating?

    There are lots of websites where users can submit their reviews of golf courses, restaurants, movies, etc. Yet there are still movie reviewers, food critics who make a living writing on the subject. You could choose to go to a site where everyday people rank whatever and I can choose to read people who do it for a living.

    The same is true of golf courses. The criteria you spoke about above (especially value) are things I could not care less about when evaluating the best courses in the country, the world, etc. The same ranking that gave your course, Pen Lakes, 4 1/2 stars gave Pebble Beach, Bandon Dunes, Kiawah Ocean, Whistling Straits and Pinehurst 5 stars, which I have no problem with. However, such luminaries as Big Creek Golf & Country Club, Little Mountain Country Club, Madden’s on Gull Lake, TPC of Myrtle Beach, and Woodland Hills Golf Course were also among the 5 star winners. There is no way that TPC Myrtle Beach is in the same league as the others mentioned, but as you said, no rankings are perfect. Rankings submitted by the “average golfer” generally weigh heavily towards whether the service was good, whether they played well with their friends, whether the course was in good condition, whether they were in a good mood, etc. That is fine.

    However, if those are not issues of concern to me, why can’t Score, or whoever takes a different approach, fill the void I am looking for in a ranking?

    Lastly, as a CPGA member, I am sure you rarely pay full price for your golf. Do you find that not paying full price (or anything at all) clouds your opinion of the golf courses you play? If someone gets comped every time they play golf, do you think this puts their assessment skills at a disadvantage? Do you think movie critics who get sent DVDs to review at home cannot be as critical as the “average movie-goer” who buys a ticket? I guess I feel they can, and that is why I still look at rankings of the best courses every time they come out.

  • First, as a ranker for both Score and Ontario Golf let me say that I think it’s great that Tom (whom I consider a friend) has weighed in on the debate about rankings.
    No ranking system is perfect but I have to say that I think that both Score and Ontario Golf do a better job of it than most of the American golf publications.
    I think it’s downright silly, for example, that Golf Digest insists its rankers have single-digit handicaps.
    I’m not sure what percent of the golfing public have single digit handicaps but I’d bet it’s very small.
    That might explain why a couple of courses that have won Best New Course in Canada in Golf Digest can’t even crack Ontario Golf’s Top 50 in the province.

    I have to say that the point that I really disagree with Tom on is that rankers should report what score they shot.
    When I’m playing a course for rankings purposes I virtually never keep score.
    I don’t want to spend all my efforts concentrating on my game. I want to concentrate on the course and I want to see all of it.
    I don’t want to keep out of every bunker on the course. I don’t need to hit out of every one of them but I sure want to hit out of a few.
    I want to see what the vistas are like (if there are any) so I wander around a bit.
    That’s not to say that I don’t try to play the course as best I can, but I’m not so preoccupied with it that I don’t go through the checklist I have for every new course I play.
    I’m a pretty average player – about a 15 – and when I rank a course I try to look at it from all handicap perspectives.
    And I know, personally, that how I play doesn’t affect whether I like a course.
    When I played Smugglers Glen in Gananoque this year, which was up for best new course, I was hitting it sideways but I really liked it.

    I also know I like Tom’s course (Pen Lakes) a lot, yet it’s chewed me up on more than one occasion.

    I think Tom’s points – while I don’t agree with some of them – make for good debate.

    garry

  • TV,
    I am not confused, just unconvinced.
    I am, or have been, a member, a public player, a league player and a beginner. I have been fortunate enough to play many great golf tracks, private and public, in many countries. Not all, but more than average. Ask Robert. What courses am I missing an invite to?
    Go to any golf forum like Torontogolfnuts.com and read the course reviews. They overvalue conditioning and difficulty and judge courses based on many things other than the course itself. It is a forum of average golfers!
    Have some average Joes outside the industry played most of the best tracks in Ontario? Perhaps a few, but mostly the public clubs. I already stated that Bob Weeks would love to hear from them if they have played enough courses to determine whether Coral Creek is better than Osprey Valley or Westmount.
    Score golf already has a Best Value in the Golfers Choice Rankings.
    Atlantic: Highlands Links
    Quebec: Carling Lake
    Ontario: Lakeview GC
    West: Kananaskis Country GC
    B.C.: Golden GC
    I don’t have any issue with that ranking on a value basis. I think Kananaskis is good value but not a particularly strong golf course from a design point of view. But for what an Albertan pays for it, I understand its place. It’s no Jasper Park or Banff Springs from a course perspective.
    Lakeview is a fine value for the GTA. A few truly interesting holes and some classic par 3s.
    Perhaps we value different things in a course review and cannot appreciate the same perspective.
    We agree to disagree.
    Good luck!

