Tom Vanderlip is the director of golf at Peninsula Lakes, a pretty solid public golf course in the Niagara Falls region owned by the same family that runs Copper Creek in Toronto. For whatever reason, he also, on occasion, writes a column called “As seen by TV” (Tom Vanderlip, get it?). Anyway, a fellow golf scribe sent me his latest, which amounts to an interesting rant on golf rating and writing. Since he waded into this mess, I’ve decided to take the time to reply.
T V just AS SEEN BY my thoughts…
1. If you play a golf course you’re ranking, you have to PAY! No exceptions. If you’re trying to help the average golfer, you need to know where they are coming from.
This is great in concept, but rather than opening up ratings to “the average golfer,” as Vanderlip says, it actually limits the process to the sort of person who can afford to pay thousands of dollars a year to play places like the one Vanderlip runs. The other issue, of course, is that none of this works for private golf courses. Those places have to offer an invite, or the rater must be a guest of a member, or they will never actually get to play.
2. Somewhere in your article you must mention what you shot and what tees you played. In addition to this, you cannot pick up anywhere! If you fire triple digits, so be it! The difficulty of a course is important to people. If you state that you are a ten handicap and you shoot 95, we can determine that either you had a rough day, or the course is tough.
Funny that Vanderlip assumes the ability to play a golf course and the ability to assess whether it is any good are one and the same. Tom Doak, for example, isn’t a great golfer, but he is one of the best golf course critics in history. Do I care what he shot when he played Cypress Point? Of course not. Golfing ability and score have nothing to do with one’s capacity to determine what is, and what is not, a good golf course. This is actually where Golf Digest and I differ, despite my being on its ratings panel. Golf Digest insists that only low handicappers sit on its panels and that they play the back tees when ranking a course. This seems one-dimensional to me. Of course, there are people who determine the merits of a golf course based on what they shot that day. Most of those people, in my experience, are golf pros.
3. You or anyone in your immediate family cannot be associated with the publication in question. It seems somewhat ironic the courses with the most ad space always get tons of votes. You never see a golf course with a full page ad finish off the charts.
I assume what Tom is trying to say is that the administrators of a publication who also act as raters can be influenced by advertising sales. For the most part, this is just bullshit. I don’t know of any instance of this happening at a respectable golf publication, and I’d like Vanderlip to prove his point with some evidence that ratings can be bought in Score or Ontario Golf or Golf Digest.
4. The ranking is done exclusively by the “average everyday Joe recreational golfer”. Why do we need course architects, golf industry advisors, golf business leaders, golf pro’s, golf writers and such? Not one of these guys has paid for a round since Persimmon Woods, yet they are supposed to determine what “good value” is?!? Can you get better than free? Puh-leez! Let the guy that works all week and shells out $50 to $150 for a round tell us what good value is.
I’m a big fan of having the “average golfer” on panels. However, the average golfer who has actually played everything from Highlands Links to Capilano usually turns out not to be the average golfer. Often they are affluent private club members who have sought out the best Canada has to offer. Vanderlip can chastise the “golf pros (no possessive needed, Tom), golf writers and such” who sit on these panels, but they are often the only ones to see a broad enough spectrum of courses across the country. And since when were golf course ratings about “good value”? Maybe this is a pointed remark about Ontario Golf’s recent “Best Value in Ontario” cover.
5. Have a category that states “Did you have a good time”. Who really cares about shot values, or the drive into the course, the colour of the ball washers, the starter or the front gate? Just ask them if they had a good time and would they recommend to a friend. Isn’t that really enough?
In fact, Score Golf weighed “fun” as a big factor in its latest rankings.
6. Private and public courses should never appear in the same ranking. You think that a high-end, ultra private course that produces 20,000 rounds of golf should be compared to a small public access golf course with a limited budget, but 45,000 rounds played annually?
I’m wondering where, with the exception of Don Valley, the public courses doing 45,000 rounds are. In the era of high-end public courses like Copper Creek — the one owned by his bosses — it is clear that some public courses stack up neatly next to the private tracks. I really don’t think this is much of an issue.
7. The use of adjectives when writing about a golf course should be done so very carefully. Can a layout really be sophisticated or inspired and can bunkering be magical and robust? Really?
I love this bit. Sure, a golf design, like a great painting or a song, can be “inspired” and “sophisticated.” I don’t know about “robust” bunkering (I think of a computer network as “robust,” not the bunkers at St. George’s), but I’ve seen magical bunkering. Maybe Vanderlip needs to leave the Niagara region a bit more. I guess Tom is a big fan of the Hemingway school of writing — the fewer adjectives the better. Now that would be a lot of fun to read, wouldn’t it?
That said, far too often golf writing, especially when it comes to courses, is prosaic and overdone. So maybe he’s got a point here…
8. The ranker or panelist or whatever we choose to call them, must indicate the time of day and the time of year that they played. Trust me, if a guy is playing in a toque and sweater, he’s not going to have the same experience as a guy playing at 7:30 AM on a Tuesday in June. I know for certain that some rankers have lost track of time and rushed around at the end of the season to play a course.
This is an interesting point, especially given the degree to which a rating (particularly in Score’s case) is determined by course conditions. When a rater plays a course will unduly influence this. I actually tend to think most major golf courses, public or private, are going to be in respectable shape the vast majority of the time. I, for one, think conditioning is an overrated factor in determining the merits of a course.
9. Rankers should play completely anonymously. Golf. Have a beer. Leave.
See my remarks on point four.
10. If the Ryder Cup and Presidents Cup are held every two years, the Olympics and World Cup every four years, do we really need a new ranking every year? Can we not just get a ranking every five years or so? Wouldn’t it make it that much more special and give new courses a chance to properly mature and develop, instead of getting branded with a bad ranking because there was a particularly bad growing season?
New rankings aren’t done every year by any publication I know of, so I’m not sure where Tom is coming from here. In Ontario, OG does its rating every other year (next summer is the next one), while Score’s was earlier this year. These are separate publications. As for “Best New,” Vanderlip is, of course, aware that courses participate in these sorts of ratings for exposure, in the hope they can win and use it in their marketing.
Just my thoughts,
Of course all of this is interesting, Tom, given that Pen Lakes’ website has a section called “Reviews and Awards,” where the club notes all of its individual accolades. That includes its Golf Digest 4 1/2 star rating, which the club seems quite proud of.
Apparently ratings are only bad when your club isn’t included in them or when they can’t be used in your marketing. By the way, what in the world are the “Niagara Golf Awards,” and who determines them?
Overall, I think Vanderlip has a point. Both of Canada’s major golf course ratings (Ontario Golf and Score) are, in my opinion, flawed. There are too many people involved with clear conflicts of interest, or who simply don’t take the process seriously. I’ve spent most of the past decade seeking out interesting golf courses across Canada, and I know lots of others who do the same. I think these people really try to dig into what makes one golf course great and another average. That’s why golf architects Jeff Mingay and Ian Andrew, along with myself and golf entrepreneur Ben Cowan-Dewar, compiled our own Best of Canada list for Golfclubatlas. You can find it here.
You can read Tom Vanderlip’s “As seen by TV” here.