Testing Methodology


How We Test Golf Gear

A Foresight GCQuad launch monitor, a permanent testing bay, and a minimum of 1,000 shots per club before we write a word. Every product in our reviews database has numbers behind it — not impressions, not vibes, not press-day talking points.

1,000+ products tested · 5 rounds minimum per product · Purchased or returned (rare exceptions disclosed)

Most golf reviews you read were written after a two-hour manufacturer event. Ours are written after weeks of controlled testing, real rounds, and enough data to be genuinely useful rather than merely promotional. Here’s exactly how we do it.

1,000+ shots per club · 5 rounds minimum · 3 testers per product · 0 brand-supplied images

Our Testing Lab

Since 2017 we’ve operated a permanent testing bay in our studio. The core stack:

  • Foresight GCQuad. Our primary launch monitor: a quad-camera photometric system (camera-based, not radar). Calibrated quarterly.
  • TrackMan 4 (as a cross-check). We don’t trust a single data source. When Foresight and TrackMan disagree by more than ~2% on the same metric, we re-test; a sketch of that check follows this list.
  • SAM PuttLab. For putter roll, loft, face rotation, and delivery.
  • Pressure plate + 3D mocap. When we assess swing trainers, we measure actual weight shift and kinematic sequence — not “it felt good.”
  • Real courses. Indoor data is a baseline. Every club is played for at least five real rounds across two different courses before we publish.
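
Here is a minimal sketch of that cross-check in Python. The function, metric names, and readings are our illustration (neither Foresight nor TrackMan exposes an API like this); the only figure taken from the methodology above is the ~2% threshold.

    # Minimal sketch of the launch-monitor cross-check described above.
    RETEST_THRESHOLD = 0.02  # re-test when the two sources disagree by more than ~2%

    def needs_retest(foresight: dict[str, float], trackman: dict[str, float]) -> list[str]:
        """Return the metrics on which the two launch monitors disagree
        by more than the threshold, relative to their mean reading."""
        flagged = []
        for metric in foresight.keys() & trackman.keys():
            a, b = foresight[metric], trackman[metric]
            midpoint = (a + b) / 2
            if midpoint and abs(a - b) / midpoint > RETEST_THRESHOLD:
                flagged.append(metric)
        return flagged

    # Hypothetical session averages for one club
    gcquad = {"ball_speed_mph": 152.1, "carry_yds": 248.0, "spin_rpm": 2650}
    trackman = {"ball_speed_mph": 151.4, "carry_yds": 247.2, "spin_rpm": 2810}

    print(needs_retest(gcquad, trackman))  # ['spin_rpm'] -> re-test before publishing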

Who Does the Testing

Every product is tested by at least three golfers representing different skill levels. Our testing pool:

The three-tester rule

  • A low-handicap tester (0–5) — usually a PGA teaching professional from our partner network.
  • A mid-handicap tester (8–15) — the golfer most likely to buy the product.
  • A higher-handicap or beginner tester (18+) — because forgiveness claims matter most to the people who need them.

Each tester hits the product and its two closest competitors in the same session, in randomised order, with the brand logos masked where possible.
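
To make the randomisation concrete, here is a minimal sketch of how a session order could be drawn. The function name, club labels, and the idea of logging a seed are our hypothetical illustration, not a description of our actual session software.

    import random

    def session_order(review_club, competitors, seed=None):
        """Shuffle the review club and its two closest competitors so the
        tester does not know which slot the review product occupies."""
        clubs = [review_club, *competitors]
        random.Random(seed).shuffle(clubs)  # a logged seed makes the order reproducible
        return clubs

    # Hypothetical session: the club under review plus its two closest rivals
    print(session_order("Driver under review", ["Competitor A", "Competitor B"], seed=7))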

How We Source Products

You’ll find three kinds of product in our reviews. We’re always explicit about which is which.

  1. Purchased retail. The default. We pay full market price — same as any reader would — and the purchase receipt is retained. This is the majority of what we cover.
  2. Review sample, returned. Occasionally a brand sends a sample. We test it, publish, and send it back. The review states “sample provided, returned” at the top.
  3. Review sample, retained. Rare. Used only when the item has no secondary market or the brand cannot accept returns (e.g., custom-fit clubs). The review states “sample retained, valued at £X, not gifted for coverage.”

We have never received — and would never accept — a fee, a trip, an event, a gift, or any form of compensation in exchange for coverage. Our expense policy is published alongside our editorial standards.

The Scoring Rubric

Every club, ball, bag, and accessory is scored on a category-specific rubric published with the review. Weights vary by product class — you’d score a putter on “feel” more than “distance” — but every rubric is visible before you read the score.

Example: Driver rubric (out of 100; a worked score calculation follows the list)

  • Distance (25): peak carry, total distance, ball speed retention on mishits.
  • Forgiveness (25): dispersion across the face, measured off-centre performance.
  • Feel & sound (15): averaged tester ratings at impact.
  • Adjustability (10): hosel, weight, CG options.
  • Looks at address (10): panel rating; personal but shown transparently.
  • Value (15): performance per pound vs. same-tier competitors.
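
To make the arithmetic behind a published score concrete, here is a minimal sketch of the weighted total. The weights come from the driver rubric above; the 0–10 sub-score scale and the sample sub-scores are our hypothetical illustration.

    DRIVER_RUBRIC = {  # category weights from the rubric above; they sum to 100
        "distance": 25, "forgiveness": 25, "feel_sound": 15,
        "adjustability": 10, "looks": 10, "value": 15,
    }

    def rubric_score(sub_scores, rubric=DRIVER_RUBRIC):
        """Weighted total out of 100, given per-category scores on an
        assumed 0-10 scale."""
        assert sum(rubric.values()) == 100
        return sum(weight * sub_scores[cat] / 10 for cat, weight in rubric.items())

    # Hypothetical sub-scores for one driver
    example = {"distance": 8.5, "forgiveness": 9.0, "feel_sound": 7.5,
               "adjustability": 6.0, "looks": 8.0, "value": 7.0}
    print(rubric_score(example))  # 79.5 out of 100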

The Photography Rule

Every photograph in our reviews is taken by us. If you see a product image on this site, we physically had the item in hand. We don’t use manufacturer press shots — they’re polished, retouched, and don’t show what you’ll actually unbox.

“Brand-supplied images hide the tiny details — the uneven paint, the plastic seam, the scratch that appears on the third range session. You deserve to see what we saw.”

Data We Publish

Where it’s useful, we publish the raw data behind our conclusions. For club reviews, that includes:

  • Ball speed, launch angle, spin rate: mean and standard deviation across all test shots (see the sketch after this list).
  • Dispersion plots, built from shot data exported straight from the GCQuad.
  • Tester-by-tester notes, unfiltered.
  • Comparison against the product’s most likely competitors, same conditions.
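
For the first item in that list, the published numbers are the per-metric mean and standard deviation over every recorded shot. A minimal sketch, with hypothetical shot values:

    from statistics import mean, stdev

    # Hypothetical recorded shots: (ball_speed_mph, launch_deg, spin_rpm)
    shots = [
        (150.2, 13.1, 2540), (148.7, 12.6, 2710),
        (151.0, 13.4, 2480), (149.5, 12.9, 2650),
    ]

    for name, column in zip(("ball_speed_mph", "launch_deg", "spin_rpm"), zip(*shots)):
        print(f"{name}: mean = {mean(column):.1f}, sd = {stdev(column):.1f}")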

You can request the underlying session files from any review by emailing us. We keep them indefinitely.

What We Don’t Test

Methodology is about knowing your limits. We don’t test:

  • Golf balls in extreme conditions (below 40°F / about 4°C). Our data is warm-temperature only, and we say so.
  • Durability beyond 30 rounds for most clubs. Long-term wear claims are labelled as anecdotal.
  • Performance at Tour swing speeds (120+ mph). We partner with PGA Tour testing outlets when coverage requires it and cite them clearly.

Our testing evolves. When we change methodology — a new launch monitor, a revised rubric, an expanded tester pool — we note it here and flag affected older reviews for re-test.

James Caldwell
Founder · PGA of America Class A
Last updated: 15 Apr 2026