The carbon-fiber-reinforced, lightweight shaft was developed in 1969 by the Shakespeare Sporting Goods Company, where Frank Thomas (2011) was then Chief Designer. Since that time, carbon-fiber shafts have undergone continuous development, particularly with respect to improved stiffness-to-weight ratio; shafts are now available in the 50- to 60-gram range. Despite claims to the contrary, this has produced only very small increases in swing speed, and likely no increase in ball speed, compared to even the heavier hollow steel shafts, which are now almost never used in drivers. However, the fiber-layup construction of shafts has allowed the stiffness distribution along the shaft to be customized for different player abilities. In particular, regions of increased flexibility are positioned toward the end of the shaft to promote forward rotation of the club head for increased “dynamic” loft. These issues will be discussed fully in Chapter 3.
As mentioned previously, in 1995 the Callaway Company introduced the first high-strength titanium alloy driver with a thin diaphragm-spring face. It is interesting that Callaway adopted titanium alloy as a means of further increasing the volume, and therefore the moment of inertia, of their hollow Big Bertha™ range of drivers while keeping the weight at the accepted value of approximately 0.44 pounds. Popular lore has it that the increased ball speed, which resulted from the excellent spring quality of the titanium alloy and allowed the thin driver face to act like a trampoline, was an unexpected bonus. This may, in fact, be true, because Callaway did not file a patent on the trampoline effect, with disastrous consequences for the company. By 1997, after spectacular growth from a few million dollars to almost one billion dollars in sales, Callaway was selling more golf products than almost all other golf companies combined. However, by 1998, eighteen oversize titanium drivers, from thirteen manufacturers, were competing for market share, and Callaway's share was in decline. It is interesting to conjecture that, with a robust patent, Callaway would likely have dominated the entire golf market up to this time.
As Thomas (2011) reports, one of Callaway's main competitors informed the USGA that they were finding unusual performance results in their own testing of the titanium club. This performance improvement was the subject of a 2001 article by Michal and Novak, the main focus of which was to demonstrate that even better performance might be possible through the use of amorphous metals, often called liquid metals, in driver head design. Michal and Novak used a measure of the trampoline, or more correctly diaphragm, spring quality (Dieter 1983, Ashby 2005, Dewhurst and Reynolds 1997) to compare different materials for club head design. They showed that liquid metals, particularly Vitreloy™ by Liquidmetal Technologies (2011), had the potential for superior club face performance. However, attempts to produce a golf club with this material were unsuccessful, apparently through a combination of fatigue-failure problems and difficulties in bonding the faces to driver bodies. When the list of candidate materials is reduced to those already applied successfully in club heads, titanium and beryllium copper prove equally best in class for driver diaphragm-spring faces. However, when the desire is to combine large head size with diaphragm-spring quality, material density must also be considered, which puts high-strength titanium alloy alone at the top of the performance list for driver head design (see Chapter 2 of Design for Manufacture and Assembly, Boothroyd, Dewhurst, and Knight 2011). It is interesting to note that the Callaway Corporation, apparently forgetting the reason for their pioneering breakthrough, later developed a carbon-fiber-reinforced driver head and face, whose diaphragm-spring quality is far inferior to that of high-strength titanium; it was unsuccessful in the marketplace.
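The diaphragm-spring quality measure can be illustrated with an Ashby-style material index: the elastic energy that can be stored per unit volume before yield scales as the square of yield strength divided by modulus, and dividing by density accounts for the large-head requirement. The following sketch uses representative handbook property values chosen purely for illustration (exact rankings depend on alloy grade and temper); these are not the figures used by Michal and Novak.

```python
# Ashby-style spring indices illustrating the materials comparison in the text.
# sigma_y (yield strength, MPa), E (Young's modulus, GPa), rho (density, g/cm^3)
# are representative handbook values, assumed for illustration only.
materials = {
    "Ti-6Al-4V (high-strength titanium)": dict(sigma_y=950.0, E=114.0, rho=4.43),
    "Beryllium copper (C17200, aged)":    dict(sigma_y=1100.0, E=128.0, rho=8.25),
    "17-4 PH stainless steel":            dict(sigma_y=1170.0, E=197.0, rho=7.78),
}

def spring_index(m):
    """Elastic energy storable per unit volume before yield, ~ sigma_y^2 / E."""
    return m["sigma_y"] ** 2 / (m["E"] * 1000.0)   # MPa^2 / MPa, i.e. MJ/m^3 scale

def spring_index_per_mass(m):
    """Density-weighted index sigma_y^2 / (E * rho): favors large, light heads."""
    return spring_index(m) / m["rho"]

for name, props in materials.items():
    print(f"{name:38s}  s^2/E = {spring_index(props):5.1f}"
          f"   s^2/(E*rho) = {spring_index_per_mass(props):5.2f}")
```

With these illustrative numbers, beryllium copper and titanium are broadly comparable on the volumetric index, but weighting by density puts titanium clearly on top, consistent with the argument above for large hollow heads.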
Much more recently, golfer performance has been considerably enhanced through Fredrik Tuxen's invention (TrackMan A/S, U.S. Patents 2007, 2008, 2009) of a radar system for monitoring ball flight, and his founding of the TrackMan™ Company in Denmark. TrackMan performs precise measurements of club head and ball velocities, launch angles, ball spin rates, and the tilt angles of the ball spin axis, in addition to monitoring the complete ball trajectory. It has proved to be a powerful training tool, allowing players to try different combinations of club loft angle, shaft stiffness, and attack angle, along with variations of their swing mechanics, to obtain optimum conditions for ball flight. This complemented the development of new multilayer balls designed to allow high launch angles in driving without the generation of excessive backspin. Thomas (2008) envisioned these golf ball developments and, in 1999, proposed an optimized overall distance rule. The intention was to impose a maximum overall distance for all balls, but using the particular optimum launch conditions for each manufacturer's ball. This proposal was never adopted, but a rule for overall distance under a standard set of ball flight conditions was adopted in 2004 by both of the governing bodies of golf, the R&A (Royal and Ancient Golf Club of St. Andrews) and the USGA, referred to with slight abbreviation going forward as R&A/USGA.
Tuxen has published a number of articles (2007–2010) in recent years containing comprehensive sets of data on ball launch conditions and ball flight, together with a number of empirical rules relating fundamental aspects of ball striking to ball flight. With Tuxen's permission, these have provided invaluable data for this work, in some cases providing validation of analytical predictions, in others allowing estimates to be made of model input parameters. Reference will be made to individual newsletters in later chapters.
For a given impact velocity with a given club head mass, the principal causes of lost distance when driving a golf ball are the energy loss in the golf ball itself and the twisting of the club head resulting from off-center hits. A golf ball is far from perfectly elastic. Even with the latest high-performance balls and titanium drivers, in impacts of approximately 100 miles per hour, 30 percent or more of the energy transferred to the ball as deformation is typically lost to internal friction in the ball's polymer structure rather than recovered as extra ball speed.
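One standard way to connect this quoted loss to the coefficient of restitution (CofR) discussed later in the chapter: the fraction of the deformation energy recovered in an impact is the square of the CofR, so the fraction lost is one minus that square. A minimal sketch, using the 0.83 value that appears in the R&A/USGA conformance rule:

```python
# Fraction of the impact (deformation) energy lost, expressed through the
# coefficient of restitution e: recovered fraction = e**2, lost = 1 - e**2.
# Illustrative sketch; e = 0.83 is the R&A/USGA conformance limit.

def energy_loss_fraction(e: float) -> float:
    """Fraction of the kinetic energy of deformation not returned."""
    return 1.0 - e * e

e = 0.83
print(f"CofR {e}: {energy_loss_fraction(e):.1%} of deformation energy lost")
# consistent with the 'roughly 30 percent or more' figure quoted above
```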
The original developers of golf equipment, contending with much less efficient balls than those used today, presumably discovered by trial and error that greater distance could be obtained by striking the ball with a softer club, which, by sharing the deformation of impact, would reduce the ball's deformation and consequently its energy loss. The best material for this purpose, with sufficient strength to sustain the impact loads, was found to be hardwood, shaped so that the face cuts across the growth rings. This ensures that the stiffness in the direction of impact is the lowest possible: in fact, less than 8 percent of the stiffness along the wood fibers, which is the loading direction in, for example, a baseball bat (Wood Handbook 1999). Compression tests carried out by the author, on specimens cut normal to the face of persimmon wood blanks from the Louisville Golf Company, gave a stiffness modulus in the direction of impact of only 90,000 pounds per square inch. This is considerably less than the modulus value of 130,000 pounds per square inch used by Michal and Novak (2001), on the basis of which they predicted that the maximum force from a 100 miles per hour impact with a wooden driver would be 3,300 pounds, even less than their predicted value of 3,500 pounds for a modern titanium driver with a diaphragm face. However, Michal and Novak used a “static” rather than “dynamic” solution, so only the relative magnitudes of the two values should be accepted.
The problem with the performance of hardwood drivers is that significant energy is lost in internal friction between the wood fibers as well as in friction inside the ball; with the modern ball, this may be a zero-sum tradeoff. Moreover, as discussed previously, the distribution of the approximately 0.44 pounds of mass throughout the bulk of the club head gives a very low moment of inertia compared to modern hollow titanium drivers. Following the recognition of the increased performance of the hollow titanium Callaway driver heads in the late 1990s, the R&A/USGA established a new equipment rule that the coefficient of restitution (CofR) resulting from a 109 miles per hour impact with a golf ball should not exceed 0.83. To enforce this rule, both clubs and golf balls had to be subjected to independent tests. Both tests use ball cannons equipped with ballistic screens to measure the rebound velocity of the ball after impact. For the ball test, a standard titanium alloy circular test plate was adopted.
The plate is 4 inches in diameter, with a 3-inch-diameter inner region 0.115 inches thick, forming a spring diaphragm, and with a thick outer flange to give a total weight of 0.44 pounds, equal to the typical driver head weight. The plate was made from the same titanium alloy, and designed to give the same impact performance, as the Callaway titanium drivers already on the market. Balls exceeding a CofR of 0.83 when fired at the plate at 109 miles per hour are deemed nonconforming. With this test in place, balls with the maximum 0.83 CofR value could then be fired at the sweet spot (the point directly in front of the center of mass) of driver heads to check that the CofR did not exceed 0.83. In addition to the CofR rule, the R&A/USGA now restrict the volume of drivers to 460 cubic centimeters and the moment of inertia component about the vertical axis (vertical MoI) to 5,700 gram-centimeters squared. Almost all manufacturers now supply their drivers with the maximum volume. However, attempts to increase the vertical MoI into the 5,000-plus gram-centimeters-squared range, with rectangular driver head profiles, did not meet with success in the market, and manufacturers now seem to have settled on vertical MoI values in the area of 4,600 gram-centimeters squared for 460-cubic-centimeter clubs.
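The sweet-spot CofR test maps directly onto the classical two-body restitution formula: a club head of mass M striking a stationary ball of mass m at speed v sends the ball away at v(1 + e)/(1 + m/M). A minimal sketch, assuming a representative 200-gram head (about 0.44 pounds) and the 45.93-gram (1.62-ounce) maximum ball weight; these masses are illustrative assumptions, not figures from the text:

```python
# Ball speed off the club face from the two-body restitution model:
# v_ball = v_club * (1 + e) / (1 + m/M) for an initially stationary ball.
# Head mass 200 g (~0.44 lb) and ball mass 45.93 g (1.62 oz maximum)
# are assumed representative values.

def ball_speed(v_club_mph: float, e: float = 0.83,
               head_g: float = 200.0, ball_g: float = 45.93) -> float:
    return v_club_mph * (1.0 + e) / (1.0 + ball_g / head_g)

v = ball_speed(109.0)       # the R&A/USGA test impact speed
smash = v / 109.0           # "smash factor" = ball speed / club head speed
print(f"ball speed {v:.1f} mph, smash factor {smash:.3f}")
```

The resulting smash factor of roughly 1.49 is the familiar ceiling quoted for conforming drivers struck on the sweet spot.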
While much R&A/USGA attention has been focused on the impact efficiency of clubs and balls, surprisingly nothing in the rules concerns the aerodynamic performance of golf balls, except the overall distance restriction. At the time of the introduction of the CofR rule in 2002, the R&A/USGA overall distance standard (ODS) required that a ball struck by a conforming (0.83 CofR) driver at 109 miles per hour should not exceed 297 yards in total distance, including bounce and roll, under carefully controlled standard conditions. In 2004, to reflect the increased driving distances of professional golfers, the test was extended to require that a total distance of 320 yards not be exceeded with an impact of 120 miles per hour. Currently, the average PGA Tour player drives the ball for a carry distance of 269 yards, with an additional average bounce-and-roll distance estimated in Chapter 3 to be 40 yards. This average overall distance of 309 yards is achieved at an average impact speed of 112 miles per hour, so it must be assumed that the longest hitters are routinely driving farther than the R&A/USGA limit. The loophole in the rules is that the ODS applies at a defined launch angle and spin rate of the ball. Manufacturers are therefore free to design balls that may exceed the ODS distance, but at optimum launch angles and spin rates different from those defined for the test. If Thomas's proposed Optimized ODS had been adopted, every new ball on Tour would have been wind-tunnel tested to determine its optimum launch conditions, which would then have been used to assess overall distance.


