Delivering the final presentation at the 2016 AETC were Dave Simon of Ford Performance, left, and Jamie McNaughton of Roush Yates Engines.
Even though a NASCAR engine has little relevance to today’s production engines, Ford still applies lessons learned through its stock car program to its passenger vehicles.
“One of the first big things that came out of NASCAR in terms of tech transfer back to road cars was the analytical model,” recalls Dave Simon, Ford Performance Motorsport engine supervisor, who, along with Jamie McNaughton of Roush Yates Engines, gave the final presentation at the 2016 Advanced Engineering Technology Conference (AETC). “We were taking models built for production engines that ran to 6,000 rpm and applying them to a 9,000 rpm race engine making three times the power.”
The 3.6-liter EcoBoost twin-turbo V6 developed by Ford Performance for the GT racing program.
© 2016, Nigel Kinrade
The models didn’t work well at first, so the racing engineers changed the methodology and developed more accurate performance formulas to suit their needs.
“That methodology for NASCAR is still being used at Ford Motor Company to develop its high performance engines,” says Simon, adding, “We have a much more streamlined path to transfer technology between race and road.”
While racing gives back to the production side, the exchange runs both ways: the road car engineering teams also help solve problems on the racing side. During development of the 3.6-liter EcoBoost V6 GT race engine that won Le Mans in 2016, Simon’s team conducted extensive combustion analysis.
Slide demonstrating sophisticated combustion analysis by Ford Performance engineers.
“We went to the production side and asked to use a model for cold-start emissions on a race engine,” says Simon. “They said they don’t do high-load, high-temperature but will try. Before we got on the dyno we had a pretty good idea of what our injector parameters needed to be.”
Production Ford GT road car undergoing assembly.
The technology transfer continued throughout the program. Simon’s crew had already developed a 3.5-liter twin-turbo V6 for the Daytona Prototypes (DP) a few years earlier, an engine covered extensively by EngineLabs. When work began on a revised version for the new GT race program, Simon’s engineers were only steps away from the factory powertrain engineers who were developing a second-generation EcoBoost engine for the production GT, which sells for around $400,000.
“In 2013 with the DP program, one of the first things we paid a lot of attention to was head sealing,” says Simon. “We did a lot of modeling and testing, then carried that knowledge over to the GT race program. We also sat down with the GT road car guys and took them through everything we did, and they ended up making some changes to their head sealing methodology.”
Using the driver simulator to help engineers determine operating range for the engine and other race parameters for the GT engine program.
The GT race engine program was developed under some of the least favorable conditions possible for engineers. The first priority for any new engine is to define the operating range, yet that decision depended on staying within rules that had not been finalized, especially with regard to the FIA’s strategy for balancing power between naturally aspirated and turbocharged engines.
“We had a fluid situation for the rules,” says Simon. “Also, the car was still being designed.”
GT engine assembly at Roush Yates Engines.
© 2016, Nigel Kinrade
The latter meant there were packaging unknowns that would affect header design, turbo location, cooling and induction. There were additional complications, such as the need to homologate the gear sets, and the Ford team had no access to Le Mans for testing or previous data from the track. So the team turned to Ford’s high-powered driving simulator. By entering variables for engine power and rpm, throttle position, gear selection, vehicle weight, aerodynamics and other key performance factors, engineers could narrow down the operating range without having a car.
“Once we get an operating range, we can start doing 1D performance models,” adds Simon.
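As a back-of-the-envelope illustration of how gearing and track speeds bracket an engine’s operating range — a toy model, not Ford’s simulator, with invented gear ratios, tire size, final drive and segment speeds — the relationship looks like this:

```python
# Toy sketch of narrowing an engine operating range from vehicle
# parameters. All numbers are invented for illustration, not Ford data.

TIRE_CIRCUMFERENCE_M = 2.0                      # assumed rolling circumference, meters
FINAL_DRIVE = 3.7                               # assumed final-drive ratio
GEAR_RATIOS = [2.9, 2.1, 1.6, 1.3, 1.1, 0.95]   # hypothetical six-speed

# Hypothetical corner/straight speeds (km/h) and the gear used there,
# the kind of data a driver-in-the-loop simulator might log.
SEGMENTS = [(95, 2), (160, 3), (210, 4), (265, 5), (310, 6)]

def engine_rpm(speed_kmh: float, gear: int) -> float:
    """Engine rpm from road speed, gear ratio and final drive."""
    wheel_rpm = (speed_kmh * 1000 / 60) / TIRE_CIRCUMFERENCE_M
    return wheel_rpm * GEAR_RATIOS[gear - 1] * FINAL_DRIVE

rpms = [engine_rpm(v, g) for v, g in SEGMENTS]
print(f"operating range ≈ {min(rpms):.0f}-{max(rpms):.0f} rpm")
```

With these made-up inputs the span works out to roughly 6,200–9,100 rpm — the kind of envelope the real simulator would hand off to the performance modelers.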
From there engineers run numerous simulations to test intake runner lengths, valve timing and more. Further computer exercises follow as the components are designed in 3D and undergo FEA and CFD analysis where needed. Then come the final CAD renderings and prototyping of parts, followed by engine assembly and dyno testing — including durability testing in which nine engines totaled 32,149 miles of race simulation between Ford dyno cells in Michigan and the Roush Yates facility in North Carolina.
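The runner-length sweeps themselves happen inside full 1D gas-dynamics models, but the acoustics they exploit can be sketched with a textbook quarter-wave approximation. The speed of sound and harmonic number below are generic assumptions, not Ford figures:

```python
# Quarter-wave intake-runner tuning estimate -- a textbook first
# approximation, far cruder than a full 1D performance model.

SPEED_OF_SOUND = 340.0  # m/s in warm intake air (assumed)

def tuned_runner_length(rpm: float, harmonic: int = 4) -> float:
    """Runner length (m) whose quarter-wave resonance lands on the
    chosen harmonic of the intake event frequency.

    A four-stroke cylinder draws once every two revolutions, so intake
    events occur at rpm/120 Hz; an open-closed pipe resonates at c/(4L).
    """
    intake_event_hz = rpm / 120.0
    return SPEED_OF_SOUND / (4.0 * harmonic * intake_event_hz)

for rpm in (6000, 7000, 8000):
    print(rpm, f"{tuned_runner_length(rpm) * 100:.1f} cm")
```

The inverse relationship — higher target rpm, shorter tuned runner — is why runner length is one of the first geometry sweeps in any 1D model.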
Ford Performance relied heavily on computer modeling and simulation to develop the GT engines.
“With development testing we have to produce quality data from which decisions can be made,” says McNaughton. “There’s always an opportunity to improve somewhere in the engine. You need to find it and make sure you have good data to support a change.”
As an example of the precision testing developed over the years at Roush Yates, dyno repeatability has been narrowed to less than one horsepower on an 800-horsepower NASCAR engine.
“That’s less than a tenth of one percent,” says McNaughton, adding that Roush Yates builds about 800 engines per year to supply more than 30 different race teams. “And every one is dyno tested.”
Ford GT winning at Le Mans.
The GT engine program was quite successful. Running out of the Chip Ganassi stable, two cars competed on the IMSA circuit and two cars were built for FIA WEC competition. All four cars ran at Le Mans, finishing 1-3-4-9 in their class. The victory came on the 50th anniversary of Ford embarrassing Ferrari and sweeping Le Mans in 1966 with the GT40. (see below)