I've never understood the parasitic power-robbing figures that get quoted for automatic transmissions. They just don't make sense to me. Examples:
I've owned a 1.1-litre four-cylinder car that only put out 55 hp. Small car, manual box, and it could still reach a top speed of about 95 mph. If I installed a C6 auto, then according to the quoted figures the car couldn't even run because the parasitic loss would eat the whole 55 hp, yet that engine normally had enough power to carry four adults plus the weight of the car along at a fair lick.
I currently own a Walker ride-on mower. It's a complicated machine with a gearbox, hydraulic controls etc., powered by a 16 hp engine. Engaging the cutting blades saps some power, but it still runs fine, cuts the grass, gets up hills and so on. The blades can't be drawing more than 4 or 5 hp under full load, yet supposedly that same 16 hp couldn't turn even the most efficient auto box from the '60s?
It just doesn't add up somehow.
My thinking is that the peak losses must occur at some fairly high RPM, and that they're just another loss in the same way rolling resistance, wind resistance etc. are losses that increase with speed and stop you accelerating forever. I also suspect the losses are not linear with RPM but ramp up as the revs rise, e.g. a 5 hp loss at 2000 rpm becoming a 20 hp loss at 4000 rpm, and that different gearboxes would have different efficiencies depending on how they were operating. (See the sketch below.)
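To put that hunch into numbers: if the loss really did go from 5 hp at 2000 rpm to 20 hp at 4000 rpm, that's a loss scaling with the square of engine speed (double the revs, four times the loss). Here's a quick Python toy using exactly those guessed figures, just to show the shape of such a curve; none of these numbers are measured data:

```python
def parasitic_loss_hp(rpm, ref_rpm=2000.0, ref_loss_hp=5.0, exponent=2.0):
    """Power-law loss model anchored at a reference point.

    With exponent=2, a 5 hp loss at 2000 rpm becomes 20 hp at 4000 rpm,
    matching the guessed figures above. These are illustrative numbers,
    not measurements from any real gearbox.
    """
    return ref_loss_hp * (rpm / ref_rpm) ** exponent

for rpm in (1000, 2000, 3000, 4000, 5000):
    print(f"{rpm:5d} rpm -> {parasitic_loss_hp(rpm):5.1f} hp lost")
```

That prints roughly 1.2, 5.0, 11.2, 20.0 and 31.2 hp across 1000 to 5000 rpm. If something like that is right, a big quoted loss figure could be a peak-rpm number that says nothing about what the box costs you at cruising revs, which would square with both the little 1.1-litre car and the mower.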
Someone who has worked in this area would know the answer to all this, so: any engineers out there who can answer it definitively?