This has been entertaining... in a "Gogglebox" sort of way.
I'm a semi-retired technical marketing person; I used to write technical specs for datasheets, etc. The biggest problem BY FAR is not 'bigging-up' numbers, but finding meaningful comparisons. And I long ago learned that it was far better (in a B2B market) to be conservative with the numbers and let people say "... but when I tried it, I got far better than that!"
So some fundamentals:
Torque is not the be-all-and-end-all of drill usefulness. It's way more complex than that.
What do you want your drill to do?
- Drive in screws that need a lot of torque (e.g. the 8" woodscrews mentioned above, of unknown diameter and style, into unknown materials)?
- Drill forever into concrete?
- Drive in hundreds of "easy" screws on one battery?
In each case above the design of the optimal drill for the task will be different:
Firstly, pick a battery system: Longevity of charge is determined by the energy the battery can store and release to do work, but torque is determined by the instantaneous power the battery can deliver. These are not the same thing - a simple matter of physics definitions. You can also design a battery that stores a lot of energy but isn't very robust, and gets knackered quickly by repeated charge-discharge cycles.
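To make the energy/power distinction concrete, here's a back-of-envelope sketch; every figure in it is invented for illustration, not taken from any real pack:

```python
# Back-of-envelope: stored energy (runtime) vs instantaneous power (grunt).
# All figures below are invented for illustration, not real battery specs.

nominal_voltage_v = 10.8        # three Li-ion cells in series, 3.6 V nominal each
capacity_ah = 2.0               # how much charge the pack holds
max_discharge_current_a = 20.0  # how hard the pack can be pushed at any instant

# Energy in watt-hours governs how long the pack keeps doing work:
energy_wh = nominal_voltage_v * capacity_ah                  # ~21.6 Wh

# Power in watts governs the grunt available right now, i.e. torque:
peak_power_w = nominal_voltage_v * max_discharge_current_a   # ~216 W

print(f"Stored energy: {energy_wh:.1f} Wh (runtime)")
print(f"Peak power:    {peak_power_w:.0f} W  (torque)")
```

Two packs with identical watt-hour ratings can have very different peak power, and the high-energy one may well be the pack that wears out sooner.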
We've had those knocking about for years. In the days of NiCds I had 'industrial' cells and batteries, which were far more robust than consumer ones with nominally much higher capacity. I'm sure the consumer ones did initially store more energy, but they didn't last, as their construction was cheap and nasty.
There's one really good reason why all the big-name 10.8V Lithium-ion battery systems look really similar: it is HARD to do this well, and only one manufacturer has had a really good system (which I'm guessing they OEM'd to Bosch, Milwaukee, Makita, Metabo, DW etc.). These batteries look identical, with a clip-on outer shell that matches the design of the drill plastics. "Go figure," as the Americans say.
So tool system designers have to decide what they're going for battery-wise.
Now your tool will need a motor: Do you want a slow RPM, or a fast one? Importantly, this is about the motor itself - it has almost nothing to do with how fast the chuck turns. And what torque should the motor deliver at its nominal voltage, and how should it behave under increasing load? How hot will you let it get in use (significant, as heat = wasted energy and poorer performance)? How heavy and bulky are you allowed to make it, and how much can it cost?
You're not going to find one motor that does exactly what you need, so you have to choose. But you still have some fudge-factor as...
... you also have to design a gearbox. Almost all electric motors (apart from some really weird ones that don't rotate continuously) are more efficient at specific RPMs, usually faster rather than slower. Any battery tool has to use an efficient motor, so that probably means a reduction gearbox. Gearboxes soak up power themselves (this is one reason why transmissions in cars need oil coolers).
So you want a gearbox that gives you... what exactly? The most efficient power transfer? The best RPM for a given job (which might not be the most efficient power transfer)? The lightest weight/smallest size (which might not last very long!)? Gizmos like multiple ratios (speeds) and hammer action or impact? If you build it from a high-performance steel, with beautifully ground and heat-treated helical teeth and expensive bearings, it will be very efficient, but probably unaffordable and way too heavy. If you mould the components from glass-filled plastic of some sort, it will be lightweight and relatively cheap, but it may only last until Boxing Day (or Twelfth Night).
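As a rough illustration of why motor speed and chuck speed are different animals, here's some toy reduction-gearbox arithmetic; the motor figures, ratio and efficiency are all invented, and real gearboxes vary enormously:

```python
# Toy reduction-gearbox arithmetic. Every figure here is invented.
motor_rpm = 18000          # small DC motors like to spin fast to be efficient
motor_torque_nm = 0.05     # ...but on their own they deliver very little torque
gear_ratio = 40            # a 40:1 reduction
gearbox_efficiency = 0.85  # some power is always lost as heat in the gears

chuck_rpm = motor_rpm / gear_ratio                                   # 450 rpm
chuck_torque_nm = motor_torque_nm * gear_ratio * gearbox_efficiency  # ~1.7 Nm

print(f"Chuck speed:  {chuck_rpm:.0f} rpm")
print(f"Chuck torque: {chuck_torque_nm:.2f} Nm")
```

Same motor, different ratio, and you've got a very different tool - which is part of why a single torque figure tells you so little.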
My point is simple: given how many conflicting design imperatives there are, it's hard to see how you can make any generalized comparison between battery powered tools. Every tool designer has to plant a flag in the sand for each parameter, and after it's designed and production-worthy prototypes exist, it's still got to be sold effectively to the right target market. To be a bit silly, obviously you wouldn't attempt to use a 4kg battery-powered Hilti SDS to drive woodscrews, but it has huge torque and goes round fairly slowly, so on paper it ought to be dead handy.
Measuring them: You could give each model the same, standardised task, and then you could say "for this specific task, model X is best." Good, sensible idea. But even that is really fraught:
We know that cheap Lithium-ion cells die/age faster than good ones as they are repeatedly charged/discharged. Good ones aren't cheap. The official ones for my Bosch 10.8V system (and my digital camera, incidentally) vary from 40 to 70 quid (approx).
So you might build a battery system, using cheap cells and charger/power-management chips, that performs really well when it's new but dies 3x or 4x quicker than one built with well-matched, high-performance cells from a 6-sigma* production process, teamed with a good charge-managing charger and discharge-protection circuitry in the drill.
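To put that in crude pounds-per-charge terms, here's a toy comparison; both the prices and the cycle counts are numbers I've plucked out of the air to show the shape of the sum, not real pack data:

```python
# Invented figures: cost per useful charge cycle for two hypothetical packs.
packs = {
    "cheap pack":   {"price_gbp": 25, "useful_cycles": 150},
    "quality pack": {"price_gbp": 65, "useful_cycles": 600},
}

for name, pack in packs.items():
    pence_per_cycle = 100 * pack["price_gbp"] / pack["useful_cycles"]
    print(f"{name}: {pence_per_cycle:.1f}p per charge cycle")
```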
So do you care most about how well a brand new drill does a task, or how well it's still going after three months on-site, five days a week, in the winter?
Finally, manufacturers aren't always sensible with numbers: Milwaukee used to describe their hand drills using a 10.8V battery system as "12V" drills (or at least Toolstation did, selling them). Technically, a 10.8V battery, in perfect condition, fully charged, probably does measure 12V across its terminals, under no load. The moment you start taking energy from the battery, its output voltage will drop. So you CAN measure 12V - not a word of a lie. But is it meaningful? You decide.
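For anyone wondering where both numbers come from, the cell arithmetic looks roughly like this; the per-cell voltages are standard Li-ion figures, but the load current and internal resistance are invented just to show the sag:

```python
# Why a "10.8 V" pack can honestly be badged "12 V" - and why neither number
# means much once you're actually drilling. Load figures are invented.
cells_in_series = 3
nominal_per_cell_v = 3.6      # standard Li-ion nominal voltage
full_charge_per_cell_v = 4.2  # standard Li-ion fully charged, no load

nominal_pack_v = cells_in_series * nominal_per_cell_v          # 10.8 V
off_the_charger_v = cells_in_series * full_charge_per_cell_v   # 12.6 V

# Once current flows, the pack's internal resistance drops some of that voltage:
load_current_a = 15.0
internal_resistance_ohm = 0.08  # varies hugely with cell quality and age
under_load_v = off_the_charger_v - load_current_a * internal_resistance_ohm

print(f"Nominal rating:    {nominal_pack_v:.1f} V")
print(f"Fresh off charger: {off_the_charger_v:.1f} V")
print(f"Working hard:      {under_load_v:.1f} V")
```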
On which drill has the most torque, I have a one-word question: "when?"
Do you mean:
- when it's in the factory's QC lab?
- when it's brand new and fully charged?
- when it's cold?
- when it's hot?
- when it's been used hard for six months?
One thing: I'll bet most of the manufacturers know the answers to all of these!
When a new generation of products was launched in my old industry, the race was on to buy competitors' models and get them into our test labs. We'd check performance (loads of different parameters), put them in environmental chambers (to speed up the wearing-out/ageing process), and finally strip them down completely to get a parts count (reliability) and assess the quality of the design and components used.
We knew exactly how our products compared to the competition - good and bad points, both. That didn't mean we told the customers though. Sometimes we did change stuff, to address specific competitive weaknesses. Sometimes we left well alone.
We also analysed product returns, in great detail. We knew the failure modes and the probability of each failure (based on hard evidence). This also fed back into design improvements. I'm morally certain our competitors did exactly the same, depending on their financial resources.
Bear in mind though that, as a tool manufacturer, you probably don't want a perfectly reliable product - you want loyal, repeat customers. They need to feel they got good value-for-money, and that they wore their tool out over a good lifetime, rather than it having a manufacture-limited life. So as a company, you don't want your products to go on forever - just for long enough to suit the customer's purchasing cycle.
Given all that, what exactly is the definition of a "good cheap impact driver" per the subject of the thread?
Frankly, I don't think there can be one. It's far too complex a problem for a simple answer.
E.
*There's an argument that 6-sigma reduces overall costs, which I appreciate and have seen in practice. Not going there at this time though.