Some people seem to think that if you have a problem or an issue, all you need to do is collect enough information about it, and the data will tell you the answer.
Robert McNamara has provided a stark counter-example. As well as being Secretary of Defense during the Cuban missile crisis and president of the Ford Motor Company, McNamara was a statistical analyst for the US Air Force during WW2.
In the story he told, the Americans did a huge analysis of data from repair reports on B-17 bombers. Because bomber losses were so high, the plan was to add armour to the areas of the aircraft most commonly damaged. You can’t armour the whole plane because of weight, so it was incredibly important that the armour went in the right place.
As McNamara told the story, one of the men on the team, a Jewish mathematician named Abraham Wald (who had been driven out of Austria by Nazi anti-semitism), discussed the project with an armourer who had been shipped back from overseas to help the design team with the armour kits.
The armourer told Wald the project was all wrong; he’d seen planes fly home with damage in all the areas they were armouring. Then Wald had a eureka moment. He drew a diagram of the bomber and ruled out every area where a plane had come home with a bullet hole in that part. The diagram was stark: no plane had ever come home with damage at certain angles of the cockpit (where a bullet would kill both pilots) or at the base of the vertical stabiliser.
Wald had realised that the planes that had been shot in these bullet-free zones never made it home to be accounted for. They changed the armour, and crew survival rates shot up.
McNamara told the story to make a point about data. He said that had they simply followed the data, it would have led them to armour the places where the plane could survive a hit and get home, not the places that would always bring the aircraft down if they were shattered.
So you need to collect the right information, because collecting the wrong information can mislead you into a wrong conclusion. In the case above, it was the exact opposite of the right one.
If you find the information on this blog useful, you might like to consider supporting us.