Category: Critical thinking
Motivated reasoning is one of the easiest traps for an engineer to fall into. This pleasantly oxymoronic term can be thought of as an extreme case of confirmation bias.
So what is it? I see motivated reasoning as the practice of letting emotion creep into the engineering process, usually through an emotion-based evaluation of data or calculations that leads to an emotion-based decision. It is a matter of putting more credence in your feelings than in the data at hand.
Now this doesn’t mean you always have to be Spock rather than Kirk. All decisions are, to some extent, emotional; the point is that you can’t dismiss engineering facts just because they don’t fit your emotional needs.
A case came up recently that illustrated this tendency perfectly. A report generated by the best of the field offices completely disregarded a previous report from another office. Why? Because if the older report were true, it would mean more work and expense. By disregarding the old data, they could avoid a lot of work and cost. The trouble was that there was nothing really wrong with the old data, certainly nothing that would support tossing out the whole report. To rationalize their position, they harped on a few small errors and inconsistencies. Their position became an emotion-driven one, shaped by external pressure to reduce cost and schedule impact, and the correct, data-driven position was overridden.
We corrected the situation, but it illustrates how easy it is, even for first-rate engineers, to fall into this trap. Experienced engineers can sometimes go with their gut (emotions) and succeed, but when data and calculations are available, go with the math every time. It is rather like flying in the clouds: trust your instruments, not the seat of your pants.
Tenets of Engineering Skepticism
A couple of my old posts talked about engineering skepticism (here). I’ve been thinking more about it and came up with the following four tenets of ES. These would be used first to test a claim, process, or machine before it could be considered useful to the engineering world.
Four tenets of Engineering Skepticism
1. The device, process, or method must be effective. It must be clear that there is a real effect without relying on advanced statistics. If the results are less than 25% better than chance (guessing), it fails the ES test and can be dismissed as useless to the engineering world.
2. It must be reliable. This means it should work as intended under real-life conditions and with any trained operator. If it relies on the weather, the aura of the user, or how the stars are aligned, then it fails the ES test and can be dismissed.
3. It must be repeatable. If it is used in the same way in the same environment multiple times, it should produce the same or very similar results, even with different operators. For instance, if five different dowsers go through an area and give five different results, the method fails this ES test and can be dismissed.
4. And it must be teachable. If it depends on some peculiar talent or accident of birth, then it is of no use to engineering. It must be possible to record the knowledge and pass it on to new generations of engineers.
Failing one or more of these tenets does not necessarily mean that the device or method is fake or false, but it does mean it is not dependable or rigorous enough to be useful to engineering. Until it meets all four, any engineer worth his salt will dismiss such devices and claims.
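The effectiveness and repeatability tenets lend themselves to a quick numeric sanity check. Here is a minimal Python sketch of that idea; the function names, the agreement tolerance, and the dowsing numbers are illustrative assumptions of mine, not any standard test:

```python
import statistics

def passes_effectiveness(success_rate, chance_rate):
    """Tenet 1: results must beat chance (guessing) by at least 25%."""
    return success_rate >= chance_rate * 1.25

def passes_repeatability(operator_results, tolerance=0.05):
    """Tenet 3: different operators, same conditions, should produce
    the same or very similar results (spread within a small tolerance)."""
    return max(operator_results) - min(operator_results) <= tolerance

# A dowsing-style device: five operators' hit rates on the same test area.
# Pure guessing on this (hypothetical) test would give a hit rate of 0.50.
operators = [0.48, 0.55, 0.41, 0.60, 0.52]
avg = statistics.mean(operators)  # 0.512

print(passes_effectiveness(avg, chance_rate=0.50))  # False: no real effect
print(passes_repeatability(operators))              # False: five different answers
```

A device that scatters its results across operators like this fails two tenets at once, which is exactly the dowsing example above.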
Post a comment if you think any of this makes sense, or doesn’t.
Tim Minchin is simply brilliant.
Admitting you’re wrong
A good engineer, like a good scientist, must at some point admit that he is wrong. This is very difficult to do, especially for mid- and late-career engineers who are a bit set in their ways. It is, I guess, a pride thing, and many older engineers have spent their whole careers building up that reputation and pride. But what they, and their management, don’t understand is that it is important, and right, to admit when you’re wrong, and that act should bolster an engineer’s reputation, not mar it.
It is those who won’t, or can’t, admit errors who are the bad engineers. These people ignore or spin data and calculations so as to fit their view rather than accepting that the data don’t support their belief. This is not engineering but rather wishful thinking. Or pride gone amok. One problem with this is that it can take years before the truth comes out. And sometimes it never does. This only contributes to the cognitive dissonance in the mind of such an engineer.
Granted, this path is difficult to avoid, and each step down it makes it harder to jump back to the right one. But it needs to be done, and I think it is a sign of character and good engineering when people correct course that way. Of course, in our current climate this step is just an invitation for blame, which is probably the main reason people stay on the wrong path.
Bad enough for the lone engineer, but real trouble comes when you have a whole organization that fosters this stance. We have such an organization where I work and the very idea that they are not ever wrong, and cannot ever be wrong, is ingrained in the history and management of this group. The members are smart, no doubt of that, and are highly educated engineers, but they can’t escape the corporate climate (even if they wanted to). This, in my opinion, makes them poor engineers in a poor organization.
And if you give such an organization power, what you get are arrogant and defensive engineers who use intimidation rather than rationality when dealing with people outside their organization. Because, you see, they now have to convince both you and themselves that they are always right. And that is a lot harder to do without the bluster. No matter what, this only leads to Bad Engineering.
Don’t fall into that same trap.
Suggested reading: Mistakes Were Made (but not by me)
Skeptic, as a word, has carried a lot of different definitions and connotations over the years, and most of them become pejorative in the eyes of non-skeptics, who seem to equate skeptic with cynic. The debate over the definition of the word, and over who deserves to use it for self-identification, is still going on today. From my perspective, there are two major camps: the scientific skeptics and the deniers who call themselves skeptics.
The former rely on scientifically gathered data on which to base their conclusions. The latter reach conclusions first and then go in search of data or anomalies to fit their beliefs. As you can imagine, I hold with the scientific skeptics and the proven scientific method.
But I think there is another category, or at least I want to create one: engineering skepticism. And no, it doesn’t lie halfway between the other two. It is close to SciSkep, but I don’t think it is quite an offshoot of it, at least not in approach. I think of it as the engineering approach to acceptance and rejection.
For example: as I wrote earlier about the ADE-651 bomb detector, SciSkep might develop double-blind tests, collect data, test the hypotheses behind the machine, and so on. EngSkep would point out the lack of a power source, the cited influence of the operator’s mood on the results, and the implausible operating ranges, and would reject the machine immediately. Not because of test data, but because of its clearly unreliable nature.
Science and Engineering bring home the goods, as Carl Sagan said. Woo doesn’t. I’ll go a little farther and say that science and engineering have to be right; they HAVE to bring home the goods. Science needs to be right eventually, through a slow process that self-corrects. Engineering has to be right, right now. It evolves, but it has to bring home the goods now, not later, and do so without mistake. Engineering products have to be reliable or they don’t sell, or they aren’t safe. And by reliable I mean that a product does what it is supposed to do 99.99 percent of the time (or better). So no decent engineer on a source selection committee would approve the ADE-651. He wouldn’t have to wait for Science to catch up to know it isn’t reliable enough to consider.
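To make that 99.99 percent bar concrete: it allows at most one failure in every ten thousand uses. A tiny sketch of the check (the function name and figures are my own, purely illustrative):

```python
def meets_reliability_bar(failures, trials, required=0.9999):
    """'Reliable' in the sense above: the product does its job at
    least 99.99% of the time. Zero trials counts as unproven."""
    return trials > 0 and (trials - failures) / trials >= required

# 99.99% permits at most 1 failure per 10,000 uses.
print(meets_reliability_bar(failures=1, trials=10_000))  # True
print(meets_reliability_bar(failures=2, trials=10_000))  # False
```

By that standard, a detector whose results depend on the operator's mood never even gets close to the bar.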
So am I saying that working well is enough for the EngSkeptic? Well, not really. Unfortunately, much of the Woo is in the realm of the body and mind and is very prone to subjective thinking. A person may think a homeopathic pill cured him; Science can show that it could not, and did not. EngSkep can’t say anything about it. On the other hand, I guess we could reject things such as prayer for an amputee: a leg has never grown back, so prayers asking for one to do so are demonstrably unreliable and not worth doing (mental comfort aside).
So is EngSkep a shortcut to normal SciSkep? Maybe. If the first question we ask of a suspected Woo topic is “Is it reliable?”, we might be able to save a whole lot of scientific-investigation effort.
Does Astrology produce reliable data? No. Toss it out. Do psychics produce reliable predictions? No. Don’t listen to them. Do Ufologists produce reliable evidence of UFOs? No. Ignore them. Do ghost hunters and their gadgets reliably produce evidence of ghosts? No. Reject their methods.
So okay, maybe a shortcut. And certainly not as rigorous as SciSkep, but do we really have the time to waste doing real Science on real Woo? Let’s use EngSkep as the first hurdle. Then, if the Woo passes, let Science take a crack at it.
I think I’ll have more to say about this soon. What do you think?