User:Woozle/2006-12-20 political theorizing
Latest revision as of 01:18, 9 August 2009
A new theory of political opinion seems to be coming together in my head. (Or maybe it's not new and I'm only finally getting around to understanding what others have been saying.)
If true, this theory would seem to explain a number of perplexing things:
- Why "we never learn from history"
- The way many people seem to hold bizarre opinions on certain issues
- Why those people, when cornered, always seem to eventually weasel out of rational argument
Fork 1
In pretty much any political era (though more so in some than others; the last 4 years, for example, have been a particularly rich vein), you will find people whose opinions Just Don't Make Sense, when you stop to think about it. Right now (though less frequently in the past few months), it's the people who equate "support for the war on Iraq/Terror" with "support for Our Troops", "support of Bush" with "support of America", and that sort of thing.
In earlier eras (though I understand that this is still a strong undercurrent in American society, and probably helped Bush get into power), there was a lot of noise made by people whose beliefs seemed to boil down to the idea that "my church is the only right church, and everyone else is going to Hell... including anyone else who believes exactly the same thing except for minor details of doctrine".
There were also people who seemed to believe that Jane Fonda was somehow working against the interests of US veterans when she campaigned against the Vietnam war.
I'm sure there are other examples; now that I've got a theory to match them up against, I'll probably start making a list.
Fork 2
The other fork of this theory is that I've kind of noticed that a certain mentality tends to be way too trusting of people with the right social credentials. Those credentials could be a combination of suit-and-tie plus important-sounding job, or the cloaks of church office, or even just the kind of commanding/peremptory language used by (say) lawyers.
At the extreme far side of this tendency, we have so-called Biblical "fundamentalists", whose basic creed in life seems to be "do what the authority figure tells you, and don't listen to anyone else" – motivated by a strong desire to avoid, at all costs, any form of considered thought.
Apex
And at the top, where the forks come together, we have the theory (which I started thinking about before the rest of this, and which makes a great deal of sense on its own) that most of the really stupid stuff that has happened in recent history is not "us" failing to learn from history, but manipulations by people who stand to gain from those stupidities.
Putting it all together
(This part isn't finished.)
And I guess the insight is that the "nonsensical POV" people's opinions suddenly make a lot more sense if you see their opinions as having been deliberately fed to them by the manipulator-people, who knowingly take advantage of the nPOV people's willingness to be led by pressing all the right buttons and waving all the right social credentials at them.
And all along, I think most of us have been assuming that we needed to reach out to those people (the nPOVers), to try to engage them in dialogue and understand where they are coming from and what their reasoning is for the stupid stuff they say.
If this theory is correct, it explains why that never seems to work: they didn't form those opinions themselves; they're just going on perceived authority. They don't really know why they believe what they've been convinced to believe.
In short... their opinions don't need to be taken seriously.
Take a particular nonsensical POV. If, every time you manage to corner someone who actually holds that POV and try to get them to defend it, they end up wriggling away -- usually by some form of subject change, though often by other rhetorical devices designed to end rational discussion -- it would seem reasonable to conclude that you're not dealing with someone who has followed a line of reasoning to end up with that opinion.
When an opinion can be pretty quickly shown to be nonsense, and when you can never find anyone to defend it to the point where you can at least see a rational argument for it, you don't need to worry that there might be some kind of overlooked truth to it. A more sensible course would be to treat the people who hold it more or less as obstacles to be overcome, rather than as rational participants in the debate... because when invited to be rational participants, they inevitably bow out.
Corollary: We (those of us who do relentlessly examine our own beliefs) don't give ourselves nearly enough credit for our ability to be objective (specifically, to see and acknowledge any grain of truth in an opposing argument even if we don't agree with the conclusion).
Another note: this theory needs to distinguish itself from the overly simplistic "Some people are just sheep!" model. I think the difference here is that it would not be fair to describe the people being manipulated in this model as sheep; they will fight passionately (and perhaps even somewhat creatively) for what they believe in. What they won't do, passionately or otherwise, is work towards a better understanding of truth/reality, by which those beliefs might be shaped. They are more like psychologically-programmed troops, whose lives and spirits are being heedlessly wasted towards the unspoken goals of others. As Orwell put it: "Ignorance is Strength."
I'm not sure what I think of this theory yet, but it sure puts a lot of things in a new light.