Frames are mental structures which shape our worldviews. They’re largely unconscious, but are revealed by the language we use. For example: “time is money”. This isn’t just a figure of speech – we conceive of time as a commodity, and the frame is activated by common phrases: “don’t waste my time”, “spending time”, “borrowed time”, “running out of time”, “I’ve invested a lot of time in it”, etc.
This metaphorical conception of time isn’t universal – it doesn’t exist in all societies. Some cultures have no conception of “efficient use of time”. The “time is money” frame has certain negative consequences (stress, insecurity, short-termism, etc) – in addition to the positive things claimed for it by business managers and orthodox economists.
Repetition can embed frames in the brain, and frames define our “common sense”. Existing frames don’t change overnight. One thing you’ve probably noticed in the mass media is constant repetition – the same phrases and notions recur again and again. It hardly matters whether you agree or disagree with them – what matters is that certain frames are used, while others are excluded. This reinforces certain worldviews – physically, in the brain – by strengthening neural connections in readers and listeners.
Cognitive science tells us that when facts contradict a person’s worldview (their conceptual “framing” of various issues), the facts will tend to be ignored, overlooked or dismissed, and the frames kept. This leaves a partial, blinkered view of the “facts” – which in turn reinforces the existing worldview. In extreme cases (eg the ideological belief that “market forces” always produce the optimum outcome for humanity), the high-level beliefs are sustained by ignoring or denying a large portion of the available corroborated low-level evidence.
This way of “thinking” differs fundamentally from the classical view of “reason” as applied empirically (eg in scientific method) – in which factual evidence is allowed to challenge, refute and ultimately transform our beliefs about the world.
The lesson from this is that publicising the facts about an issue may not be sufficient to change people’s minds. And no political viewpoint has a monopoly on “objectivity”. Everyone tends to ignore or dismiss the facts that are inconvenient to their worldview, and everyone tends to find an abundance of “evidence” or “proof” that supports it. These processes occur because of the way our brains conceptualise with metaphors and frames – resulting in the creation of personal reality-tunnels, to which we become “attached” in a physical, neurological sense.
What can we do about this? We can attempt to become more aware of the process, and thereby make allowances for it – both in our own thinking, and in “reading” the messages we’re subjected to on a daily basis from the mass media.
Recommended reading: Don’t Think of an Elephant! (George Lakoff)