From Russian troll mills spewing out fake news to old school CIA trickery in Cuba, the world is awash in propaganda...
“If you give a man the incorrect information for seven years, he may believe the incorrect information on the first day of the eighth year when it is necessary, from your point of view, that he should do so. Your first job is to build the credibility and authenticity of your propaganda, and persuade the enemy to trust you although you are his enemy." (A Psychological Warfare Casebook, Johns Hopkins University Press, 1958, p. 38)
Most people associate propaganda with advertising, with partisan opinion heard on talk shows, or with a zealous Sunday morning radio preacher. Indeed, all of these are forms of propaganda, but for the most part they are the least harmful kind because the audience recognizes them as such.
The advertiser, for example, clearly wants to sell something, and is trying to keep a particular product or service on the minds of the audience or to make it more appealing to potential consumers.
Political commentary is nearly always recognized as such, and while it is intended to persuade its audience, it is far more useful as a means to inform or inspire those already in agreement with the speaker.
And audiences likewise recognize that religious programming is intended as much to motivate followers to take a particular course of action (such as donating money) as it is to change the spiritual orientation of non-believers. Thus, they are convinced to embrace the ideas of the speakers or to follow their instructions only if they are already inclined to do so.
But there are other types of communication that are far more intrusive—precisely because audiences tend not to recognize them as propaganda. One example might be false or incomplete news reporting, presented as truth or objective fact.
Reports that war has broken out nearby or that a highly contagious and deadly disease is spreading among the local population would certainly produce a more immediate reaction among large numbers of people than would a commercial for a “new and better" laundry soap or a preacher's plea for money to keep himself on the air.
Another way in which propaganda can turn around an unwilling audience is through the process of repetition. At the end of World War II, for example, the people of the United States were not inclined to worry very much about an invasion by the Soviet Union. After all, the Russians had been America's allies during the war.
But as the country launched the most massive arms build-up in the history of the world, the Soviet “threat" was stressed again and again—by government operatives and military leaders, who were soon joined by vast numbers of private organizations, political commentators, intellectuals, entertainers, and, of course, the news media.
Though the messages may have differed from one another—and probably even more so because they did—the sheer volume of these warnings and the diversity of the sources involved served to confirm in people's minds the reality of the threat.
Slogans like “the iron curtain" helped audiences to visualize the “danger."
And by the 1950s, bomb shelters and air raid drills were added to the psychological arsenal—orchestrated not so much to protect the country as to bring about active participation and thus to raise the level of hysteria.
The resulting climate of fear justified rapid expansion of military research and arms stockpiling, as well as active combat in far-away places like Korea. Indeed, it was not until people actually saw the brutality of battle on their television screens during the Vietnam conflict that the notion of a “defensive" war on the far side of the globe began to be questioned.
So profound, in fact, was the impact of propaganda in the anti-communist era that even after the collapse of the USSR, a large part of the population still wants to believe that America “survived" a great crisis.
Indeed, as can be seen from the cold war generally—and from such incidents as the Cuban missile crisis—intensive, long-term propaganda tends to be self-fulfilling. Like the arms race that accompanied it, the anti-Soviet mania helped hostilities to flourish and multiply.
And while the propaganda of the anti-communist era was designed to facilitate the development of a global US military presence, other types of propaganda are directed more toward social behavior or group loyalties. This was the case in later years of the cold war, when the ideological battleground shifted from Europe to the developing or “non-aligned" world.
Harry Rositzke, a retired chief at the CIA, described the situation in a book called The CIA's Secret Operations: Espionage, Counterespionage, and Covert Action: “During the fifties these covertly sponsored activities sounded many of the themes that permeated American official and unofficial propaganda.
Politics was reduced to a simple black-and-white formula of East or West, slavery or freedom…
In the late fifties, and during the sixties, as the American propaganda effort shifted to the third world, this simple general line had to be tempered for the new non capitalist audiences…
Covert propaganda operations in the third world were, in effect, a fight for the media… Foreign editors and columnists were recruited, newspapers and magazines subsidized, press services supported.
Propagandists ranged from paid agents to friendly collaborators, from liberal and socialist anti-Communists to simple right-wingers. Facts, themes, editorial outlines, model essays were sent out to third world stations to be reworked for local consumption. Hot stories were published in friendly outlets and replayed around the globe…"
The enormous cost of a large-scale foreign propaganda offensive—establishing contacts, recruiting agents, underwriting news operations, establishing front groups, laundering funds, developing messages and themes, concealing the reality of foreign involvement, and at the same time making certain that the “proper ideas" were aired conspicuously in a style appropriate to the local peoples—can only be justified on the grounds that certain attitudes could be planted which otherwise would never have been favored by the targets. In other words, the audience is lulled into believing something—or doing something or supporting something—that otherwise would have been rejected as being against group self-interest.
The fact that the audience is carefully and systematically led to a particular set of beliefs is especially dangerous because the source of the ideology—and the motives of the sponsor—are not known to the recipients of the messages. In fact, the use of local collaborators, clandestine financing of indigenous news operations, and the like only confirms that the propaganda has to be falsely attributed in order to be credible.
The message, in other words, is made believable by the fact that it appears to come from within the target population itself. This is what is known as “covert" or “subversive" propaganda and “black operations." And it is generally acknowledged that much of what is conveyed through such campaigns consists of false information.
As Rositzke notes in his memoir, “Black operations… are designed to be attributed to the other side and must be carried out by a secret agency in order to hide the actual source of the propaganda. A black radio purportedly broadcasting from Central Asia or a forged document purportedly coming out of the classified files of a Soviet embassy requires expertise, secret funds, and anonymous participants."
Propaganda of this nature, especially if carried out over a long period of time and with the intent to achieve specific social or political changes, is usually part of a larger strategy called “political warfare"—and is almost sure to be accompanied by diplomatic pressures against national leaders, economic actions (e.g., the granting or withholding of foreign economic or military aid), cultural intervention, and surveillance. As such, it can have a profound or even devastating impact on the target peoples.
Skillful propaganda is capable also of manipulating its audience at the emotional level. Psychological studies done in the United States two decades ago demonstrated the disastrous impact of widespread racism on children of African descent. Black children in one test all believed a doll with light skin to be more desirable than one with darker skin—a measure of the “self-hatred" instilled by social attitudes so prevalent as to be taken for granted.
In much the same way, Protestant missionaries from the U.S. have long promoted various forms of “biblical capitalism" which instill in followers the belief that the “good" are rewarded by God with material “blessings," and that poverty confirms the moral inadequacy of an individual, a group, or a class of people.
In fact, the practice of “church trading" in Liberia became the topic of media coverage. At the time, numerous minor Protestant sects and “biblical" institutes were actively trying to attract “affiliates" in Liberia because they knew that a mission overseas would increase financial contributions at home.
So Liberian congregations were offered such incentives as a new roof for a church building or a bus in exchange for adopting the name and doctrine of a competing American religious organization (nearly always white). And when promises went unfulfilled, as was often the case, the Liberian sects would be forced to turn to other sponsors who would, once the new relationship was cemented, dispatch instructors to indoctrinate them in their new-found “theology."
This not only created confusion and obscured the religious identity of the subjects, but, more importantly, led the Liberians to accept without reservation their absolute dependence on the sponsoring churches and to affirm their own collective inferiority. It is hardly surprising, given this history of “spiritual abuse," that charges have been repeatedly made of CIA backing for proselytizing among Catholic, Islamic, and traditional societies.
As the case of the American “missions" in Liberia makes clear, money is usually a critical factor in an effective propaganda drive. The vast difference in wealth between the northern and southern hemispheres means, for instance, that western powers can not only gain access to agents and collaborators for propaganda efforts, but can also penetrate indigenous institutions and even establish new ones with minimal risk of detection by the public at large.
They can disseminate literature, textbooks, pamphlets, cultural messages, and other ideological materials in quantities that far exceed what local markets could ever support.
Money, funneled through channels, can buy off radio and television programmers, supply packaged propaganda programs or special consultants, present “educational" seminars and conferences, offer such financial inducements as prizes and awards, and upgrade studios and broadcast facilities for reliable friends. The sheer volume of the operation guarantees that indigenous opinion cannot compete.
Rich nations can also pressure governments—under the threat of withholding aid or credit, for example—to formally “invite" them to participate in the development of public “information" or “education" campaigns.
Moreover, when conditions are favorable, wealthy donors of “technical assistance" projects can conduct highly sophisticated research activities that enable them to thoroughly evaluate the sociological climate of target countries, to pretest propaganda messages on small groups, to measure changes in attitudes over the course of time, and to intimidate opponents, suppress dissent, and censor the dissemination of competing ideas.
Deception as Science
Social psychology textbooks list several ways in which audiences can be deceived by propaganda. First of all, audiences are more likely to accept an idea if they believe it was heard inadvertently; in other words, there is a natural tendency to resist a message that is presented in an assertive way, while there will be far less negative reaction if the audience hears the same theme in a context that is relatively “matter-of-fact."
Audiences are also more likely to actually change their opinions if they receive a message from a variety of sources that mutually reinforce one another. Similarly, people tend to approve of a statement made by someone who is in some way similar to them, an expert on the topic under discussion, or one who begins by expressing an opinion with which the listener (or viewer or reader) strongly agrees.
Under some circumstances, propaganda messages can be made more potent by incorporating opposing arguments in a way that tends to discredit them, while at the same time giving the audience the impression that it is hearing both sides of the debate.
In large operations, propagandists often stimulate changes in attitudes by generating a “band-wagon effect"—creating the false impression that a particular set of beliefs is more widely accepted than it really is.
And where a specific behavioral change is the intended goal of a communications campaign, it is extremely useful to get members of the target group either to express the idea publicly (thereby committing themselves to it) or to engage in the desired conduct in some way short of compulsion (so that they assume “ownership" of the idea). In either case, the tendency is to continue to defend the opinion or action and in so doing to internalize the propaganda.
There is no question that propaganda which discreetly and consistently applies these principles can produce profound and far-reaching changes in the societies against which it is directed. The reasons are relatively simple. Individuals are part of groups. They share customs and common values with other members of the groups to which they belong.
If a person strongly identifies with the Islamic faith, for example, that person's reaction to certain things—the consumption of alcohol or pork, perhaps—will be shaped by religious tenets, even though it may express itself as contempt for “drunkenness" or revulsion toward an “unclean food." However, a concerted campaign to “revise" or subvert Islamic influences could, over time, start in motion a slow process of subduing this emotional response.
Simply arguing that drink should be tolerated by Muslims is likely to do nothing more than arouse resistance and provoke countering arguments from those who know better.
But it might also be a way to “open up the issue" for further debate. A drawn-out, well-publicized controversy about the “benefits" of alcohol consumption, even if it changed very few minds over a few years, would nonetheless create an atmosphere of ambivalence; the certainty with which drink was condemned in previous times would have been undermined, and much of the negative response the debate inspired at the beginning would be gone.
The next step in this case might be for the propagandist to actually induce certain members of the community (or agent-agitators posing as audience peers) to openly consume alcohol as affirmation of the “new" ideas embraced by a more “modern" or more “realistic" Islam.
The endorsements of a few paid collaborators would likewise be useful. All of this would be made known to the public by means of aggressively-distributed news releases, video clips, and pre-fabricated features to friends and hirelings in the local news media.
This phase of the operation gives the propagandist's suggestions what psychologists call “false authority." In other words, the impression is made that persons highly qualified to endorse such ideas are the source of the information. These same activities further offer the propagandist a chance to eliminate some negative stereotypes and to lower inhibitions against the desired behavior.
After being exposed on a regular basis to real examples of such conduct, members of the target group would be far less likely to issue strong condemnations because doing so would be perceived as a personal attack on one's peers (or even leaders).
Finally, the instigator of the communications campaign might attempt to undermine the most stubborn resistance to change through a mass media offensive—television spots, news articles, cartoons, billboards, rigged debates, T-shirts, the cinema, and so forth—that dishonors opponents by linking them to unpopular causes, or by holding them up as objects of ridicule.
Even if most adults still cling to their original beliefs, the younger generations would not have the benefit of the uncorrupted culture their elders knew. Thus a chain of authority is weakened and a tradition vanishes.
While it seems improbable that westerners would benefit by promoting the consumption of alcohol or pork among Muslims, something of this sort might be a very effective way to de-legitimize Islamic traditions in general—and thus to lead followers away from religious orthodoxy so they could be more easily integrated into a westernized world culture.
Regardless of whether propaganda is used to change human attitudes and behavior or to simply get people to act on false information, group identity is the key that propaganda seeks to exploit.
By definition, group membership imposes certain standards of behavior on the individual. To put it another way, the individual cannot help but act in a manner that takes into account the expectations of the group as a whole. It is the shared values held in common within the group that shape the conduct of its members. And, at the same time, these customs are reinforced by the members' continued adherence to them.
For this reason, propaganda has to exploit group identity. It must attempt to challenge the collective ambitions and prohibitions that direct group conduct—or to create the illusion that meaningful change is taking place even when it is not. Either way, those who are part of the group are inexorably led to change their own behavior in response to what they see as an evolving group ethic.
Moreover, propaganda professionals are also aware that change takes time—that any attempt to establish or reverse social trends must necessarily be a long-term operation, lest the intervention be exposed for what it is and backfire.
Colonel Michael Dewar, a British military intelligence specialist, explains the philosophy of change this way: “The tendency is for the mind to be lulled by regularity and routine. It tends to pay less attention to events which occur again and again and is not good at spotting marginal or gradual changes." (The Art of Deception in Warfare by Col. Michael Dewar)
Oil on the Moon
Disinformation is often an important part of a sustained propaganda effort. Indeed, it is hardly an exaggeration to say that even the most ridiculous concept can be made believable if enough time and effort is put into the task. In difficult cases, it may also be necessary to take extra steps to entice large numbers of key targets to participate in a way that almost forces them to accept the concept conveyed by the propagandist.
Imagine that the industrialized bloc, for whatever reason, decided to spread a truly outrageous theory among certain audiences in the developing world—that immense oil reserves can be obtained from the moon, for example. Now it is widely known that oil is of organic origin and that the moon consists of dry rocks which have never supported vegetation.
Thus, no reputable scientist would ever even imagine such a theory. But the wizards of deception might begin with a “pre-propaganda" publicity drive in which statements of “experts" are presented which merely question what is known about the moon and stress the importance of doing more research.
Later on, news articles and broadcast features might emphasize the enormous significance of a new “theory" that could eventually make Arab oil obsolete. Debates might be staged in which those disputing the idea look bad by comparison. New “discoveries" of oil-like substances on the lunar surface could also be brought to the public's attention at regular intervals—and, of course, with great fanfare.
Foreign propaganda sponsors and aid donors might also insist that school textbooks make reference to the “vast supplies of petroleum" that are now believed to exist on the moon; donated library books would be widely circulated to support the same myth; and doctored photographs or video clips might be passed around to the news media which purport to “prove" what the bogus “scientists" are saying.
Finally, contests might be organized in which college students or news reporters are offered generous prizes for the best essay on how to bring about world peace and global prosperity by exploiting moon oil. Contestants would be free to research the issue for themselves, of course, but would find that texts supporting the existence of petroleum on the moon far outnumber those that suggest otherwise.
More importantly, they would become eager to propagandize themselves because they want to please contest judges and claim the prize money. In essence, they fall into one of the most pernicious of all propaganda traps—one in which targets are duped into equating their own self-worth with the success of the disinformation campaign.
Again, it is hard to imagine in what situation false information about extra-terrestrial oil supplies would be useful to a propaganda sponsor, except perhaps to undermine the confidence of OPEC countries in future economic conditions or to discourage potential oil-producing nations (e.g., Senegal) from attempting to profit from their own reserves.
But disinformation is a major element of foreign propaganda, especially military “psy-war" projects intended to facilitate the surrender of opposition troops or to induce the defection of their members. Indeed, disinformation—often combined with opinion or ideological messages—is a part of most peacetime psychological operations.
As bizarre as these tactics may seem, all are being used on a regular basis to mold public opinion in developing countries on issues ranging from “free" trade and western economic principles to birth control and population planning. In fact, the psychological pressures of mass propaganda are an essential element in building a constituency for U.S. military actions under the mask of “international consensus."
And communication campaigns have become a routine way of discrediting anti-imperialist sentiments, undermining claims for worldwide economic justice, and countering “threats" to western interests posed by such diverse groups as religious movements, so-called third world nationalists, and anti-corporate environmentalists.
But in a more general sense, control over communications in far-away lands is as much an end in itself as it is a means to an end. To be able to acquire and maintain the dominant influence over the spread of ideas and information within a society is to exercise control over its people. As an American military advisor reportedly said at the end of World War II, “Whoever controls the radio, controls Berlin."
Spies and Saboteurs
Propaganda and psychological warfare techniques are a fundamental part of the western presence in the developing world. If a foreign power has an established network of friends to convey its ideas to host country audiences, it is well-situated to intervene in other ways, should the need arise.
Indeed, basic political influence and communications campaigns can be a way to build a system for recruiting the local collaborators and front groups necessary to wage proxy wars, subvert political movements, and install puppet governments. Without such penetration, on the other hand, these actions would be almost impossible.
At a conference on “Worldwide Threats" organized by the U.S. General Accounting Office, numerous papers on foreign relations in the post-cold war era were presented. One, “Intelligence for Low Intensity Conflicts" by Robert C. Kingston, dealt with psychological operations and covert activities.
“Psychological operations wield words as nonviolent weapon systems, set stages, exploit successes, and minimize failures when properly employed," the paper noted, adding that specialists must “gather intelligence that enables them to determine the predispositions, vulnerabilities, and susceptibilities of targeted audiences…"
The Kingston report also pointed out that personal knowledge about a country's leaders forms “the basis of successful operations to unseat or sidetrack key personnel who plan and implement insurgencies, coups, transnational terrorism and other actions that adversely affect U.S. interests."
Moreover, he continued, “US leaders cannot knowledgeably support or oppose any foreign coup that affects US interests unless they are well informed about potential successors, especially their attitudes toward the United States and expected programs compared with those of incumbents. Otherwise, short-term benefits may become long-term liabilities with local, regional, and even global implications."
From these words, it is apparent that western leaders will escalate campaigns of propaganda and psychological warfare against their remaining “enemy" in the aftermath of the cold war—the emerging nations of the southern hemisphere.
And the goals will be many: curbing population growth, maintaining cheap access to supplies of minerals, and neutralizing ideological movements that run counter to U.S. interests, to name just a few.
In fact, Admiral James A. Baldwin, president of the National Defense University in Washington, wrote in 1989:
“Warfare is often defined as the employment of military means to advance political ends… Another, more subtle, means—political warfare—uses images, ideas, speeches, slogans, propaganda, economic pressures, even advertising techniques to influence the political will of an adversary…
Now that the Soviets' 40-year campaign of aggression, intimidation, and hegemony is in apparent retreat and the world is increasingly beset by low-intensity conflict and struggles for economic domination, political warfare will be at the forefront of our national security agenda."
Glossary: Department of Defense Military and Associated Terms
SOURCE: Joint Chiefs of Staff, Department of Defense, JCS Pub 1 (1987).
NOTE: The initials following each item identify the source of the definition. DOD is the Department of Defense; IADB is the Inter-American Defense Board; I stands for Interpol; and NATO is the North Atlantic Treaty Organization.
COVERT OPERATIONS:
(DOD, I, IADB) Operations which are so planned and executed as to conceal the identity of or permit plausible denial by the sponsor. They differ from clandestine operations in that emphasis is placed on concealment of identity of sponsor rather than on concealment of the operation.
PSYCHOLOGICAL CONSOLIDATION ACTIVITIES:
(DOD, NATO) Planned psychological activities in peace and war directed at the civilian population located in areas under friendly control in order to achieve a desired behavior which supports the military objectives and the operational freedom of the supported commanders.
PSYCHOLOGICAL MEDIA:
(NATO) The media, technical or non-technical, which establish any kind of communication with a target audience.
PSYCHOLOGICAL OPERATIONS:
(DOD) Planned operations to convey selected information and indicators to foreign audiences to influence their emotions, motives, objective reasoning, and ultimately the behavior of foreign governments, organizations, groups, and individuals. The purpose of psychological operations is to induce or reinforce foreign attitudes and behavior favorable to the originator's objectives. Also called PSYOP. See also perception management.
(NATO) Planned psychological activities in peace and war directed to enemy, friendly, and neutral audiences in order to influence attitudes and behavior affecting the achievement of political and military objectives. They include strategic psychological activities, consolidation psychological operations and battlefield psychological activities.
(IADB) These operations include psychological warfare and, in addition, encompass those political, military, economic, and ideological actions planned and conducted to create in neutral or friendly foreign groups the emotions, attitudes, or behavior to support the achievement of national objectives.
PSYCHOLOGICAL OPERATIONS APPROACH:
(NATO) The technique adopted to induce a desired reaction on the part of the target audience.
PSYCHOLOGICAL SITUATION:
(NATO) The current emotional state, mental disposition or other behavioral motivation of a target audience, basically founded on its national political, social, economic, and psychological peculiarities but also subject to the influence of circumstances and events.
PSYCHOLOGICAL THEME:
(NATO) An idea or topic on which a psychological operation is based.
PSYCHOLOGICAL WARFARE:
(DOD, IADB) The planned use of propaganda and other psychological actions having the primary purpose of influencing the opinions, emotions, attitudes, and behavior of hostile foreign groups in such a way as to support the achievement of national objectives. See also psychological warfare consolidation.
PSYCHOLOGICAL WARFARE CONSOLIDATION:
(DOD, IADB) Psychological warfare directed toward populations in friendly rear areas or in territory occupied by friendly military forces with the objective of facilitating military operations and promoting maximum cooperation among the civil populace. See also psychological warfare.
See psychological operations.
PERCEPTION MANAGEMENT:
(DOD) Actions to convey and/or deny selected information and indicators to foreign audiences to influence their emotions, motives, and objective reasoning; and to intelligence systems and leaders at all levels to influence official estimates, ultimately resulting in foreign behaviors and official actions favorable to the originator's objectives. In various ways, perception management combines truth projection, operations security, cover and deception, and psychological operations. See also psychological operations.
- Excerpt from 'On Political War' by James A. Baldwin. Photo of David Foltz by Jeffree Benet