Monday, June 19, 2017

The Roaring Twenties: How They Emerged from the Previous Decade

To understand the 1920s, one must first take a look at the years leading up to them. Ordinary citizens in the years from 1913 to 1920 in the United States were burdened with high taxes and shackled with regulations which limited their individual actions.

People in the 1920s felt a sense of liberty because they were emerging from an era in which civil rights and personal self-determination had been restricted. As historian R.J. Unstead writes,

For many Americans, theirs was a society based on freedom and equality of opportunity.

From 1913 to 1920, President Wilson had introduced segregation into various offices within the federal government - offices which had previously been integrated. The 1920s saw advances in civil rights for African-Americans.

Woodrow Wilson had also been enthusiastic in his support of the KKK. It is shocking to realize that a president of the United States would openly support the Ku Klux Klan. He promoted the film The Birth of a Nation and called the Klan a “glorious” organization.

Wilson also mocked President Theodore Roosevelt, who had invited leading Black citizens to the White House and appointed Blacks to major federal offices. Wilson reversed the gains which African-Americans had made prior to 1913.

In 1920, Americans elected a new president, Warren G. Harding, who took office in March 1921. Harding undid many of Wilson’s racist actions.

President Harding died suddenly in August 1923, but his successor, President Calvin Coolidge, continued working with Blacks on various civil rights issues.

When the KKK pressured Coolidge for support, he rejected such bullying, and openly mocked the Klan in his campaign slogans. He further irritated the Klan by becoming the first U.S. president to give a commencement address at a historically Black university.

Aside from racial questions, the 1920s saw other benefits for ordinary citizens. Americans benefited from a balanced middle course between isolationism and an overly ambitious internationalism, as R.J. Unstead reports:

They had rejected Woodrow Wilson’s dream that they should take on world leadership and sort out the troubles of a ruined Europe. What America wanted was a return to “normalcy”, to the task of building a free and prosperous society. “The business of the United States is business,” declared President Coolidge.

During the 1920s, the United States would work with Europe as a partner, not a boss. “By 1921, Americans had turned their backs” on Wilson’s hope that America would manage Europe.

If America had attempted to organize and direct Europe, it would have received the anger and contempt of other nations, it would have spent huge sums of money, it would have been dragged into regional wars, and it would have had to convince both Americans and Europeans that America had a moral right to supervise the rest of the world.

Wilson’s dreams had been too ambitious.

Coolidge, by contrast, worked with the Europeans on initiatives like the Kellogg-Briand Pact and the Dawes Plan, both of which aimed to avert the looming possibility of war.

The Kellogg-Briand Pact created diplomatic opportunities for nations to resolve disagreements without war, and the Dawes Plan eased economic problems which might otherwise have led to war. Americans in the 1920s enjoyed not only civil rights and peace, but also prosperity.

Business boomed in the twenties. Income per head increased by a quarter; prices came down and wages went up, so that luxuries like cars, refrigerators and radios became necessities.

Economic innovations like purchases made on the “installment plan,” called “hire-purchase” in England, fueled growth. The Harding and Coolidge administrations lowered taxes, reduced the federal government’s spending, and reduced the national debt.

Lower tax rates left workers with more money to enjoy consumer goods. Working-class families could afford the new technologies of the era: telephones, radios, electric lights, and phonographs.

The twenties were prosperous years in the United States; output increased at a tremendous rate, and wages were the highest in the world.

Blue-collar industrial workers were not poor: there was not a large income difference between the working class and the middle class. The upper end of the working class was, in fact, part of the middle class.

Reduced regulations created more jobs and better-paying jobs. “This was the era,” Unstead explains,

when skyscrapers changed the skyline of American cities, when films, jazz, sensational papers and the drama of Prohibition added to the excitement of life.

Prohibition was a leftover from the ‘Progressive’ movement of earlier years. Approved in 1919 and taking effect in 1920, it was clearly a policy failure within a year or two. One political task during the 1920s was to end Prohibition. The anti-Prohibition movement grew during the decade, and Prohibition ended in 1933.

Wednesday, April 26, 2017

World War One: The Officer Corps Develops Command and Control

In April 1917, the United States formally declared war. The first American troops arrived in Europe in June 1917, and their initial participation in combat at the front was in October 1917.

It was not until May 1918, however, that U.S. soldiers played a major role in World War One.

Compared to Austria, Serbia, Germany, France, and other nations, the United States had a relatively brief experience in the war; fighting ceased on November 11, 1918.

The American officers had, however, one thing in common with their counterparts in the armies of the other countries: nobody had ever seen a war like this before, and nobody was sure how to fight it.

Many of these officers were experienced, but their experience was irrelevant. They’d been involved in combat in the Philippines and in Cuba as part of the Spanish-American War, and they’d fought Pancho Villa’s Mexican revolutionaries.

The trench warfare in Europe was a different situation. As historian Timothy Nenninger writes,

In World War I, the United States Army entered combat on the Western Front with an ill-defined idea of how to command troops on the battlefield. Although most senior leaders in the American Expeditionary Forces (AEF) had commanded in combat in Cuba, the Philippines, or Mexico, none had experience with large unit, high intensity combat as conducted in France.

Several factors made the ground war in Europe different. First, the size: millions of soldiers, millions of guns, billions of bullets.

The second factor which distinguished World War I from previous conflicts was mechanization. Although machine guns had been used in previous fights, only during WW1 did they become numerous and common. Tanks entered battle for the first time. Other new weapons included airplanes and poison gas.

Advanced and sophisticated levels of industrialization produced weapons capable of killing on a large scale.

A third factor was that the American officers weren’t accustomed to being part of a multinational coalition. Coordination with French and British officers was a new experience.

There are probably other factors which distinguish WW1 from previous conflicts. American officers studied quickly to gain insights into this situation, as Timothy Nenninger notes:

The lack of first-hand experience was mitigated in part by service school education, General Staff analysis and doctrine, and professional writing in service-sponsored journals and books on military art that provided some insight into modern war. But until the summer of 1918 all of the principal elements of the AEF's command process - organizational, doctrinal, technical, and personal - were untested.

Under the command of General John Pershing, the decision was made to keep the AEF together as a unit within the larger multinational coalition. Previously, the alternative route - embedding small groups of Americans inside French and British units, mainly as replacements to “fill the holes” - had been considered.

This decision shaped the American experience of WW1, and in fact created a distinctly American experience of WW1, one different from the French or British experiences.

How and how well the process worked depended on the knowledge, skill, and preparation of commanders and staff officers, and on their interaction. Although the AEF drew on the experience of other armies, how they applied that experience resulted in a distinctly American process of command.

For the U.S. military, WW1 constituted a challenge to develop new forms of command and control. American officers learned to operate in a large, mechanized, multinational context.

Friday, March 31, 2017

General John Pershing: Civil Rights Hero

General Pershing became most famous for leading the United States Army in World War One. But many years earlier, he took courageous steps to acknowledge the contributions of African-Americans in the nation’s military.

Having graduated from West Point in 1886, one of his earliest assignments was, according to historian Kevin Hymel,

with both the 6th and 10th Cavalry Regiments. The 10th was one of two black cavalry regiments commanded by white officers. Pershing was called “Black Jack” in reference to his service with the 10th, and the nickname stuck long after he left it.

Pershing was proud of his service with the “Buffalo Soldiers,” the nickname given to the African-American cavalrymen. In 1898, when the Spanish-American war began, Pershing insisted on rejoining the Buffalo Soldiers as they went into action in Cuba.

In his own words, Pershing described what he saw as a wonderful unity among the soldiers:

Each officer or soldier next in rank took charge of the line or group immediately in his front or rear and halting to fire at each good opportunity, taking reasonable advantage of cover, the entire command moved forward as coolly as though the buzzing of bullets was the humming of bees. White regiments, black regiments, regulars and Rough Riders, representing the young manhood of the North and the South, fought shoulder to shoulder, unmindful of race or color, unmindful of whether commanded by ex-Confederate or not, and mindful of only their common duty as Americans.

Years later, when Pershing was commanding in Europe during WW1, his loyalty to Black soldiers would lead him to assign them to meaningful combat roles. Pershing answered to President Woodrow Wilson. Wilson had reintroduced segregation into the civilian branches of the government.

As Commander-in-Chief, Wilson was not pleased to see African-American troops taking on significant military tasks. In assigning Black troops to the same types of duties as any other troops, Pershing showed that he was willing to risk Wilson’s displeasure.

Wednesday, March 8, 2017

The Priorities of the Soviet Espionage Network in the United States

Communist spies in North America had more than one function. They gathered intelligence, but other tasks often had higher priority.

Beyond stealing classified documents from the government and from the military, and beyond sending such secrets to Moscow, Soviet operatives were tasked with influencing U.S. policy. To this end, they established themselves in both governmental and non-governmental institutions.

This can be seen, for example, in the intelligence provided by Whittaker Chambers. Chambers joined the U.S. Communist Party (CPUSA) in the mid-1920s and was recruited from the party into Soviet espionage agencies around 1932. By 1938, he had grown disillusioned with the international communist conspiracy.

He disengaged from the USSR’s intelligence agencies and began offering information to the United States government. He was able to explain that Soviet operatives were often more interested in shaping U.S. policy than in stealing U.S. secrets.

As historians Stan Evans and Herbert Romerstein explain, “Cold War scholars generally are aware of the influence issue.” Yet, in the popular imagination, Soviet spies are generally conceived of as “stealing secrets.”

One Soviet agent, named Alger Hiss, is perhaps best known to the public as the man who stole classified State Department documents and passed them on to other operatives, including Chambers. But while Hiss’s reputation depicts him as an intelligence-gatherer, he actually did much greater damage as a policymaker.

It is less well known that Hiss, while on the payroll of Soviet intelligence agencies, was a key advisor to President Roosevelt in the late 1930s and early 1940s. Hiss’s influence was responsible for the death of thousands as he enabled Soviet imperialistic expansionism to run amok in eastern Europe.

There is, then, a disconnect between what professional historians know and what the general public perceives. Scholars are aware

of the Cold War role played by Chambers, who knew a lot about spying and was involved in it on a professional basis. Yet Chambers repeatedly stressed that spying as such was not the major issue. Rather, he said, with the likes of Hiss in federal office, policy influence was by far the leading problem.

While the Soviet effort to steal classified U.S. documents was a grave danger during the Cold War, an equal and possibly greater threat was posed by communists inside the U.S. who nudged policy decisions in directions which favored Stalin.

Thursday, February 9, 2017

Insider Information: Soviet Access Inside the U.S. Government

Although the Cold War is usually defined as starting in the mid-1940s, Soviet espionage efforts inside the United States started decades earlier. By the 1930s, there was an established intelligence network operating undercover in North America.

Famous spies like Alger Hiss, Harry Dexter White, Owen Lattimore, Harry Hopkins, and Philip Keeney funneled classified government documents to Soviet intelligence agencies. Some of these operatives acted deliberately; others were unwitting dupes who never quite understood what they were doing.

These “moles,” operating from their posts inside various governmental and social institutions, were ultimately complicit in thousands and tens of thousands of deaths as the USSR unfolded its imperialistic expansionism in eastern Europe, North Korea, and elsewhere.

Equipped with insider knowledge of the U.S. government’s policy-making apparatus, the Soviets could form their own policy in anticipation of American responses to it. As historians Stan Evans and Herbert Romerstein write,

Guided by such inside information, the Soviets could plan their own strategies with assurance — like a card player who could read the hand of an opponent. Knowing what the United States or other Western nations would do with respect to Germany, Poland, Spain, Japan, or China, the commissars could make their moves with foreknowledge of the responses they would get from other powers. Thus the two facets of the Soviet project interacted — the spying handmaiden to the policy interest. And, of course, if knowing what the policies of the United States and other non-Communist nations would be was useful to the Kremlin, then being able to influence or guide those policies in some manner would have been still more so.

Several concrete examples show this principle in action. At the Yalta Conference, Stalin knew that Roosevelt’s foreign policy was shaped largely by Alger Hiss, who was a Soviet agent. Stalin knew in advance Roosevelt’s concerns, and to what extent Roosevelt was willing to cede American interests to Stalin’s pressure tactics.

Similarly, Stalin knew that his operatives had undermined from within the will of the U.S. government to resist Mao’s vicious attack on Chinese liberty. Stalin could support Mao and know that the Americans would offer only slight verbal resistance and no material opposition. Millions of Chinese paid with their lives.

Friday, January 13, 2017

James Madison: The Virtues of Constitutionalism

After winning the election of 1808, James Madison became the fourth president of the United States in 1809. Arguably, however, his most significant contribution to history happened twenty years earlier.

Madison’s impact on the Constitutional Convention of 1787 in Philadelphia, and the Constitution which it produced, is so extensive that he earned the nickname “Father of the Constitution.”

Constitutionalism is central to intuitive notions of fairness, justice, and equality. It is essentially a codification of the rule of law: the demand that a plan of government be written, published, and carried out.

As historian Larry Arnn writes:

Ours, wrote Madison, is the first nation to adopt purely representative forms. This means that all sovereignty or authority to rule is located in the governed or in the people.

Because the Constitution is written, it is more objective. The text is fixed, and while some questions of interpretation are possible, it does not change over time.

Because it is public, it creates a sense of access and equality. Any citizen can learn what the system of government is and how it works. Citizens can use this knowledge to act politically to further their own interests or values.

Non-citizens can also access the Constitution and, using their knowledge of it, decide whether they want to attempt to become citizens. Larry Arnn continues:

But at the same time, the people do not occupy the offices of government — as they did, for instance, in Athenian democracy. America’s pure or simple “republicanism,” as Madison called it, makes possible the separation of powers both between the governed and their government and also inside the parts of the government.

The vision of American Constitutionalism is a republic with freely-elected representatives. The individuals in government represent the voters. Elected officials act on behalf of the citizens.

The government should not rule the citizens: the citizens should rule the government.

The original intent of the Constitution was not to create a class of permanent career politicians. Those elected to Congress were to meet occasionally to transact whatever business was necessary, but should spend most of their time living in their communities and working at their own trades.

As Larry Arnn notes, the Constitution can be seen as an extensive list of ways to limit the power of government:

The sovereign people delegate their authority to government, separately to separate places. This separation is both horizontal, among the branches of the federal government, and vertical, between the states and the federal government.

By dividing power among the three branches of government, and then dividing it again between national and local governments, the Constitution seeks to preserve individual political liberty by ensuring that there is no large mass of power in any one office.

Another way to limit the power of government is to ensure that elected officials are representatives of the people, not rulers over them. Elected officials should serve occasionally, for short terms, and only for specified purposes.

The people themselves are outside the government, and they may intervene only at election time. Between elections, they watch, judge, and argue — in other words, they think before they act.

James Madison was a passionate abolitionist, and by the time he died in 1836, the end of slavery in the United States had become a historical inevitability.

Madison also understood that educated voters were good for the nation, and worked therefore with Thomas Jefferson to found the University of Virginia.

The rotation of officials in and out of government ensures that no one individual can lodge himself permanently in a position of power. Limiting the power of the government, in order to protect individual political liberty, includes limiting even the power which the voters might indirectly exert over each other.

Voters, at their own discretion, replace elected officials. Officeholders should be replaced whenever the voters believe that such a replacement would represent their will more accurately.

Over time, but only over time, they may replace the whole lot. This system limits both their power and the power of those in government.

Madison’s cabinet changed significantly during his presidency. At that time, there were only five cabinet offices: Secretary of State, Secretary of War, Secretary of the Navy, Secretary of the Treasury, and Attorney General.

During Madison’s tenure in office, there was a complete turnover in the cabinet. By the end of his presidency, none of the original five officeholders were left.

Wednesday, January 11, 2017

Required Military Service: Damaging Freedom in Order to Preserve It?

A variety of words are used to name the process of requiring men to enlist in the armed forces of a country: draft, impressment, conscription. These words may have slight differences in meaning, which also vary over the decades and centuries, but they essentially all refer to ignoring the will of the individual and requiring him to bear arms for the nation.

The terms ‘impressment’ and ‘conscription’ can also, at times, refer to the forced requisitioning of material objects and supplies as well as manpower.

This practice is never popular, but nowhere does it meet with more resistance than in the United States. Because the USA is explicitly founded on the notion of individual political liberty, conscription is especially ironic, inasmuch as it violates the individual’s freedom in the name of protecting the individual’s freedom.

The most recent example of such conscription happened during the Vietnam War in the 1960s and 1970s, but the practice dates back to times even before the nation’s founding. There were examples of the ‘draft’ during colonial times in the 1600s.

Studying particular instances of the impressment of supplies in North Carolina during the Revolutionary War from 1775 to 1781, historian John Maass notes that it was so unpopular that it even fostered sentiments in support of the enemy.

In this way, the draft was truly counterproductive.

Likewise, John Maass writes, the conscription of men began to nudge some residents of the area to support, in thoughts if not in deeds, the adversary:

For similar reasons, conscription also raised the ire of men forced to perform compulsory military service through the process of a draft. In all theaters of the war, Revolutionary authorities relied heavily upon conscription, a practice several American provinces used as far back as the seventeenth century. Despite historical precedents, involuntary military service was certainly an imposition upon many male Carolinians (and their families), and often met stiff resistance. Just as men and women directed their ire toward governing officials and their agents over impressment, so too did the draft cause similar disaffection and hostility among men compelled to serve in the ranks. Opposition to conscription and impressment created significant difficulties for Britain’s former American colonies in building allegiance to the new Revolutionary governments, and in defending themselves from British and Tory enemies.

The practice of conscription is a perpetual thorn in the side of military leaders. Soldiers who do not voluntarily enlist are perhaps more likely to desert or to cause other disciplinary problems.

Yet, over the centuries, some form of the draft has repeatedly shown itself to be necessary.

In the impressment of material supplies, there is a need to watch the ethics of the conscripting officers and men, because there is a temptation to take more than one needs. A thoughtfully-organized impressment of supplies can minimize the damage to goodwill among the populace.

Likewise, the humane treatment of draftees can reduce personnel problems among the soldiers.