The 3/5ths rule in the U.S. Constitution

In today’s excerpt – under the U.S. Constitution, Indians who renounced their tribes were counted toward a given state’s population for the purpose of determining how many members of the House of Representatives each state had. “Other persons,” the Constitution’s euphemism for “slaves,” counted as 3/5 of a person for this same purpose. The debate over this horrible compromise unleashed a level of vitriol among the framers that barely subsided before flaring again scarcely more than thirty years later and finally erupting in the American Civil War:

United States Constitution, Article I, Section 2: Representatives and direct Taxes shall be apportioned among the several States which may be included within this Union, according to their respective Numbers, which shall be determined by adding to the whole Number of free Persons, including those bound to Service for a Term of Years, and excluding Indians not taxed, three fifths of all other Persons.
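
To make the clause’s arithmetic concrete (the figures that follow are hypothetical, not from the excerpt): a state with 300,000 free persons and 100,000 enslaved persons would be apportioned on a count of 300,000 + (3/5 × 100,000) = 360,000, so each additional enslaved person added three-fifths of a person to the state’s claim on House seats.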

“Census enumerators began to include Indians who had renounced their tribes in 1860. The instructions provided for the 1880 census said ‘Indians not taxed’ meant ‘Indians living on reservations under the care of Government agents, or roaming individually, or in bands, over unsettled tracts of country.’ In 1940 the government did away with the category ‘Indians not taxed.’

“[The ‘other persons’ clause] is one of the most infamous clauses in the Constitution, because not only did it countenance slavery but it was seen as doubly demeaning to the men and women held in bondage that they were each counted as but three-fifths of a person. The political dynamic behind this clause, however, is full of ironies. It was the North that opposed counting a slave as a whole person. It was the South that wanted slaves to be so counted. The three-fifths compromise meant that the ill-gotten gains of slavery were no longer solely financial but that slaveholders were to receive political gains as well – the more slaves a state had, the more representatives it would have in the Congress. …

“Under the Articles [of Confederation that preceded the Constitution], in which each state had the same representation, there was no incentive to show a large population, and states faced the threat of a population-based tax. So they had an incentive to understate their true population. The Constitution changed the equation. Suddenly representation in Congress was no longer equal for each state but was based on population. So states now had reason to bolster their population. The issue was an existential one for the country. William Davie of North Carolina is recorded in The Records of the Federal Convention as saying that he ‘saw that it was meant by some gentlemen to deprive the Southern States of any share of Representation for their blacks. He was sure that N. Carola. would never confederate on any terms that did not rate them at least as 3/5. If the Eastern States meant therefore to exclude them altogether the business was at an end.’

“Of the three-fifths clause, Gouverneur Morris, the Pennsylvania delegate, said this to the Convention: ‘The admission of slaves into the Representation when fairly explained comes to this: that the inhabitant of Georgia and S. C. who goes to the Coast of Africa, and in defiance of the most sacred laws of humanity tears away his fellow creatures from their dearest connections & damns them to the most cruel bondages, shall have more votes in a Govt. instituted for protection of the rights of mankind, than the Citizen of Pa or N. Jersey who views with a laudable horror, so nefarious a practice.’ The three-fifths clause, Luther Martin declared in The Genuine Information, involved ‘the absurdity of increasing the power of a State in making laws for free men in proportion as that State violated the rights of freedom.’ ”

Seth Lipsky, The Citizen’s Constitution, Basic Books, Copyright 2009 by Seth Lipsky, pp. 7-8.

About Us

Delanceyplace is a brief daily email with an excerpt or quote we view as interesting or noteworthy, offered with commentary to provide context. There is no theme, except that most excerpts will come from non-fiction works, mainly works of history, and, we hope, will have a more universal relevance than simply the subject of the book from which they came.


Published on March 10, 2010 at 12:58 pm

The Collapse of Long-Standing Empires

Excerpt from Delancey Place

In today’s excerpt – the collapse of a long-standing empire has very often occurred in a remarkably short span of time:

“What is most striking about [Rome’s] history is the speed of the Roman Empire’s collapse. In just five decades, the population of Rome itself fell by three-quarters. Archaeological evidence from the late fifth century – inferior housing, more primitive pottery, fewer coins, smaller cattle – shows that the benign influence of Rome diminished rapidly in the rest of western Europe. What [Oxford historian Brian] Ward-Perkins calls ‘the end of civilization’ came within the span of a single generation.

“Other great empires have suffered comparably swift collapses. The Ming dynasty in China began in 1368, when the warlord Zhu Yuanzhang renamed himself Emperor Hongwu, the word hongwu meaning ‘vast military power.’ For most of the next three centuries, Ming China was the world’s most sophisticated civilization by almost any measure. Then, in the mid-seventeenth century, political factionalism, fiscal crisis, famine, and epidemic disease opened the door to rebellion within and incursions from without. In 1636, the Manchu leader Huang Taiji proclaimed the advent of the Qing dynasty. Just eight years later, Beijing, the magnificent Ming capital, fell to the rebel leader Li Zicheng, and the last Ming emperor hanged himself out of shame. The transition from Confucian equipoise to anarchy took little more than a decade.

“In much the same way, the Bourbon monarchy in France passed from triumph to terror with astonishing rapidity. French intervention on the side of the colonial rebels against British rule in North America in the 1770s seemed like a good idea at the time – a chance for revenge after Great Britain’s victory in the Seven Years’ War a decade earlier – but it served to tip French finances into a critical state. In May 1789, the summoning of the Estates-General, France’s long-dormant representative assembly, unleashed a political chain reaction that led to a swift collapse of royal legitimacy in France. Only four years later, in January 1793, Louis XVI was decapitated by guillotine. …

“The sun set on the British Empire almost as suddenly. In February 1945, Prime Minister Winston Churchill was at Yalta, dividing up the world with U.S. President Franklin Roosevelt and Soviet Premier Joseph Stalin. As World War II was ending, he was swept from office in the July 1945 general election. Within a decade, the United Kingdom had conceded independence to Bangladesh, Bhutan, Burma, Egypt, Eritrea, India, Iran, Israel, Jordan, Libya, Madagascar, Pakistan, and Sri Lanka. The Suez crisis in 1956 proved that the United Kingdom could not act in defiance of the United States in the Middle East, setting the seal on the end of empire. Although it took until the 1960s for independence to reach sub-Saharan Africa and the remnants of colonial rule east of the Suez, the United Kingdom’s [centuries old] age of hegemony was effectively over less than a dozen years after its victories over Germany and Japan.

“The most recent and familiar example of precipitous decline is, of course, the collapse of the Soviet Union. With the benefit of hindsight, historians have traced all kinds of rot within the Soviet system back to the Brezhnev era and beyond. Perhaps, as the historian and political scientist Stephen Kotkin has argued, it was only the high oil prices of the 1970s that ‘averted Armageddon.’ But this did not seem to be the case at the time. In March 1985, when Mikhail Gorbachev became general secretary of the Soviet Communist Party, the CIA estimated the Soviet economy to be approximately 60 percent the size of the U.S. economy. This estimate is now known to have been wrong, but the Soviet nuclear arsenal was genuinely larger than the U.S. stockpile. And governments in what was then called the Third World, from Vietnam to Nicaragua, had been tilting in the Soviets’ favor for most of the previous 20 years. Yet less than five years after Gorbachev took power, the Soviet imperium in central and Eastern Europe had fallen apart, followed by the Soviet Union itself in 1991. If ever an empire fell off a cliff – rather than gently declining – it was the one founded by Lenin.”

Niall Ferguson, Complexity and Collapse, Foreign Affairs, March/April 2010, pp. 28-30.


Published on March 2, 2010 at 10:45 am

Today’s Business Leaders

Excerpt from Delancey Place

In today’s encore excerpt – writing in the late 1990s, Quinn Spitzer and Ron Evans contrast the business leaders of the immediate post-World War II period with more contemporary business leaders raised on a steady diet of business publications, management books, MBAs, and consultants:

“During the 1990s virtually an entire generation of top executives left their businesses, retired, or passed away. Many of these executives had achieved legendary status – [David] Packard at Hewlett-Packard, [Akio] Morita at Sony, [Sir John Harvey-] Jones at ICI, [Sam] Walton at Wal-Mart, and [Jan] Carlzon at SAS, to name a few. These leaders shared some notable characteristics that differentiate them from their successors. They lived through the Great Depression, which crippled the world’s economy in the 1930s; they experienced the horrors of World War II; they served their business apprenticeships in the postwar rebuilding period of the late 1940s and early 1950s. But what may differentiate them most from their counterparts of today is the issue of management. This ‘old guard’ was the last of a breed of executives who developed their management skills almost entirely in the workplace. They were building businesses while management ‘science’ – if it can be called that – was still in its infancy.

“In 1948 … the Harvard Business Review had a robust circulation of fifteen thousand. That number had reached nearly two hundred fifty thousand by the mid-1990s. The Harvard Business School itself and the few other graduate business schools in existence in 1948 awarded 3,357 MBAs – a far cry from the 75,000 MBAs awarded forty-five years later. Even McKinsey, the best known of the consulting companies, was a relatively small firm with annual revenues of under $2 million, compared with 1994 revenues of more than $1.2 billion. Management guru Peter Drucker was a youngster of thirty-nine. Seven-year-old Tom Peters was probably ‘in search of’ a new bike.

“The executives of [the immediate post-war] period were not uneducated – in fact, many were extremely well educated – but they did not learn their approach to business from a business school, a management expert, a celebrated management book, or an outside consultant. Options such as these were not generally available. These executives learned their business skills in the industrial jungle. …

“The forty-year-old executive of the 1990s, by contrast, probably holds one of the tens of thousands of MBAs awarded each year. His formal management education is supplemented by dozens of business periodicals and hundreds of management books. If, however, a situation seems resistant to even this mass of management wisdom, there are several hundred consulting firms and more than a hundred thousand consultants ready to provide additional management skill and knowledge. In 1993 businesses around the world spent $17 billion for consultants’ recommendations, and AT&T alone lavished $347.1 million on outside expertise.

“That does not necessarily mean that the business executives of the past were superior to those of the present. … Still, we suspect that if those [managers] of years gone by found themselves at the helm of any of today’s extraordinarily complex and competitive business enterprises, they would steer a straight and successful course.”

Quinn Spitzer and Ron Evans, Heads You Win!, Fireside, Simon and Schuster, Copyright 1997 by Kepner-Tregoe, Inc., pp. 15-17.


Published on February 18, 2010 at 1:28 pm

Glass-Steagall Act (1933)

Times Topics section on Glass-Steagall

Adapted from an article in the Law Library.

The Glass-Steagall Act, also known as the Banking Act of 1933 (48 Stat. 162), was passed by Congress in 1933 and prohibited commercial banks from engaging in the investment business.

It was enacted as an emergency response to the failure of nearly 5,000 banks during the Great Depression. The act was originally part of President Franklin D. Roosevelt’s New Deal program and became a permanent measure in 1945. It gave the Federal Reserve System tighter regulation of national banks; prohibited bank sales of securities; and created the Federal Deposit Insurance Corporation (FDIC), which insures bank deposits with a pool of money appropriated from banks.

Beginning in the 1900s, commercial banks established security affiliates that floated bond issues and underwrote corporate stock issues. (In underwriting, a bank guarantees to furnish a definite sum of money by a definite date to a business or government entity in return for an issue of bonds or stock.) The expansion of commercial banks into securities underwriting was substantial until the 1929 stock market crash and the subsequent Depression. In 1930, the Bank of the United States failed, reportedly because of activities of its security affiliates that created artificial conditions in the market. In 1933, all of the banks throughout the country were closed for a four-day period, and 4,000 banks closed permanently.

As a result of the bank closings and the already devastated economy, public confidence in the U.S. financial structure was low. In order to restore the banking public’s confidence that banks would follow reasonable banking practices, Congress created the Glass-Steagall Act. The act forced a separation of commercial and investment banks by preventing commercial banks from underwriting securities, with the exception of U.S. Treasury and federal agency securities, and municipal and state general-obligation securities. Likewise, investment banks were barred from engaging in the business of receiving deposits.

Investment banking consists mostly of securities underwriting and related activities; making a market in securities; and setting up corporate mergers, acquisitions, and restructuring. Investment banking also includes services provided by brokers or dealers in transactions in the secondary market.

The Glass-Steagall Act restored public confidence in banking practices during the Great Depression. However, many historians believe that the commercial bank securities practices of the time had little actual effect on the already devastated economy and were not a major contributor to the Depression. Some legislators and bank reformers argued that the act was never necessary, or that it had become outdated and should be repealed.

Congress responded to these criticisms by passing the Gramm-Leach-Bliley Act of 1999, which made significant changes to Glass-Steagall. The 1999 law did not make sweeping changes in the types of business that may be conducted by an individual bank, broker-dealer, or insurance company. Instead, the act repealed the Glass-Steagall Act’s restrictions on bank and securities-firm affiliations. It also amended the Bank Holding Company Act to permit affiliations among financial services companies, including banks, securities firms, and insurance companies. The new law sought financial modernization by removing the very barriers that Glass-Steagall had erected.

Published on January 21, 2010 at 2:17 pm

Filibusters and Debate Curbs

Jimmy Stewart is shown in a scene from the 1939 film “Mr. Smith Goes to Washington.” Associated Press

Updated April 30, 2009

The filibuster is an extra-constitutional accident of Senate history that has become an institution. In the classic 1939 movie “Mr. Smith Goes to Washington,” a reporter called it “democracy’s finest show,” the “American privilege of free speech in its most dramatic form.”

What politicians think of it often depends on who is in power. In 2004, Senator Bill Frist, then the Republican majority leader, called the filibuster “nothing less than a formula for tyranny by the minority,” vowing to end the Democrats’ use of the filibuster to prevent floor votes on President Bush’s judicial nominees. In 2009, with Democrats firmly in control of the Senate, the Republicans didn’t hesitate to use the threat of a filibuster against President Obama’s legislative agenda.

When it comes to filibusters, though, perhaps no performance exceeded that of the populist Democratic senator Huey P. Long of Louisiana. Hoping to stave off a bill that would have given his political enemies at home lucrative New Deal jobs, Mr. Long took the floor on June 12, 1935. He read the Constitution and the plays of Shakespeare. He offered up a recipe for fried oysters and a formula for Roquefort dressing. He asked his exhausted colleagues to suggest topics for his monologue. When they wouldn’t oblige, he invited reporters in the press gallery to pass down suggestions.

Only at 4 a.m. did the urgent call of nature put an end to Mr. Long’s 15-hour soliloquy. Yet this is not the Senate record. That dubious honor belongs to Senator Strom Thurmond of South Carolina, who held up the 1957 civil rights bill for a brain-numbing 24 hours and 18 minutes.

What is the good of a tradition whose essence is wasting time? Many historians and congressional scholars respond that the filibuster is a valuable check against the passage of ill-considered legislation.

The word filibuster is derived from the Dutch vrijbuiter, or pirate. Passed down through French and Spanish, “filibuster” landed in Congress in the mid-19th century as a tongue-in-cheek label for a senator whose interminable speech held a bill, not to mention his colleagues, hostage.

Sarah A. Binder, a political scientist at George Washington University and co-author of a book on the filibuster, said that both the House and Senate began work in 1789 with a measure called a “previous question motion” that required only a simple majority to cut off debate. The House has kept such a rule to the present day.

But the Senate dropped it in an 1806 housecleaning without fully understanding the implications, she said.

As early as 1841, a frustrated Senator Henry Clay of Kentucky threatened to try to change the debate rules when opponents tied up his banking bill with interminable talk. But the Senate finally adopted a formal means of ending a filibuster only in 1917, at the urging of President Woodrow Wilson.

Infuriated by the failure of Congress to act on war measures, Mr. Wilson fumed. “A little group of willful men,” he declared, “representing no opinion but their own, have rendered the great government of the United States helpless and contemptible.”

With that push, the Senate decided that a two-thirds vote could cut off a filibuster, borrowing the French parliamentary term “clôture” for such a motion. In 1975, the Senate cut the required vote for cloture to three-fifths, or 60 senators, instead of 67.

At about the same time, the Senate created a two-track process that allows senators to block action on a piece of legislation merely by invoking the right to filibuster, without actually having to stand before the chamber and drone endlessly on. Meanwhile, the Senate can take up other business.

The measure, intended to promote efficiency, inadvertently encouraged filibusters by making them painless, and made them a normal tool of political debate. In 1995, almost 44 percent of all major legislation considered by the Senate was delayed by a filibuster or the threat of one.

The filibuster has at times symbolized, justifiably or not, the courageous stand of principled individuals against a corrupt or compromised majority. That symbolism was captured in “Mr. Smith Goes to Washington,” the classic Frank Capra film in which James Stewart plays a naïve newcomer who holds the Senate hostage for even longer than Strom Thurmond did, before collapsing in fatigue and triumph.

Such iconography is so powerful that, even as he attacked the Democrats’ filibustering in 2004, then-Senator Frist praised the movie and what it represented.

“The right to talk – the right to unlimited debate – is a tradition as old as the Senate itself,” Dr. Frist said. “It’s unique to the institution. It shapes the character of the institution. It’s why the United States Senate is the world’s greatest deliberative body.” — Adapted from “Henry Clay Hated It. So Does Bill Frist,” by Scott Shane, The Times, Nov. 21, 2004

Published on January 20, 2010 at 1:10 pm

The Bi-Partisan Senate

Franken v. Lieberman… This confrontation on the floor of the Senate occurred yesterday. McCain was aghast at what Franken did to poor Lieberman. Here is the complete scene in all its glory, with an excellent explanation from a Young Turk.

Published on December 18, 2009 at 2:15 pm

Matthews and Tobin head to head on separation of church & state

In a breathtakingly tight argument, Chris Matthews corners Rhode Island Bishop Thomas Tobin, who has banned Rep. Patrick Kennedy, D-R.I., from receiving Holy Communion due to his views on abortion.

Because here’s the moral hypocrisy at the heart of the Church’s abortion position: if abortion is really and truly murder, you’re talking about prosecuting mothers, sisters, lovers, and friends for having one. Tweety is quite aggressive with the bishop, demanding to know exactly what legal penalties he thinks should be legislated.

Continue reading
http://crooksandliars.com/susie-madrak/chris-matthews-gets-heart-abortion-ma

Published on November 25, 2009 at 6:17 pm