Market Advisory Features

Interest Rates: Chopping Away
Asian Currencies: Fear of Floating
Corporate Strategy: Who Gets Eaten and Who Gets to Eat

American Banks: The Extraordinary Lightness of Banking
Fund Managers: The Blame Game
Trading on Fear
Economic and Financial Indicators

Patrolling the World

Japanese Spirit, Western Things

Tourism: On the Road Again?

Trading on fear

From the start, the invasion of Iraq was seen in the US as a marketing project. Selling 'Brand America' abroad was an abject failure; but at home, it worked. Manufacturers of 4x4s, oil prospectors, the nuclear power industry, politicians keen to roll back civil liberties - all seized the moment to capitalise on the war. PR analysts Sheldon Rampton and John Stauber explain how it worked.

Saturday July 12, 2003
The Guardian


"The United States lost the public relations war in the Muslim world a long time ago," Osama Siblani, publisher of the Arab American News, said in October 2001. "They could have the prophet Mohammed doing public relations and it wouldn't help."

At home in the US, the propaganda war has been more effective. And a key component has been fear: fear of terrorism and fear of attack.

Early scholars who studied propaganda called it a "hypodermic needle approach" to communication, in which the communicator's objective was to "inject" his ideas into the minds of the target population. Since propaganda is often aimed at persuading people to do things that are not in their own best interests, it frequently seeks to bypass the rational brain altogether and manipulate us on a more primitive level, appealing to emotional symbolism.

Television uses sudden, loud noises to provoke a startled response, bright colours, violence - not because these things are inherently appealing, but because they catch our attention and keep us watching. When these practices are criticised, advertisers and TV executives respond that they do this because this is what their "audience wants". In fact, however, they are appealing selectively to certain aspects of human nature - the most primitive aspects, because those are the most predictable. Fear is one of the most primitive emotions in the human psyche, and it definitely keeps us watching. If the mere ability to keep people watching were really synonymous with "giving audiences what they want", we would have to conclude that people "want" terrorism. On September 11, Osama bin Laden kept the entire world watching. As much as people hated what they were seeing, the power of their emotions kept them from turning away.

And fear can make people do other things they would not do if they were thinking rationally. During the war crimes trials at Nuremberg, psychologist Gustave Gilbert visited Nazi Reichsmarshall Hermann Goering in his prison cell. "We got around to the subject of war again and I said that, contrary to his attitude, I did not think that the common people are very thankful for leaders who bring them war and destruction," Gilbert wrote in his journal, Nuremberg Diary.

"Why, of course, the people don't want war," Goering shrugged. "Why would some poor slob on a farm want to risk his life in a war when the best that he can get out of it is to come back to his farm in one piece? ... That is understood. But, after all, it is the leaders of the country who determine the policy and it is always a simple matter to drag the people along, whether it is a democracy or a fascist dictatorship or a parliament or a communist dictatorship ... That is easy. All you have to do is tell them they are being attacked and denounce the pacifists for lack of patriotism and exposing the country to danger. It works the same way in any country."

Politicians and terrorists are not the only propagandists who use fear to drive human behaviour in irrational directions. A striking recent use of fear psychology in marketing occurred following Operation Desert Storm in 1991. During the war, television coverage of armoured Humvees sweeping across the desert helped to launch the Hummer, a consumer version of a vehicle originally designed exclusively for military use. The initial idea to make a consumer version came from the actor Arnold Schwarzenegger, who wanted a tough-looking, road-warrior vehicle for himself. At his prodding, AM General (what was left of the old American Motors) began making civilian Hummers in 1992, with the first vehicle off the assembly line going to Schwarzenegger himself.

In addition to the Hummer, the war helped to launch a broader sports utility vehicle (SUV) craze. Psychiatrist Clotaire Rapaille, a consultant to the automobile industry, conducted studies of postwar consumer psyches for Chrysler and reported that Americans wanted "aggressive" cars. In interviews with Keith Bradsher, the former Detroit bureau chief for the New York Times, Rapaille discussed the results of his research. SUVs, he said, were "weapons" - "armoured cars for the battlefield" - that appealed to Americans' deepest fears of violence and crime.

Another hostility-intensification feature is the "grill guard" promoted by SUV manufacturers. "Grill guards, useful mainly for pushing oryx out of the road in Namibia, have no application under normal driving conditions," says writer Gregg Easterbrook. "But they make SUVs look angrier, especially when viewed through a rearview mirror ... [They] also increase the chance that an SUV will kill someone in an accident."

Deliberately marketed as "urban assault luxury vehicles", SUVs exploit fear while doing nothing to make people safer. They make their owners feel safe, not by protecting them, but by feeding their aggressive impulses. Due to SUVs' propensity for rollovers, notes Bradsher, the occupant death rate in SUVs is actually 6% higher than for cars, 8% in the largest SUVs. Of course, they also get worse mileage. According to dealers, Hummers average a mere eight to 10 miles a gallon - a figure that takes on additional significance in light of the role that dependency on foreign oil has played in shaping US relations with countries in the Middle East. With this combination of features, selling SUVs on their merits would be a challenge, which is why Rapaille consistently advises Detroit to rely instead on irrational fear appeals.

Other products and causes have also exploited fear-based marketing following September 11. "The trick in 2002, say public affairs and budget experts, will be to redefine your pet issue or product as a matter of homeland security," wrote PR Week. "If you can convince Congress that your company's widget will strengthen America's borders, or that funding your client's pet project will make America less dependent on foreign resources, you just might be able to get what you're looking for."

Alaska senator Frank Murkowski used fear of terrorism to press for federal approval of oil drilling in the Arctic National Wildlife Refuge, telling his colleagues that US purchases of foreign oil helped to subsidise Saddam Hussein and Palestinian suicide bombers. The nuclear power industry lobbied for approval of Yucca Mountain, Nevada, as a repository for high-level radioactive waste by claiming that shipping the waste there would keep nuclear weapons material from falling into the hands of terrorists. Of course, they didn't propose shutting down nuclear power plants, which themselves are prime targets for terrorists.

The National Drug Council retooled the war on drugs with TV ads telling people that smoking marijuana helped to fund terrorism. Environmentalists attempted to take the fund-a-terrorist trope in a different direction, teaming up with columnist Arianna Huffington to launch the "Detroit Project", which produced TV ads modelled after the National Drug Council ads. "This is George," a voiceover said. "This is the gas that George bought for his SUV." The screen then showed a map of the Middle East. "These are the countries where the executives bought the oil that made the gas that George bought for his SUV." The picture switched to a scene of armed terrorists in a desert. "And these are the terrorists who get money from those countries every time George fills up his SUV." In Detroit and elsewhere, however, TV stations that had been only too happy to run the White House anti-drugs ads refused to accept the Detroit Project commercials, calling them "totally inappropriate".

September 11 was frequently compared to the Japanese attack on Pearl Harbor, with White House officials warning that the war on terror would be prolonged and difficult like the second world war, and would require similar sacrifices. But whatever those sacrifices may entail, almost from the start it was clear that they would not include frugality. During the second world war, Americans conserved resources as never before. Rationing was imposed on petrol, tyres and even food. People collected waste such as paper and household cooking scraps so that it could be recycled and used for the war effort. Compare that with the headline that ran in O'Dwyer's PR Daily on September 24, less than two weeks after the terrorist attack: "PR Needed To Keep Consumers Spending."

President Bush himself appeared in TV commercials, urging Americans to "live their lives" by going ahead with plans for vacations and other consumer purchases. "The president of the US is encouraging us to buy," wrote marketer Chuck Kelly in an editorial for the Minneapolis-St Paul Star Tribune, which argued that America was "embarking on a journey of spiritual patriotism" that "is about pride, loyalty, caring and believing" - and, of course, selling. "As marketers, we have the responsibility to keep the economy rolling," wrote Kelly. "Our job is to create customers during one of the more difficult times in our history."

Fear also provided the basis for much of the Bush administration's surging popularity following September 11. In the week immediately prior to the terrorist attacks, Bush's standing in opinion polls was at its lowest point ever, with only 50% of respondents giving him a positive rating. Within two days of the attack, that number shot up to 82%. Since then, whenever the public's attention has begun to shift away from topics such as war and terrorism, Bush has seen his domestic popularity ratings slip downward, spiking up again when war talk fills the airwaves. By March 13-14 2003, his popularity had fallen to 53% - essentially where he stood with the public prior to 9/11. On March 18, Bush declared war on Iraq, and the ratings shot up again to 68% - even when, briefly, it appeared that the war might be going badly.

Only four presidents other than Bush have seen their job rating meet or surpass the 80% mark:

· Franklin Delano Roosevelt reached his highest rating ever - 84% - immediately after the Japanese attacked Pearl Harbor.

· Harry Truman hit 87% right after FDR died during the final, crucial phase of the second world war.

· John F Kennedy hit 83% right after the colossal failure of the Bay of Pigs invasion of Cuba.

· Dubya's dad, President George HW Bush, hit 89% during Operation Desert Storm.

It seems to be a law of history that times of war and national fear are accompanied by rollbacks of civil liberties and attacks on dissent. During the civil war, Abraham Lincoln suspended the right of habeas corpus. The second world war brought the internment of Japanese-Americans, and the cold war brought McCarthyism. These examples pale compared with the uses of fear to justify mass killings, torture and political arrests in countries such as Mao's China, Stalin's Russia or Saddam's Iraq. Yet these episodes have been dark moments in America's history.

Although the Bush administration took pains to insist that "Muslims are not the enemy" and that it viewed Islam as a "religion of peace", it was unable to prevent a series of verbal attacks against Muslims that occurred in the US following 9/11 - with some of the attacks coming from Bush's strongest supporters in the conservative movement. "This is no time to be precious about locating the exact individuals directly involved in this particular terrorist attack," wrote columnist Ann Coulter - now famously - two days after the attacks. "We should invade their countries, kill their leaders and convert them to Christianity. We weren't punctilious about locating and punishing only Hitler and his top officers. We carpet-bombed German cities; we killed civilians. That's war. And this is war."

Of course, Coulter's column does not reflect the mainstream of US opinion. But it offers a telling illustration of the way that fear can drive people to say and do things that make them feel brave and powerful while actually making them less safe by fanning the flames of intolerance and violence.

Shortly after Coulter's column appeared, it resurfaced on the website of the Mujahideen Lashkar-e-Taiba - one of the largest militant Islamist groups in Pakistan - which works closely with al-Qaida. At the time, the Lashkar-e-Taiba site was decorated with an image that depicted a hairy, monstrous hand with claws in place of fingernails, from which blood dripped on to a burning globe of planet earth. A star of David decorated the wrist of the hairy hand, and behind it stood an American flag. The reproduction of Coulter's column used bold, red letters to highlight the sentence that said to "invade their countries, kill their leaders and convert them to Christianity". To make the point even stronger, the webmaster added a comment: "We told you so. Is anyone listening out there? The noose is already around our necks. The preparation for genocide of ALL Muslims has begun ... The media is now doing its groundwork to create more hostility towards Islam and Muslims to the point that no one will oppose this mass murder which is about to take place. Mosques will be shut down, schools will be closed, Muslims will be arrested, and executed. There may even be special awards set up to kill Muslims. Millions and millions will be slaughtered like sheep. Remember these words because it is coming. The only safe refuge you have is Allah."

Corporate spin doctors, thinktanks and conservative politicians have taken up the rhetoric of fear for their own purposes. Even before 9/11, many of them were engaged in an ongoing effort to demonise environmentalists and other activist groups by associating them with terrorism. One striking indicator of this preoccupation is the fact that Congressman Scott McInnis (Republican, Colorado) had scheduled congressional hearings on "eco-terrorism" to be held on September 12 2001, one day after Congress itself was nearly destroyed in an attack by real terrorists. (The September 11 attacks forced McInnis temporarily to postpone his plans, rescheduling his hearings to February 2002.)

On October 7 2001, the Washington Times printed an editorial calling for "war against eco-terrorists", branding them "an eco-al-Qaida" with "a fanatical ideology and a twisted morality". Conservatives sometimes used the war on terrorism to demonise Democrats. The then Democratic Senate majority leader Tom Daschle was targeted by American Renewal, the lobbying wing of the Family Research Council, a conservative thinktank that spends most of its time promoting prayer in public schools and opposing gay rights. In newspaper ads, American Renewal attempted to paint Daschle and Saddam Hussein as "strange bedfellows". "What do Saddam Hussein and Senate majority leader Tom Daschle have in common?" stated a news release announcing the ad campaign. "Neither man wants America to drill for oil in Alaska's Arctic National Wildlife Refuge."

William J Bennett, Reagan's former education secretary, authored a book titled Why We Fight: Moral Clarity And The War On Terrorism. Through his organisation, Empower America, he launched Americans For Victory Over Terrorism, a group of well-connected Republicans including Jack Kemp, Jeane Kirkpatrick and Trent Lott. "The threats we face today are both external and internal: external in that there are groups and states that want to attack the United States; internal in that there are those who are attempting to use this opportunity to promulgate their agenda of 'blame America first'. Both threats stem from either a hatred for the American ideals of freedom and equality or a misunderstanding of those ideals and their practice," he stated.

Washington Times reporter Ellen Sorokin used terrorist-baiting to attack the National Education Association, America's largest teachers' union and a frequent opponent of Republican educational policies. The NEA's crime was to create a "Remember September 11" website for use as a teaching aid on the first anniversary of the attack. The NEA site had a red, white and blue motif, with links to the CIA and to Homeland Security websites, and it featured three speeches by Bush, whom it described as a "great American". In order to make the case that the NEA was somehow anti-American, Sorokin hunted about on the site and found a link to an essay preaching tolerance towards Arab- and Muslim-Americans. "Everyone wants the terrorists punished," the essay said, but "we must not act like [the terrorists] by lashing out at innocent people around us, or 'hating' them because of their origins ... Groups of people should not be judged by the actions of a few. It is wrong to condemn an entire group of people by association with religion, race, homeland, or even proximity."

In a stunning display of intellectual dishonesty, Sorokin took a single phrase - "Do not suggest any group is responsible" (referring to Arab-Americans in general) - and quoted it out of context to suggest that the NEA opposed holding the terrorists responsible for their deeds. Headlined "NEA delivers history lesson: Tells teachers not to cast 9/11 blame", her story went on to claim that the NEA simultaneously "takes a decidedly blame-America approach".

This, in turn, became the basis for a withering barrage of attacks as the rightwing media echo chamber, including TV, newspapers, talk radio and websites, amplified the accusation, complaining of "terrorism in the classroom" as "educators blame America and embrace Islam". In the Washington Post, George Will wrote that the NEA website "is as frightening, in its way, as any foreign threat". If, as Will insinuated, even schoolteachers are as scary as Saddam or Osama, no wonder the government needs to step in and crack the whip.

Since 9/11, laws have been passed that place new limits on citizen rights, while expanding the government's authority to spy on citizens. In October 2001, Congress passed the ambitiously named USA Patriot Act, which stands for "Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism". In addition to authorising unprecedented levels of surveillance and incarceration of both citizens and non-citizens, the Act included provisions that explicitly target people simply for engaging in classes of political speech that are expressly protected by the US constitution. It expanded the ability of police to spy on telephone and internet correspondence in anti-terrorism investigations and in routine criminal investigations. It authorised secret government searches, enabling the FBI and other government agencies to conduct searches without warrants and without notifying individuals that their property has been searched. It created a broad new definition of "domestic terrorism" under which political protesters can be charged as terrorists if they engage in conduct that "involves acts dangerous to human life". It also put the CIA back in the business of spying on US citizens and allowed the government to detain non-citizens for indefinite periods of time without trial. The Patriot Act was followed in November 2001 by a new executive order from Bush, authorising himself to order a trial in a military court for any non-citizen he designates, without a right of appeal or the protection of the Bill of Rights.

As if determined to prove that irony is not dead, the Ad Council launched a new series of public service advertisements, calling them a "Freedom Campaign", in July 2002. "What if America wasn't America? Freedom. Appreciate it. Cherish it. Protect it," read the tag line at the end of each TV ad, which attempted to celebrate freedom by depicting what America would look like without it. In one ad, a young man approaches a librarian with a question about a book he can't find. She tells him ominously that the book is no longer available, and the young man is taken away for questioning by a couple of government goons. The irony is that the Patriot Act had already empowered the FBI to seize book sales and library checkout records, while barring booksellers and librarians from saying anything about it to their patrons. It would be nice to imagine that someone at the Ad Council was trying to make a point in opposition to these encroachments on our freedoms. No such point was intended, according to Phil Dusenberry, who directed the ads.

In response to complaints about restrictions on civil liberties, the attorney general, John Ashcroft, testified before Congress, characterising "our critics" as "those who scare peace-loving people with phantoms of lost liberty; my message is this: Your tactics only aid terrorists - for they erode our national unity and diminish our resolve. They give ammunition to America's enemies, and pause to America's friends. They encourage people of good will to remain silent in the face of evil." Dennis Pluchinsky, a senior intelligence analyst with the US state department, went further still in his critique of the media. "I accuse the media in the United States of treason," he stated in an opinion article in the Washington Post that suggested giving the media "an Osama bin Laden award" and advised, "the president and Congress should pass laws temporarily restricting the media from publishing any security information that can be used by our enemies".

At MSNBC, a cable TV news network, meanwhile, a six-month experiment to develop a liberal programme featuring Phil Donahue ended just before the war began, when Donahue's show was cancelled and replaced with a programme titled Countdown: Iraq. Although the network cited poor ratings as the reason for dumping Donahue, the New York Times reported that Donahue "was actually attracting more viewers than any other programme on MSNBC, even the channel's signature prime-time programme, Hardball with Chris Matthews". Further insight into the network's thinking appears in an internal NBC report leaked to AllYourTV.com, a website that covers the television industry. The NBC report recommended axing Donahue because he presented a "difficult public face for NBC in a time of war ... He seems to delight in presenting guests who are antiwar, anti-Bush and sceptical of the administration's motives." It went on to outline a possible nightmare scenario where the show becomes "a home for the liberal anti-war agenda at the same time that our competitors are waving the flag at every opportunity".

At the same time that Donahue was cancelled, MSNBC added to its line-up Michael Savage, who routinely refers to non-white countries as "turd world nations" and who charges that the US "is being taken over by the freaks, the cripples, the perverts and the mental defectives". In one broadcast, Savage justified ethnic slurs as a national security tool: "We need racist stereotypes right now of our enemy in order to encourage our warriors to kill the enemy."

In addition to restricting the number of anti-war voices on television and radio, media outlets often engaged in selective presentation. The main voices that television viewers saw opposing the war came from a handful of celebrities such as Sean Penn, Martin Sheen, Janeane Garofalo and Susan Sarandon - actors who could be dismissed as brie-eating Hollywood elitists. The newspapers and TV networks could have easily interviewed academics and other more traditional anti-war sources, but they rarely did. In a speech in the autumn of 2002, Senator Edward Kennedy "laid out what was arguably the most comprehensive case yet offered to the public questioning the Bush administration's policy and timing on Iraq", according to Michael Getler, the Washington Post's ombudsman. The next day, the Post devoted one sentence to the speech. Ironically, Kennedy made ample use in his remarks of the public testimony in Senate armed services committee hearings a week earlier by retired four-star army and marine corps generals who cautioned about attacking Iraq at this time - hearings that the Post also did not cover.

Peace groups attempted to purchase commercial time to broadcast ads for peace, but were refused air time by all the major networks and even MTV. CBS network president Martin Franks explained the refusal by saying, "We think that informed discussion comes from our news programming."

Like all good TV, the war in Iraq had a dramatic final act, broadcast during prime time - the sunlight gleaming over the waves as the president's fighter jet descended from the sky on to the USS Abraham Lincoln. The plane zoomed in, snagged a cable stretched across the flight deck and screeched to a stop, and Bush bounded out, dressed in a snug-fitting olive-green flight suit with his helmet tucked under his arm. He strode across the flight deck, posing for pictures and shaking hands with the crew of the carrier. He had even helped fly the jet, he told reporters. "Yes, I flew it," he said. "Yeah, of course, I liked it." Surrounded by gleaming military hardware and hundreds of cheering sailors in uniform, and with the words "Mission Accomplished" emblazoned on a huge banner at his back, he delivered a stirring speech in the glow of sunset that declared a "turning of the tide" in the war against terrorism. "We have fought for the cause of liberty, and for the peace of the world," Bush said. "Because of you, the tyrant has fallen, and Iraq is free." After the day's festivities, the Democrats got their chance to complain, calling Bush's Top Gun act a "tax-subsidised commercial" for his re-election campaign. They estimated it had cost $1m to orchestrate all of the details that made the picture look so perfect.

In the end, though, the spin doctors agreed that these were images that would stay in the minds of the American people. It is impossible, of course, for anyone to predict whether the Bush administration's bold gamble in Iraq has succeeded or whether, as Egyptian president Hosni Mubarak warned at the peak of the war, "there will be 100 Bin Ladens afterward". But in the wake of this conflict, we should ask ourselves whether we have made the mistake of believing our own propaganda, and whether we have been fighting the war on terror against the wrong enemies, in the wrong places, with the wrong weapons.


Economic and financial indicators

Overview

Jul 12th 2003
From The Economist print edition

American unemployment surged to 6.4% in June, up from 6.1% in May—far worse than most forecasters had expected.

Stockmarkets heated up, while bonds cooled. America's NASDAQ rose to its highest level in over a year. In Japan, the Nikkei index rose briefly above 10,000 for the first time since August. Reflecting the renewed interest in shares, ten-year bonds issued by the Japanese government saw yields rise above 1%, and yields on ten-year German government bonds touched 4.0%, their highest level in two months.

Germany's industrial output suffered a sharper than expected contraction of 0.7% in May, bringing industrial-production growth for the 12 months to May to a paltry 0.2%. On a positive note, the country experienced a surprise fall in unemployment. According to new figures, GDP in France grew by 1.1% in the year to the first quarter, up slightly from preliminary estimates.

The Dutch economy also did a bit better than earlier estimates: it grew by 0.2% in the year to the first quarter. Industrial production fell by 3.7% in the 12 months to May.

Britain's trade figures were revised to take account of a massive scam to avoid VAT. Imports for the four years 1999-2002 were deemed to be £22.7 billion ($34.5 billion) higher than previously reported.


Patrolling the world

Jul 11th 2003
From The Economist Global Agenda

The 148,000 American troops in Iraq won’t be leaving anytime soon, General Tommy Franks told a Senate committee this week. Behind the scenes, the Pentagon is preparing an overhaul of America's military presence overseas

AFGHANISTAN, Iraq, Georgia, Djibouti, the Philippines, and now maybe Liberia: the roster of countries to which America has sent troops since the September 11th terrorist attacks stretches on. America has nearly 1.5m active-duty military personnel at home and around the world, with about as many in the reserves and national guard. This figure sounds big. But a good many of the soldiers are support staff—supply clerks, doctors, cooks and so on. The burly fighting types are far fewer in number.

Hence the concern in Congress about the scale of America's commitment in Iraq. On Thursday July 10th, Tommy Franks, until recently the battle commander there, told the Senate's armed-services committee that the number of troops in Iraq, currently 148,000, would probably hold steady for the “foreseeable future”, and that America might well stay on for two to four years—hardly the quick pull-out that some politicians had hoped for. The price tag came as more of an unpleasant surprise: Donald Rumsfeld, America's defence secretary, told the committee that the cost of keeping troops there (which does not include reconstruction costs) had risen to about $3.9 billion a month, rather more than the roughly $2 billion monthly bill forecast in April. Increasing attacks by irregulars account for part of the need for more money and manpower.

But even as America worries about its obligations in Iraq, Pentagon planners are busy figuring out where else to send troops around the world. Military strategists are especially worried about an "arc of instability"—a sweep of poor, roughed-up countries that runs from South-East Asia through the Middle East and on to North Africa. At the moment, some of America's biggest overseas bases are far from that arc. Germany hosts 68,000 American troops, a legacy of last century’s wars; plenty more soldiers and pilots are in Italy and Britain. In the Pacific, Japan is the biggest American base, hosting 41,000 troops, half of them marines on Okinawa, a small island in the south. Closer to harm’s way, America has around 37,000 troops in South Korea.

Even before September 11th, Mr Rumsfeld was known to be a strong advocate of a nimbler military. America's campaign against terrorists—very different from the big national armies of conventional enemies, or even the concerted guerrilla resistance in Vietnam—has given new urgency to his efforts. With new threats liable to come from anywhere at any time, Mr Rumsfeld and his aides believe that the best way to meet them is not with big, clunky bases deep in friendly territory. Much better would be to develop a network of smaller "forward operating bases" around the world, which could serve as a springboard for troops monitoring a threat or responding to trouble.

America already has plenty of small global outposts. Guam, an American possession in the Pacific, hosts several thousand sailors and pilots; there are American army peacekeepers under NATO in Bosnia and Kosovo; Egypt had over 300 American soldiers at last count. There are also tens of thousands afloat, in foreign seas.

Still, this leaves plenty of holes. Sub-Saharan Africa is perhaps the most obvious, now that America has become a stronger presence in the Middle East. Kenya has already had its share of al-Qaeda attacks, and there are plenty of lawless countries, such as Somalia, that a terrorist cell could operate from with relative ease. According to the New York Times, America has its eye on creating bases in Mali, Algeria and elsewhere, and wants to sort out refuelling rights for its planes in Uganda and Senegal. Doubtless these issues were raised this week during President George Bush’s tour of Africa; so too was a possible American peacekeeping contingent for war-torn Liberia.

Asia, too, will see changes. Already, things are shifting in South Korea: the Pentagon announced in June that its 15,000 troops on the front line along the demilitarised zone between North and South Korea will drop back to a spot a bit closer to (though still north of) Seoul, the South Korean capital. South Korea worried aloud that this might free America up to bomb North Korea, but the Pentagon insists that increasing troops’ flexibility is its main motive. Elsewhere in East Asia, America is said to be looking at shifting troops to Singapore and Australia—staunch allies in the war on terror—and also to Vietnam. The Philippines, a former American colony, has also apparently been approached, but it refuses to host American combat troops due to constitutional restraints. (The more than 1,000 Americans that Mr Bush sent last year were only supposed to be training Filipino troops in counter-terrorism, rather than battling directly with local guerrillas.)

In Europe, America's ex-communist friends have proved eager to supplant Germany, which is bracing unhappily for a scaling-down of its big American presence. Bulgaria has brushed aside America's cutting-off of $10m in military aid (done to punish Bulgaria for not signing an agreement that would exempt Americans from the jurisdiction of the International Criminal Court, which America despises), saying that its offer to host an American base remains firm. Romania too says it will gladly host American forces; there is similar enthusiasm in some of the former Soviet republics in central Asia, a region invaluable to America during the war in Afghanistan.

If even some of these changes happen, America’s military map will probably look very different in just a few years. Its new alignments, no longer so wedded to cold-war geography, will probably be more effective. But simply creating a more agile force, backed by the very latest technology, may not be enough. With big projects like Iraq and Afghanistan still absorbing large numbers of soldiers, overstretch will remain a worry—even if new wars do not come along. If America ends up (however reluctantly) with an empire of fragile states, it would do well to persuade allies to share the burden of policing them. America has done that successfully in Bosnia and Kosovo, gradually reducing its forces while letting Europeans and others take increasing responsibility. In Iraq, it has not yet been able to do the same. General Franks told his Senate questioners that 19 countries have already contributed troops to Iraq, with 19 more soon to join and talks with 11 others under way. But aside from the British, they are all but invisible. Mr Rumsfeld's drive to make his forces more lithe is a good one; but to cover the world effectively, America will need a little more help from its friends.

 

Chopping away

Jul 10th 2003
From The Economist Global Agenda


Interest rates in Asia are falling, as economies struggle with recession and the after-effects of the SARS crisis. But Europe’s central bankers are resisting the rush to cheaper money

WHAT do Britain, South Korea, Singapore and Indonesia have in common? Perhaps not that much. But on Thursday July 10th, all four countries cut interest rates as part of the effort to stimulate economic activity. The decision taken in London had been a close call, with opinion divided before the announcement on whether the Bank of England would cut. The three Asian economies joined a growing list of countries in the region that have reduced interest rates in recent weeks. The monetary authorities in Hong Kong, the Philippines, Thailand and Taiwan have already followed the example set by America’s Federal Reserve on June 25th. But the European Central Bank (ECB) decided against any change in rates at its monthly meeting, also on Thursday of this week. The ECB cut rates in June: that, apparently, is enough for now.

The monetary loosening in Britain and Asia only underlines the extent to which the ECB is now out of step. Singapore and South Korea acted after confirmation that their economies had slipped into recession this year. In both countries, the authorities think the worst is over and that growth will pick up during the second half of the year. But in the wake of the SARS crisis, and the general downturn in tourism following increased fears about terrorist attacks, governments in Asia are keen to restore economic confidence.

Cheaper money has certainly worked in America, according to John Snow, President George Bush’s treasury secretary. In an interview published on Thursday, Mr Snow said a combination of lower interest rates and the programme of tax cuts pushed through by Mr Bush has put the economy on a path to faster growth. Mr Snow expects the economy to expand at an annual rate of 3% in the second half of this year. That might be a touch on the optimistic side—the signals about America’s economic performance continue to be mixed, with unemployment currently at the highest level for nine years.

But even the cautious Alan Greenspan, chairman of the Fed and principal architect of America’s cheap-money policy, has conceded that Mr Bush’s tax cuts have come at a fortuitous time for the economy. Nearly two years after the recession ended, American growth is still relatively sluggish. The uncertain nature of the recovery, coupled with fears about deflation, prompted last month’s rate cut, the 13th since January 2001.

America’s problems pale in comparison with those in Europe, though. The German economy is already technically in recession. Only last week, one of the leading German think-tanks, the DIW, predicted that the economy—Europe’s largest—would shrink during 2003. It said there was little prospect of a significant upturn next year. Yet the ECB’s decision to resist further interest-rate cuts this month was well-trailed. When Wim Duisenberg, the ECB president, addressed the European Parliament last week, he said that the ECB had done its part by reducing borrowing costs in June. Now, he argued, it was up to European governments to do their bit.

Mr Duisenberg does have a point. European governments have postponed much-needed economic reforms. The German government has, belatedly, embarked on a programme of reforms aimed at curbing the excesses of the welfare state. But such changes take a long time to implement, and even longer to have any impact. Germany desperately needs short-term stimulus. Chancellor Gerhard Schröder has decided to bring forward income-tax cuts to January next year—though he has been vague, as yet, on how these will be financed. His government has little room for manoeuvre because of the need to bring Germany back within the limits of the euro area’s stability pact, which requires governments to curb borrowing.

So it all comes back to interest rates and the ECB. Some economists suspect that the bank will cut rates again after the summer, but the ECB is building a reputation for doing too little, too late. If, as currently seems likely, Germany remains stuck in the economic doldrums and if, as is quite possible, that has a damaging impact on the rest of Europe, the ECB could find itself getting the blame.

 

Asian currencies

Fear of floating
Jul 10th 2003
From The Economist print edition

By tying their currencies to the dollar, Asian governments are creating global economic strains

AFTER sinking since the start of the year, the dollar has come up for air, gaining 4% against the euro in recent weeks. But it is quite likely to plunge again, pulled under by America's huge current-account deficit. So far the dollar's descent has been uneven. It has fallen by around a quarter against the euro since the start of 2002. But it has lost only 10% or less against the yen and many other Asian currencies, and it is unchanged against the Chinese yuan, although most of the Asian economies have large balance-of-payments surpluses.

America's biggest bilateral trade deficit is with China ($103 billion in 2002). Asia as a whole accounts for half of America's total deficit. If these currencies cling to the dollar, then others such as the euro will have to rise disproportionately if America's deficit is to be trimmed.

And cling they do. The Chinese yuan and the Malaysian ringgit are pegged to the dollar and protected by capital controls. The Hong Kong dollar is also tied to the greenback through a currency board. Officially, other Asian currencies float, but central banks have been intervening on a grand scale in the foreign-exchange market to hold down their currencies as the dollar has weakened. The exception is the Indonesian rupiah, which has gained 27% against the dollar in the past 18 months (see chart).

Whereas intervention to support a currency often fails, intervention to push one down can be more effective, because in theory a central bank can print unlimited amounts of its own currency with which to buy dollars. As a result of central banks' heavy buying, Asia's foreign-exchange reserves have swollen from less than $800 billion at the start of 1999 to over $1.5 trillion now, almost two-thirds of the global total. Japan bought over $30 billion-worth in May alone; it now has almost $550 billion in its coffers. The world's seven biggest holders of foreign-exchange reserves are all in Asia (see chart).

The Asian countries' reluctance to allow their currencies to rise against the dollar is coming in for increasing criticism. At a meeting in Bali last weekend of Asian and European finance ministers, the Europeans urged the Asians to let their currencies rise. John Snow, America's treasury secretary, the International Monetary Fund and the Bank for International Settlements have all called for a stronger yuan.

Asian governments worry that appreciating currencies might hurt their exports. Yet many of their currencies are supercompetitive. As the dollar slides, their trade-weighted values against a basket of currencies are falling. According to The Economist's Big Mac index, China has the most undervalued currency in the world. Using more sophisticated methods, UBS, a Swiss bank, reckons that the yuan is now more than 20% undervalued against the dollar.
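The Big Mac index arithmetic is simple: divide the local burger price by the American one to get an implied purchasing-power-parity rate, then compare it with the actual exchange rate. A minimal sketch (the yuan prices and exchange rate below are illustrative placeholders, not necessarily the figures The Economist used in 2003):

```python
def big_mac_valuation(local_price, us_price, exchange_rate):
    """Percentage over(+)/under(-)valuation implied by Big Mac prices.

    implied PPP = local price / American price, compared with the actual
    exchange rate (both expressed in local currency per dollar)."""
    implied_ppp = local_price / us_price
    return (implied_ppp / exchange_rate - 1) * 100

# Illustrative figures: a Big Mac at 9.90 yuan, $2.71 in America,
# and an exchange rate of 8.28 yuan to the dollar
print(round(big_mac_valuation(9.90, 2.71, 8.28), 1))  # deeply negative: undervalued
```

A negative result means the currency buys more burgers at home than the exchange rate suggests, i.e. it is undervalued against the dollar.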

UBS reckons that there are two tell-tale signs that a currency is undervalued. The first is rapidly rising official reserves. China, Japan, Taiwan and India have seen the biggest increases in reserves over the past 18 months. On the other hand, in Hong Kong, Singapore, Malaysia and the Philippines, reserves have been fairly flat.

A second test is the size of a country's basic balance (the sum of its current-account balance and net inflows of long-term capital, such as foreign direct investment). In 2002 China's current-account surplus was 2.2% of GDP; adding in foreign direct investment gave a basic balance of 6% of GDP. This year the current-account surplus has shrunk, but the overall basic balance remains well in surplus.
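The basic-balance measure just described is straightforward addition; a minimal sketch, where the 3.8% FDI figure is inferred from the 2.2% and 6% quoted above rather than taken from UBS directly:

```python
def basic_balance_pct(current_account_pct, net_long_term_capital_pct):
    """Basic balance as a % of GDP: current-account balance plus net
    inflows of long-term capital such as foreign direct investment."""
    return current_account_pct + net_long_term_capital_pct

# China, 2002: a 2.2% current-account surplus plus FDI inflows of
# roughly 3.8% of GDP gives the 6% basic balance cited above
print(basic_balance_pct(2.2, 3.8))
```

A persistently large positive basic balance, like rapidly rising reserves, is the tell-tale sign of an undervalued currency in this framework.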

UBS reckons that all the Asian currencies, except Indonesia's, are undervalued against the dollar on the basis of these two measures. The most undervalued are the yuan, the yen, the Indian rupee and the Taiwan and Singapore dollars; the least undervalued are the ringgit, the Hong Kong dollar and the South Korean won.

In a free market, China's currency would surely rise. But demands from foreigners are likely to fall on deaf ears. The Chinese government is worried about rising unemployment as jobs are lost in unprofitable state companies, and deflation remains an issue. Moreover, until banks are reformed and non-performing loans tackled, it would be dangerous to liberalise the capital account. It would be safer to repeg the yuan at a higher rate. But most economists reckon that, at best, the yuan's band will be widened slightly over the next year, without allowing room for any significant appreciation. And, so long as the yuan is pegged to the dollar, other Asian countries will have a big reason to resist appreciation too.

In the wake of the Asian crisis of 1997, it is understandable that governments like to have bigger reserves to defend their currencies against future attack. But stuffing reserves under the mattress is not without cost. The return on American Treasury bonds is much less than could be had from investing the money more productively at home. Large inflows of foreign exchange can also bring too much liquidity into the economy, which can then cause asset-price bubbles. Asian central banks have tried to “sterilise” their intervention, selling bonds to mop up extra liquidity, but this will become harder as reserves grow.

China is considering various policies to stem the rise in reserves and fend off pressure for a revaluation. One option would be to allow firms to retain more foreign-exchange earnings; at present most have to be sold to the People's Bank of China. Another option is to relax restrictions on residents and firms wanting to buy foreign currency. The government already plans, later this year, to allow Chinese firms to buy foreign bonds. In June, 11 Asian countries set up a $1 billion Asian Bond Fund that will invest in local bonds. The aim is to develop local bond markets and so keep more Asian capital at home rather than see it invested abroad.

Fred Bergsten, of the Institute for International Economics in Washington, DC, criticises Asian countries' exchange-rate policies. He complains that they are not playing their role in the global adjustment process that is needed to reduce America's external deficit. As a result, as the dollar slides, the euro is likely to become seriously overvalued, while Asia's cheap currencies may provoke protectionism.

The complaints from Europe are likely to be louder than those from America. American pressure on China may be limited because the United States needs China's help in resolving tensions with North Korea. Another reason for America to pull its punches is that China and other Asian countries hold their reserves largely in American government securities. If Asians lost their appetite for dollar assets, the greenback would fall even faster, and American bond yields would rise.

Indeed, from this point of view, the Asian economies are supporting America's profligate habits. By buying American government securities they help finance America's large external deficit, hold down interest rates, and so sustain the boom in consumer spending and mortgage borrowing. This may benefit America in the short term, but it allows even bigger imbalances, in the shape of consumer debt and foreign liabilities, to continue to build. The eventual consequences for America—and the world economy—could be more painful.


Corporate strategy

Who gets eaten and who gets to eat

Jul 10th 2003
From The Economist print edition

Is recent history making companies timorous in their strategic planning?

WITH stockmarkets and profits both edging up, corporate executives are daring to think again about the future. Emerging from their cost-cutting bunkers and shaking off the excesses of the turn-of-the-century boom, they are talking once more about strategies for growth. Admittedly some companies—carmakers such as Ford and Fiat, for example, and many airlines—are still wondering how to survive. But others are already expanding. Japanese carmakers are opening new plants in North America. Some airlines, such as the low-cost carriers easyJet and AirTran, are buying large numbers of new aircraft. Emirates ordered no less than $12 billion-worth of giant Airbuses at last month's Paris air show.

Moreover, hostile takeovers are reappearing on the stage, a sure sign of emerging opportunism. In software, Oracle is bidding for PeopleSoft; this week ArvinMeritor bid for a fellow car-parts maker, Dana, and Alcan went for Pechiney, a move that could trigger a consolidation of the aluminium industry.

As they search for growth opportunities, however, companies face a classic dilemma, one made more poignant by recent events: should they assume that the future will, more or less, be a continuation of the past; or should they try to anticipate the next big revolution? Should they, essentially, hang on to what they've got (their “core competence”), or should they strike out for a brave new world?

Hold the revolution

After the dotcom disaster and much idle talk of a new economic paradigm, revolutions are distinctly out of favour. Belief in rapid change and dramatic responses has been shaken by the bursting of the stockmarket bubble, and by the demise of such firms as Enron and Webvan. There is now a widespread aversion to management fads. Most managers today are more interested in getting the basics right than in chasing the next rainbow.

An article in the July issue of the Harvard Business Review reflects this spirit. Called “What Really Works”, it reports the findings of a five-year research programme (led by Nitin Nohria of the Harvard Business School and William Joyce of the Tuck School of Business). The study put 160 companies under the microscope over a ten-year period (1986-96), grading them on their use of some 200 different management practices. Its main finding is that superior performance does not depend on use of this or that trendy management technique. “It doesn't really matter if you implement ERP (enterprise resource planning) software or a CRM (customer relationship management) system; it matters very much, though, that whatever technology you choose to implement you execute it flawlessly.”

Flawless execution, claim the authors, is one of four old-fashioned things that distinguish successful companies over time. The other three are: a company culture based on aiming high; a structure that is flexible and responsive; and a strategy that is clear and focused.

One thing standing in the way of the flawless execution of clear strategies, writes Charles Roxburgh in the latest issue of McKinsey Quarterly, is the human brain. In “Hidden Flaws in Strategy” he applies some of the insights of behavioural economics to strategic decision-making. Management's tendency to be over-confident and to favour the status quo, he says, works against good strategic planning.

So too does the phenomenon of “anchoring”, the linking of things people do not know to vaguely related things that they learnt recently. For example, ask somebody for the last three digits of their telephone number; and then ask which year Genghis Khan died. Most will give a three-digit date in the first millennium (when the answer is, in fact, 1227). “Anchoring can be dangerous,” says Mr Roxburgh, “particularly when it is a question of becoming anchored to the past.”

In their book “Creative Destruction”, Richard Foster and Sarah Kaplan, two management consultants, make a similar argument. They say that too many corporate bosses assume the future will be much like the past—what worked before will work again—an attitude that can all too often destroy shareholder value. The authors tell the story of a Manhattan branch of the East River Savings Bank, founded in 1848, its Ionic columns exuding safety, security and thrift. But the bank went into the property market in the 1970s, was hit by the savings-and-loan crisis in the 1980s, and ended up in the hands of a property developer who sold the branch to another bank, which closed it in 1997. Today the building is a pharmacy.

Anchors aweigh

How then to weigh corporate anchors and move on from the recent past? Mr Foster and Ms Kaplan say that businesses should stop hankering after a mythical golden age, when enterprising firms grew gradually into solid companies in which widows and orphans could safely park their money. This rosy view of the corporate past is an illusion, they say. “The corporate equivalent of El Dorado, the golden company that continually performs better than the markets, has never existed. Managing for survival, even among the best and most revered corporations, does not guarantee strong, long-term performance for shareholders. In fact, just the opposite is true.”

The two authors compared the original 1917 Forbes magazine list of the top 100 American companies (by assets, in those days), with a comparable list that the magazine published in 1987. By then, 61 of the original group had ceased to exist; of the remainder, only 18 had managed to stay in the top 100. They included such respected firms as Kodak, DuPont, General Electric, Ford, General Motors and Procter & Gamble. These all survived depression, world war, the oil-price shocks and unprecedented technological change.

But survival did not mean that they were more profitable than their peers. Of the 18, only General Electric and Kodak outperformed the stockmarket. The group as a whole had returns that were 20% below the market's compound annual growth rate of 7.5% over those 70 years.
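The size of that shortfall is easy to understate, because small differences in annual growth compound enormously over 70 years. A quick sketch, assuming "20% below" means a compound annual growth rate one-fifth lower than the market's 7.5% (i.e. 6.0% a year — one plausible reading of the figure):

```python
# Assumption: "20% below" = a CAGR of 7.5% * 0.8 = 6.0% a year.
market_growth = 1.075 ** 70      # growth of $1 at the market's 7.5% CAGR
survivor_growth = 1.060 ** 70    # growth of $1 at the survivors' 6.0% CAGR
print(round(market_growth / survivor_growth, 1))
```

On that reading, a dollar invested in the market ends up roughly two and a half to three times larger than a dollar invested in the long-lived survivors.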

A look at another list confirms the point. Of the companies that made up the S&P 500 in 1957, only 74 (37%) made it through to the 1997 list, and only 12 (6%) outperformed the index over the period. As Mr Foster and Ms Kaplan put it, by the end of the 1990s, an S&P 500 made up only of those companies that had been there since 1957 would have underperformed the actual index by one-fifth, year after year. Endurance per se bears little if any relationship to performance.

The subheading to Mr Foster's and Ms Kaplan's book is “From Built-to-Last to Built-to-Perform”, a dig at one of the most influential business books of recent years, “Built to Last” by Jim Collins and Jerry Porras. Published in 1994, “Built to Last” looked at a small sample of companies (18) that had been persistently great over a long period of time. It suggested that endurance and performance were linked.

In his sequel, “Good to Great”, published in 2001, Mr Collins tried to pin down what it was that distinguished great companies from those that were merely good. His definition of a great company was one that had spectacularly outperformed the stockmarket by a factor of three over a 15-year period. This level was chosen because it was comfortably above the average 2.5 times by which acknowledged leaders such as 3M, Boeing, Coca-Cola, General Electric and Wal-Mart had outperformed the market from 1985 to 2000.

In a contrary way, Mr Collins and his team of researchers came up with a list of nine things that they had not found in their winning companies. Great companies, Mr Collins claims, do not depend on outstanding charismatic leaders, brilliant strategy or audacious takeovers to pull ahead. At the top they rely on quietly determined bosses with a belief in high standards and discipline.

One example is Darwin Smith, a chief executive of Kimberly-Clark, whose unassuming nature did not prevent him from taking dramatic decisions. At one stage he realised that the company's core business of making coated paper offered the prospect of no better than mediocre returns. So he sold the paper mills and bet the company on becoming a leading maker of consumer paper products.

A similar thing happened with another of the companies on Mr Collins's list of greats. Walgreens pottered along as a chain of 500 restaurants, based on the founding family's formula for malted milk-shakes, until the early 1970s. At that point, Charles Walgreen III decided that the company would do better if it were to focus entirely on drugstores. He wound down the restaurants within five years, and Walgreens went on to become a stockmarket star for the next 25.

This is a bit like Cortes burning his boats so that his men had no choice but to make their old-world ways succeed in the new world. There was literally no turning back. For Mr Collins, there is no conflict in remaining anchored to the past, but at the same time going for greatness in the future. He believes that leaders such as Mr Smith and Mr Walgreen did not start to make their companies great by setting a new vision and a new strategy. Instead, he says, “they first got the right people on the bus, the wrong people off the bus, and the right people in the right seats—and then figured out where to drive it.”

Inevitable surprises

Choosing the passengers before the journey is not a formula favoured by Peter Schwartz. The doyen of scenario planning—a widely used method for companies to think about the future, developed largely by the Shell oil company in the 1970s—argues that the future is not as unknown as we think. Companies can chart a route and then decide whom they need to steer them along it.

In his new book, “Inevitable Surprises”, Mr Schwartz lists some of the future shocks that should not surprise us—the lengthening of the human life-span, for example, where 60 becomes the equivalent of 40; the changing patterns of migration; the dominance of American economic and military might; and the existence of “a set of disorderly nations with the capacity to unleash terror, disease and disruption” on the rest of the world.

Companies that want to prepare themselves for these inevitable changes have a number of options, says Mr Schwartz. These include building effective intelligence systems; cultivating a sense of timing; and trying “to avoid denial”. They also include putting in place mechanisms to engender creative destruction. “What processes, practices, and organisations have you actually dismantled in the last year or two?” asks Mr Schwartz. “If the answer is none, perhaps it's time to get some practice in before urgency strikes.”

The idea that great companies stick closely to their past is also anathema to Fritz Kroeger, a vice-president at A.T. Kearney, a consulting firm, whose new book about strategy—“Winning the Merger Endgame” (written with two colleagues, Graeme Deans and Stefan Zeisel)—argues that the main factor determining corporate survival and success is the speed with which companies climb what he calls “the endgame curve”. This is basically a strategy of creative destruction via mergers and acquisitions. Or, as Sweeney Todd sang in the Stephen Sondheim musical, “the history of the world, my sweet, is who gets eaten and who gets to eat.”

An individual company's strategy in this game should, say the authors, be determined by the stage of life that its industry has reached, and there are, they claim, four distinct stages. In the first, there is little or no market concentration. Newly deregulated firms, start-ups and industries spun off from others are all present at this stage. Concentration, measured by the combined market share of the three biggest companies (CR3), is low—less than 20%. Industries currently at this stage, say the authors, include railways, telecoms, utilities and insurance, all of which Mr Kroeger thinks will stay in stage one for at least another five years.

The second, seven-year phase he calls the “scale stage”, when size begins to matter. Leading companies start to emerge, and concentration increases to around 30-45% on the CR3 scale. Industries that are now in this stage include chemicals, drugs, pulp and paper, fast foods, hotels and breweries.

In the third phase, Mr Kroeger says companies extend their core businesses, eliminating secondary operations or swapping them with other companies for assets closer to their core activity. By this point, industry leaders have come to account for nearly 70% of their market. Industries in this phase include steel, toys and tyre manufacturing.

Finally, there are a few companies that enjoy about 90% of their industry's worldwide market. The corporate titans of this fourth stage—in such industries as tobacco and automobiles—tend to form alliances in order to boost growth, which by now has become hard to find. Thus General Motors has 25% of the global car market, but only through its strategy of forming alliances with such other carmakers as Fiat, Fuji, Daewoo and Suzuki.
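The CR3 measure used throughout this four-stage framework is just the combined market share of an industry's three biggest firms. A minimal sketch, using invented industries whose shares sum to 100%:

```python
def cr3(market_shares):
    """Three-firm concentration ratio: combined share (in percentage
    points) of the three largest firms in an industry."""
    return sum(sorted(market_shares, reverse=True)[:3])

# A hypothetical stage-one industry: fragmented, CR3 below 20%
fragmented = [6, 5, 4, 3, 3, 2] + [1] * 77
# A hypothetical scale-stage industry: CR3 in the 30-45% band
scale_stage = [18, 12, 10, 8, 7] + [5] * 9

print(cr3(fragmented))   # 6 + 5 + 4 = 15
print(cr3(scale_stage))  # 18 + 12 + 10 = 40
```

By this yardstick the endgame stages map onto CR3 bands of roughly under 20%, 30-45%, near 70% and about 90%.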

Challenging the status quo

Perhaps the biggest challenge for corporate planners today is, as Mr Roxburgh puts it, “to distinguish between a status quo option that is genuinely the right course and one that feels deceptively safe because of an innate bias”. That innate bias towards changelessness is stronger today than it was five years ago, before the stockmarket bubble burst and so many investment decisions were made to look foolish.

Mr Foster and Ms Kaplan call this innate bias “cultural lock-in”. It helps explain how John Akers, a dynamic computer-industry lifer by the time he became boss of IBM, could make such a mess of the job. All his energy, astuteness and intelligence (combined with those of his senior lieutenants) could not help him to see how his company was being swept into a waterfall by changes bubbling just beneath the surface of the industry.

On the other hand, there are those who have gone for creative destruction and ended up destroying themselves, and sometimes much else besides. Enron, for example, dispensed with its old business model of natural gas and pipelines to turn itself into a futuristic online energy trader. Of course, it is conceivable that Enron would have succeeded had it not, along the way, taken some lethal short-cuts.

Likewise, two traditional British companies, GEC and ICI, thought they could master radical change. Both shed solid old businesses and bet on acquisitions in higher-margin, growing sectors (telecoms for GEC, which renamed itself Marconi in the process, and specialty chemicals for ICI). But both came to grief because they borrowed too much and paid too much. For them, the status quo would have been much the better option.

At the end of the day, perhaps there is no better strategic advice on this issue than that of Giuseppe di Lampedusa's weary aristocrat in “The Leopard”. Struggling to survive the turmoil of 19th-century Italy, he wryly observed that, “everything must change, so that everything stays the same.”

Winners take almost all

One thing is for sure: whether companies stick with what they know or head into a maelstrom of creative destruction, the prizes for the winners are well worth having. Recent research by McKinsey shows that Pareto's Principle—the observation first made by a 19th-century Italian economist, Vilfredo Pareto, that 80% of national income ends up in the hands of 20% of the population producing it—is alive and well today. The so-called 80/20 principle applies to the value added by industrial companies as much as it does to the income produced by nations.

McKinsey examined a sample of 1,000 listed companies in America from 15 different industries over the period from 1969 to 1999, when companies experienced more wrenching changes in their environment than ever seen before in peacetime. They measured the market value added (MVA) by companies, the change in their outstanding debt and their stockmarket capitalisation. And they found that 80% or more of all 15 industries' MVA over the period was accounted for by the top 20% of companies. Moreover, this 80/20 split remained remarkably steady over the whole 30-year period, with only one significant blip—to 76%—in 1989.
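The 80/20 calculation itself can be applied to any list of company figures in a few lines; the sample data below is invented for illustration, not McKinsey's:

```python
def top_quintile_share(values):
    """Fraction of the total accounted for by the largest 20% of entries."""
    ranked = sorted(values, reverse=True)
    cutoff = max(1, len(ranked) // 5)  # at least one entry
    return sum(ranked[:cutoff]) / sum(ranked)

# An invented industry of 25 firms with a heavily skewed value-added
# distribution: the top five (20%) account for 80% of the total
sample = [100, 90, 80, 70, 60] + [5] * 20
print(top_quintile_share(sample))  # 400 / 500 = 0.8
```

A result near 0.8 on real MVA data is exactly the Pareto pattern McKinsey reports.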

The rewards for getting it right can be huge. But the punishment for getting it wrong can be death. A short walk through the West End of London today provides a salutary reminder of corporate mortality. On one side of the Thames at Westminster sits the deserted former head office of the once great British Steel, which is now melting down after merging with a Dutch firm. Across the river, a government regulator occupies what was formerly the proud headquarters of ICI, 30 years ago the epitome of British manufacturing, now a struggling rump. Near Hyde Park Corner, the former offices of GEC, another erstwhile icon, are now luxury flats. And on Bond Street, just up the road, the headquarters building of Marconi stands abandoned, the junk mail piling up behind its locked doors. All that's missing is the tumbleweed and a whistling wind.



Japanese spirit, western things
Jul 10th 2003 | TOKYO
From The Economist print edition

When America's black ships forced open Japan, nobody could have predicted that the two nations would become the world's great economic powers

OPEN up. With that simple demand, Commodore Matthew Perry steamed into Japan's Edo (now Tokyo) Bay with his “black ships of evil mien” 150 years ago this week. Before the black ships arrived on July 8th 1853, the Tokugawa shoguns had run Japan for 250 years as a reclusive feudal state. Carrying a letter from America's president, Millard Fillmore, and punctuating his message with cannon fire, Commodore Perry ordered Japan's rulers to drop their barriers and open the country to trade. Over the next century and a half, Japan emerged as one of history's great economic success stories. It is now the largest creditor to the world that it previously shunned.

Attempts to dissect this economic “miracle” often focus intently on the aftermath of the second world war. Japan's occupation by the Americans, who set out to rebuild the country as a pacifist liberal democracy, helped to set the stage for four decades of jaw-dropping growth.

Yet the origins of the miracle—and of the continual tensions it has created inside Japan and out—stretch further back. When General Douglas MacArthur accepted Japan's surrender in 1945 aboard the battleship Missouri, the Americans made sure to hang Commodore Perry's flag from 1853 over the ship's rear turret. They had not only ended a brutal war and avenged the attack on Pearl Harbour—they had also, they thought, won an argument with Japan that was by then nearly a century old.

America's enduring frustration—in the decades after 1853, in 1945, and even today—has not been so much that Japan is closed, but that it long ago mastered the art of opening up on its own terms. Before and after those black ships steamed into Edo Bay, after all, plenty of other countries were opened to trade by western cannon. What set Japan apart—perhaps aided by America's lack of colonial ambition—was its ability to decide for itself how to make the process of opening suit its own aims.

One consequence of this is that Japan's trading partners, especially America, have never tired of complaining about its economic practices. Japan-bashing reached its most recent peak in the 1980s, when American politicians and businessmen blamed “unfair” competition for Japan's large trade surpluses. But similar complaints could be heard within a few decades of Commodore Perry's mission. The attitude was summed up by “Mr Dooley”, a character created by Finley Peter Dunne, an American satirist, at the close of the 19th century: “Th' trouble is whin the gallant Commodore kicked open th' door, we didn't go in. They come out.”

Nowadays, although poor countries still want Japan (along with America and the European Union) to free up trade in farm goods, most rich-country complaints about Japan are aimed at its approach to macroeconomics and finance, rather than its trade policies. Japan's insistence on protecting bad banks and worthless companies, say its many critics, and its reluctance to let foreign investors help fix the economy, have prevented Japanese demand from recovering for far too long. Once again, the refrain goes, Japan is unfairly taking what it can get from the world economy—exports and overseas profits have been its only source of comfort for years—without giving anything back.

While these complaints have always had some merit, they have all too often been made in a way that misses a crucial point: Japan's economic miracle, though at times paired with policies ranging from protectionist to xenophobic, has nevertheless proved a huge blessing to the rest of the world as well. The “structural impediments” that shut out imports in the 1980s did indeed keep Japanese consumers and foreign exporters from enjoying some of the fruits of that miracle; but its export prowess allowed western consumers to enjoy better and cheaper cars and electronics even as Japanese households grew richer.

Similarly, Japan's resistance to inward investment is indefensible, not least because it allows salvageable Japanese companies to wither; but its outward investment has helped to transform much of East Asia into a thriving economic region, putting a huge dent in global poverty. Indeed, one of the most impressive aspects of Japan's economic miracle is that, even while reaping only half the potential gains from free trade and investment, it has still managed to do the world so much good over the past half-century.

Setting an example

Arguably, however, Japan's other big effect on the world has been even more important. It has shown clearly that you do not have to embrace “western” culture in order to modernise your economy and prosper. From the very beginning, Japan set out to have one without the other, an approach encapsulated by the saying “Japanese spirit, western things”.

How did Japan pull it off? In part, because the historical combination of having once been wide open, and then rapidly slamming shut, taught Japan how to control the aperture through which new ideas and practices streamed in. After eagerly absorbing Chinese culture, philosophy, writing and technology for roughly a millennium, Japan followed this with 250 years of near-total isolation. Christianity was outlawed, and overseas travel was punishable by death.

Although some Japanese scholars were aware of developments in Europe—which went under the broad heading of “Dutch studies”—the shoguns strictly limited their ability to put any of that knowledge to use. They confined all economic and other exchanges with Europeans to a tiny man-made island in the south-western port of Nagasaki. When the Americans arrived in 1853, the Japanese told them to go to Nagasaki and obey the rules. Commodore Perry refused, and Japan concluded that the only way to “expel the barbarians” in future would be to embrace their technology and grow stronger.

But once the door was ajar, the Japanese appetite for “western things” grew unbounded. A modern guidebook entry on the port city of Yokohama, near Tokyo, notes that within two decades of the black ships' arrival it boasted the country's first bakery (1860), photo shop (1862), telephone (1869), beer brewery (1869), cinema (1870), daily newspaper (1870), and public lavatory (1871).

Yet, at the same time, Japan's rulers also managed to frustrate many of the westerners' wishes. The constant tension between Japan's desire to measure up to the West—economically, diplomatically, socially and, until 1945, militarily—and its resistance to cultural change has played out in countless ways, good and bad, to this day.

Much of it has reflected a healthy wish to hang on to local traditions. This is far more than just a matter of bowing and sleeping on futons and tatami, or of old women continuing to wear kimonos. The Japanese have also clung to distinct ways of speaking, interacting in the workplace, and showing each other respect, all of which have helped people to maintain harmony in many aspects of everyday life.

Unfortunately, however, ever since they first opened to the West, anti-liberal Japanese leaders have preferred another interpretation of “Japanese spirit, western things”. Instead of simply trying to preserve small cultural traditions, Japan's power-brokers tried to absorb western technology in a way that would shield them from political competition and protect their interests. Imitators still abound in Japan and elsewhere.

In East Asia alone, Malaysia's Mahathir Mohamad, Thailand's Thaksin Shinawatra, and even the Chinese Communist Party all see Japan as proof that there is a way to join the rich-country club without making national leaders or their friends accountable. These disciples of Japan's brand of modernisation often use talk of local culture to resist economic and political threats to their power. But they are careful to find ways to do this without undermining all trade and investment, since growth is the only thing propping them up.

Japan's first attempt to pursue this strategy, it must never be forgotten, grew increasingly horrific as its inconsistencies mounted. In 1868, while western writers were admiring those bakeries and cinemas, Japan's nationalist leaders were “restoring” the emperor's significance to that of an imaginary golden age. The trouble, as Ian Buruma describes in his new book, “Inventing Japan”, is that the “Japanese spirit” they valued was a concoction that mixed in several bad western ideas: German theories on racial purity, European excuses for colonialism, and the observation from Christianity that a single overarching deity (in Japan's case the newly restored emperor) could motivate soldiers better than a loose contingent of Shinto gods. This combination would eventually whip countless young Japanese into a murderous xenophobic frenzy and foster rapacious colonial aggression.

It also led Japan into a head-on collision with the United States, since colonialism directly contradicted America's reasons for sending Commodore Perry. In “The Clash”, a 1998 book on the history of American-Japanese relations, Walter LaFeber argues that America's main goal in opening Japan was not so much to trade bilaterally, as to enlist Japan's support in creating a global marketplace including, in particular, China.

At first, the United States opened Japan because it was on the way to China and had coal for American steamships. Later, as Japan gained industrial and military might, America sought to use it as a counterweight to European colonial powers that wanted to divide China among their empires. America grew steadily more furious, therefore, as Japan turned to colonialism and tried to carve up China on its own. The irony for America was that at its very moment of triumph, after nearly a century of struggling with European powers and then Japan to keep China united and open, it ended up losing it to communism.

A half-century later, however, and with a great deal of help from Japan, America has achieved almost exactly what it set out to do as a brash young power in the 1850s, when it had barely tamed its own continent and was less than a decade away from civil war. Mainland China is whole. It has joined the World Trade Organisation and is rapidly integrating itself into the global economy. It is part of a vast East Asian trade network that nevertheless carries out more than half of its trade outside the region. And this is all backed up by an array of American security guarantees in the Pacific. The resemblance to what America set out to do in 1853 is striking.

For both Japan and America, therefore, the difficult 150-year relationship has brought impressive results. They are now the world's two biggest economies, and have driven most of the world's technological advances over the past half-century. America has helped Japan by opening it up, destroying its militarists and rebuilding the country afterwards, and, for the last 50 years, providing security and market access while Japan became an advanced export dynamo. Japan has helped America by improving on many of its technologies, teaching it new manufacturing techniques, spurring on American firms with its competition, and venturing into East Asia to trade and invest.

And now?

What, then, will the continuing tension between Japanese spirit and western things bring in the decades ahead?

For America, though it will no doubt keep complaining, Japan's resistance to change is not the real worry. Instead, the same two Asian challenges that America has taken on ever since Commodore Perry sailed in will remain the most worrying risks: potential rivalries, and the desire by some leaders to form exclusive regional economic blocs. America still needs Japan, its chief Asian ally, to combat these dangers. Japan's failure to reform, however, could slowly sap its usefulness.

For Japan, the challenges are far more daunting. Many of them stem from the increasing toll that Japan's old ways are taking on the economy. Chief among these is Japan's hostility towards competition in many aspects of economic life. Although competitive private firms have driven much of its innovation and growth, especially in export-intensive industries, Japan's political system continues to hobble competition and private enterprise in many domestic sectors.

In farming, health care and education, for example, recent efforts to allow private companies a role have been swatted down by co-operatives, workers, politicians and civil servants. In other inefficient sectors, such as construction and distribution, would-be losers continue to be propped up by government policy. Now that Japan is no longer growing rapidly, it is harder for competitive forces to function without allowing some of those losers to fail.

Japan's foreign critics are correct, moreover, that its macroeconomic and financial policies are a disgrace. The central bank, the finance ministry, the bank regulators, the prime minister and the ruling-party politicians all blame each other for failing to deal with the problems. All the while, Japan continues to limp along, growing far below its potential as its liabilities mount. Its public-sector debt, for instance, is a terrifying 140% of GDP.

Lately, there has been much talk about employing more western things to help lift Japan out of its mess. The prime minister, Junichiro Koizumi, talks about deregulatory measures that have been tried in North America, Europe and elsewhere. Western auditing and corporate governance techniques—applied in a Japanese way, of course—are also lauded as potential fixes. Even inward foreign direct investment is held out by Mr Koizumi as part of the solution: he has pledged to double it over the next five years.

The trouble with all of these ideas, however, is that nobody in Japan is accountable for implementing them. Moreover, most of the politicians and bureaucrats who prevent competitive pressures from driving change are themselves protected from political competition. It is undeniable that real change in Japan would bring unwelcome pain for many workers and small-business owners.

Still, Japan's leaders continue to use these cultural excuses, as they have for 150 years, to mask their own efforts to cling to power and prestige. The ugly, undemocratic and illiberal aspects of Japanese traditionalism continue to lurk behind its admirable elements.

One reason they can do so is that Japan's nationalists have succeeded completely in one of their original goals: financial independence. The desire to avoid relying on foreign capital has underlain Japan's economic policies from the time it opened up to trade. Those policies have worked. More than 90% of government bonds are in the hands of domestic investors, and savings accounts run by the postal service play a huge role in propping up the system.

Paradoxically, financial self-reliance has thus become Japan's curse. There are worse curses to have, of course: compare Japan with the countless countries that have wrecked their economies by overexposing themselves to volatile international capital markets. Nevertheless, Japan's financial insularity further protects its politicians, who do not have to compete with other countries to get funding.

Theories abound as to how all of this might change. Japan's history ought to remind anyone that, however long it takes, the country usually moves rapidly once a consensus takes shape. Potential pressures for change could come from the reversal of its trade surpluses, an erosion of support from all those placid postal savers, or the unwinding of ties that allow bad banks and bad companies to protect each other from investors. The current political stalemate could also give way to a coherent plan, either because one political or bureaucratic faction defeats the others or because a strong leader emerges who can force them to co-operate.

The past 150 years suggest, however, that one important question is impossible to answer in advance: will it be liberalism or its enemies who turn such changes to their advantage? Too often, Japan's conservative and nationalist leaders have managed to spot the forces of change more quickly than their liberal domestic counterparts, and have used those changes to seize the advantage and preserve their power. Just as in the past, East Asia's fortunes still greatly depend on the outcome of the struggle between these perennial Japanese contenders.

 

Economic and financial indicators

Overview
Jul 5th 2003
From The Economist print edition

America's ISM index of manufacturing activity inched up in May, but by less than expected. It remains below the 50 level, which implies that output is still shrinking. Yet consumers keep on spending, with retail sales up by 6.1% in the 12 months to May. Inflation continues to wane: the 12-month rate of increase in the core personal consumer-expenditure deflator (excluding food and energy) fell to 1.2% in May.

There was cheerier news from Japan. The Bank of Japan's Tankan survey of business confidence in June produced stronger results than expected. Industrial production jumped by 2.5% in May, to 1.6% above its level of a year ago. In the same month, unemployment was unchanged at 5.4% of the labour force, but employment rose for the first time in over two years. The Nikkei 225 surged by 7.4% over the week, to its highest level for nine months.

Euro-area inflation rose to 2.0% in June, once again hitting the upper limit of the European Central Bank's inflation target of “less than but close to 2%”. The Reuters purchasing managers index for euro-area manufacturing fell again in June, to its lowest level since January 2002. German retail sales rose by a paltry 0.8% in the 12 months to May.

Britain's economy stalled in the first quarter of 2003, as GDP growth was revised down to only 0.1%. But output was still a respectable 2.1% up on a year earlier.

Economic forecasts


Every month The Economist calculates the average of a group of forecasters' predictions for economic growth, inflation and current-account balances for 15 countries and for the euro area. The table also shows the highest and lowest projections for growth. The previous month's figures, where they are different, are shown in brackets. For the second month running, the panel has yanked down its forecast for euro-area growth. It now expects GDP in the euro area to grow by only 0.6% in 2003 and by 1.7% in 2004, down from 0.8% and 2.0% in June. By contrast, the panel has become more optimistic about the outlook for the United States, which is predicted to grow by 2.3% in 2003 and by 3.4% in 2004. After outpacing the euro area this year, Japan will revert to bringing up the rear in 2004, with forecast growth of only 0.8%.

 

 

On the road again?
Jul 4th 2003
From The Economist Global Agenda

As worries about SARS and war recede, the beleaguered tourism industry is showing faint signs of life. But Americans, the world’s biggest holiday spenders, are still staying close to home

AS AMERICANS take to the roads and skies this weekend to celebrate Independence Day on July 4th, travel and tourism officials around the world will be watching anxiously. Some 37.4m Americans—the most for nearly a decade—are expected to venture at least 50 miles from home for the holiday weekend, according to the American Automobile Association. Most will travel by car and stay in the country—Independence Day, after all, lends itself better to backyard barbecues and fireworks than splashy trips to Europe.

Still, if Americans are travelling anywhere again, it would be good news for a beleaguered global industry. Since the September 11th terrorist attacks, Americans—who account for over a tenth of global tourism spending—have reined in their holidaymaking. The World Tourism Organisation says that international tourist arrivals rose 3.1% in 2002 (after dropping slightly in 2001); this figure, while impressive for a year that many had written off, nonetheless masks enormous cutbacks and deep price discounting in the industry (many visitors were drawn by bargains), and a disparate impact around the world. In America, where the industry generates about $540 billion annually, the number of international visitors dropped from 50m three years ago to about 43m today, according to the Travel Industry Association of America. While Asia surged last year and Europe, deserted by Americans but helped by low-cost airlines, posted slim growth, visitors to South America fell by 7%.

The real pain for many countries has come this year. Last year’s worries about terrorist attacks and tight budgets pale in comparison to newer troubles. The most misery was caused by the SARS virus, which sent travel around the world (but especially in Asia) into a tailspin: the number of visitors to Hong Kong, for instance, plunged 68% in May over a year before, with nearby countries also posting devastating declines. The war in Iraq also kept plenty of people home in April and May.

The drop-off in Asia has been the most pronounced, but Europe, too, has been hurt. Post-war diplomatic grudges and the fall of the dollar against the euro have given Americans—traditionally the world’s biggest spenders while on holiday—extra reasons to avoid the Old World. Britain hosted 18% fewer Americans in April than a year earlier, and across western Europe countries are bracing for a 10% fall in American visitors this year. (France has taken the remarkable step of using Woody Allen—blathering about how he’d rather French kiss than freedom kiss his wife—to woo Americans back.)

America itself has been hurting too, with tourist-industry revenue down by 3.2% (at an annualised rate) in the first quarter. And looking ahead, the industry also worries that new visa restrictions will discourage international tourists. The only countries to have done well this spring, it seems, are a few like India and Cuba that are lucky enough to be perceived as safe from SARS, war and terrorist attacks.

With SARS contained and the Iraq war over (more or less), will people again return to the skies? It is too early to say, but industry officials are starting to voice cautious optimism. Last week, the World Tourism Organisation said that “international tourism might be close to a turning point”, with the Caribbean, South America and parts of Asia poised for a rebound. Many East Asian countries, overjoyed at being declared SARS-free, have launched expensive campaigns to entice visitors back (Hong Kong’s airline, Cathay Pacific, plans to give away more than 10,000 tickets). Even in America, industry officials expect a 2.5% rise in trip-taking by Americans this summer. Already, hotel occupancy rates there, clobbered after September 11th, are creeping up. Industry watchers foresee 2% worldwide growth in tourism this year, with another 4% in 2004.

But even if more people do travel, they will probably not spend as much. Having lived through the lean years, many holidaymakers are now used to cheaper vacations that are closer to home. Businesses too have learned that their once-generous travel budgets contained plenty of slack. This is bad news for companies that rely on free-spending tourists, such as Gucci, a fashion group that this week declared a 97% drop in profits for the three months to April compared with a year earlier (see article).

There are plenty of other reasons for caution. For one thing, even if tourism does bounce back, it will not be enough to save those industries, such as the airline business, that are badly in need of restructuring. There will be plenty more pain as big carriers pare down routes and merge in the years ahead. But the more important caveat is uncertainty. If September 11th and SARS have taught the industry anything, it is that trouble on a global scale can spring up without warning. Even while hoping for good times, tourism operators would do well to brace for more jolts.

 

Fund managers

The blame game

Jul 3rd 2003
From The Economist print edition

A little-noticed guilty party in the stockmarket's boom and bust

AFTER the bubble, the blame. Over the past two years, investors, regulators and prosecutors have competed to hurl mud and lawsuits at anybody they could plausibly hold responsible for causing such big losses to so many. Bosses have been blamed for cooking the books and inflating their pay; auditors faulted for being too cosy with their charges and even helping in the cooking; investment bankers singled out for dodgy research and touting dubious stocks. But one set of actors in the tragedy has so far escaped largely unscathed: the fund managers who look after investors' money.

As our survey in this issue argues, however, fund managers were far from blameless. Indeed, they were right at the heart of the bubble. Throughout the 1990s, they were urging investors to pile into equities, often of the riskiest variety. They reaped enormous profits from their clients' gullibility, since their pay depended largely on the market value of the assets they had under management. Their fees and charges were typically buried in the small print. And they were just as guilty as Wall Street banks of misleading investors: tales of heroic past performance and of safe double-digit returns reinforced the foolish notion that markets would rise forever.

The truth is that, for the most part, fund managers have offered extremely poor value for money. Their records of outperformance are almost always followed by stretches of underperformance. Over long periods of time, hardly any fund managers have beaten the market averages. Rather than spreading risks wisely or seeking the best match for investors' future liabilities, they encourage clients to put their money into the most modish assets going, often just when those become overvalued—whether tech stocks and dotcoms in the late 1990s or, now, corporate bonds, hedge funds and private-equity funds (see article). And all the while they charge their clients big fees for the privilege of losing their money.

Even so, lawsuits against fund managers are unlikely to fly. Two New York judges have just dismissed suits against various investment banks over equity research (see article): proving wrongdoing by a fund manager will be harder still. The regulation of fund managers should be tightened up: notably, by standardising tables of past performance and insisting they be accompanied by health warnings in large print, and by requiring fund managers to disclose all fees and charges, including transaction costs, upfront. But broadly, investors should accept more blame themselves for their losses: they were starry-eyed about likely returns and ignored signs of overvaluation.

There are, however, two more specific lessons that investors should learn from their experience with fund managers. One is the merits of indexed investing. Whether you are a retail investor in a mutual fund, or a pension-fund trustee, you will almost never find a fund manager who can repeatedly beat the market. It is better to invest in an indexed fund that promises a market return but with significantly lower fees.
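The case for lower fees is ultimately one of compounding. A back-of-the-envelope sketch, with illustrative numbers rather than any real fund's charges, shows how a seemingly small difference in annual fees widens into a large gap over a working lifetime:

```python
# Illustrative compounding sketch: identical 7% gross market returns,
# but an active fund charging 1.5% a year versus an index fund at 0.2%.
# All numbers are hypothetical.

def final_value(start, gross_return, annual_fee, years):
    """Value of a lump sum after compounding returns net of fees."""
    value = start
    for _ in range(years):
        value *= (1 + gross_return) * (1 - annual_fee)
    return value

active = final_value(10_000, 0.07, 0.015, 30)
indexed = final_value(10_000, 0.07, 0.002, 30)
print(f"Active fund after 30 years:  {active:,.0f}")
print(f"Indexed fund after 30 years: {indexed:,.0f}")
```

On these assumptions the indexed investor ends up with roughly half as much again as the active one, despite earning exactly the same market return before fees.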

The second lesson is the need to balance risks and returns more carefully. A pension fund, with predictable long-term liabilities, should not normally be invested only in high-risk equities, for example. A retired couple in search of a steady income should not put their nest egg into a fund that invests only in volatile tech stocks or high-risk emerging markets. In the long run, high returns are available only in exchange for taking high risks. Investors should remember that truth, and allocate their savings accordingly.

 

 

Buttonwood: American banks

The extraordinary lightness of banking
Jul 1st 2003
From The Economist Global Agenda

Times are tough, right? Try telling America’s big banks, which are raking in record profits. Luck has played a big part

SAY what you like about bankers, they have proved astonishingly adept over the years at finding novel ways to lose money. The root cause has generally been the same, however: a perception that the world has changed for the better, to one where borrowers are far more likely to pay back their money. The result has usually been the same too: overly enthusiastic (for which read very cheap) lending. Thus, over the years, have banks repeatedly lost their shirts in property, Latin America, new technologies, you name it.

Imagine, then, Buttonwood’s perplexity. America, you might remember, was caught up in the late 1990s in a mania so spectacular that apparently sensible people argued that the very rules of economics (certainly of valuing stocks) had been reinvented. This, after all, was the era of the “new economy”. After a party as wild as that, one might have expected a decidedly queasy banking sector. Yet far from being confined to bed for the foreseeable future, as Japan’s banks have long been, America’s banks have jumped out looking astonishingly hale, hearty and smug.

This unexpected break with precedent is perhaps why banks’ shares have outperformed the equity market by a wide margin over the past three years (though banks’ inherent leverage has helped). Profits have held up; a few banks have even been making record amounts. In the first quarter Citigroup, J.P. Morgan Chase and Bank of America, the three largest banks, made a total of $12 billion of pre-tax profit, up a fifth from the previous quarter. Last year, only ten banks went bust, and they were all small. In the recession of the early 1990s, several hundred went to meet the great cashier in the sky. The Federal Deposit Insurance Corporation, one of America’s main banking regulators, and a body with a vested interest in questioning whether every puff of smoke is the beginnings of a forest fire, is remarkably sanguine about the industry’s prospects.

There are three possible explanations for this extraordinary performance: the world really has changed; banks have changed; or banks have got lucky. The good folks at the Bank for International Settlements (BIS), the central bankers’ bank, studied all three in the BIS’s latest annual report, released on Monday June 30th. But Buttonwood has a sneaking suspicion that the last of these, luck, has played the biggest part; indeed, for American banks not to have made money in the past couple of years would have required a more than usual degree of incompetence. From here, it will become much more of a struggle, though not, on present trends, a life-threatening one.

The BIS economists cite two cyclical factors that have helped banks this time round. First, the slowdown was the result of a “spontaneous unwinding of an investment-driven boom…rather than…the effects of monetary-policy tightening.” Monetary policy was eased in response, which supported the second factor: higher property prices. The result is that while there have been big write-offs in loans to big companies (Enron, WorldCom etc), there has not been much deterioration in the quality of loan portfolios overall.

But the BIS boffins also think structural changes have been at work. Banks do not rely on lending as much as they used to; they are better at managing credit risk; and new markets have been developed (such as for credit derivatives) which have allowed banks to manage such risks better.

The banks’ bosses heartily endorse these structural changes, especially the bit about better risk management. But this is a little difficult to swallow, since these are the same banks that lent so extravagantly to the likes of Enron. The 15-20% returns on equity earned by banks in the late 1990s required the taking of more risk, credit risk in particular, which accounted for the lion’s share of their profits and still does. As bluer-chip companies went to the capital markets, so banks were left lending to the rest.
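The leverage mentioned above cuts both ways, which is why high returns on equity are inseparable from risk-taking. A stylised sketch, using invented figures for no actual bank, makes the arithmetic explicit:

```python
# Stylised sketch of how leverage amplifies banks' returns (and losses).
# All figures are hypothetical.

def roe(return_on_assets, assets, equity):
    """Return on equity = return on assets x leverage (assets / equity)."""
    return return_on_assets * (assets / equity)

# A 1.5% return on assets, levered 12-to-1, yields an 18% return on equity...
print(f"Good year: ROE = {roe(0.015, 120, 10):.0%}")
# ...but a 1% loss on the same assets wipes out 12% of equity.
print(f"Bad year:  ROE = {roe(-0.01, 120, 10):.0%}")
```

The same balance-sheet structure that turns thin asset margins into 15-20% returns on equity is what turns modest credit losses into solvency scares.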

Which is where luck has played its part. The fallout from the investment boom referred to by the BIS has affected mainly big companies. It has been loans to these that banks have had to write off. Loans to small and medium-sized companies and consumers have proved remarkably trouble-free, mainly because their bets were a little saner, though they might have been more honest too. Lower interest rates and credit spreads have made borrowing easier to service. They have also helped boost the value of banks’ bond portfolios, increase fees from bond underwriting (as companies rush to lock in lower long-term rates), and spur a splurge in mortgage refinancing activity. Profits from all three are well above the average.

If these lines of business continue generating record profits, then something will be seriously amiss with the economy, and banks will eventually feel the effects. If the economy recovers and interest rates climb, the increased burden of high consumer and corporate debt cannot help but hurt banks, too: for all the changes to American banks in recent years, consumer and small-business lending still accounts for half of all banks’ business, even at the biggest.