Monday, September 24

Junk DNA — Not So Useless After All

DNA model

I'm such a science nerd! I love this stuff!

Junk. Barren. Non-functioning. Dark matter. That’s how scientists had described the 98% of the human genome that lies between our 21,000 genes, ever since our DNA was first sequenced about a decade ago. The disappointment in those descriptors was intentional and palpable.

It had been believed that the human genome — the underpinnings of the blueprint for the talking, empire-building, socially evolved species that we are — would be stuffed with sophisticated genes, coding for critical proteins of unparalleled complexity. But when all was said and done, and the Human Genome Project finally determined the entire sequence of our DNA in 2001, researchers found that our mere 21,000 genes accounted for a paltry 2% of the genome’s 3 billion base pairs. The rest, geneticists acknowledged with unconcealed embarrassment, was an apparent biological wasteland.
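(Nerd aside: the 2% figure is easy to sanity-check. Here's a quick back-of-envelope sketch in Python; the genome size and gene count come from the article, while the average coding length per gene of roughly 2,900 base pairs is my own illustrative assumption, not a number from the piece.)

```python
# Back-of-envelope sketch: what fraction of the genome is protein-coding?
GENOME_BP = 3_000_000_000   # ~3 billion base pairs (from the article)
NUM_GENES = 21_000          # protein-coding genes (from the article)
AVG_CODING_BP = 2_900       # assumed average coding bases per gene (illustrative)

coding_fraction = NUM_GENES * AVG_CODING_BP / GENOME_BP
print(f"Coding: {coding_fraction:.1%}")          # -> ~2.0%
print(f"Non-coding: {1 - coding_fraction:.1%}")  # -> ~98.0%, the so-called junk
```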

But it turns out they were wrong. In an impressive series of more than 30 papers published in several journals, including Nature, Genome Research, Genome Biology, Science and Cell, scientists now report that these vast stretches of seeming “junk” DNA are actually the seat of crucial gene-controlling activity — and that variations there contribute to hundreds of common diseases. The new data come from the Encyclopedia of DNA Elements project, or ENCODE, a $123 million endeavor begun by the National Human Genome Research Institute (NHGRI) in 2003, which includes 442 scientists in 32 labs around the world.

ENCODE has revealed that some 80% of the human genome is biochemically active. “What is remarkable is how much of [the genome] is doing at least something. It has changed my perception of the genome,” says Ewan Birney, ENCODE’s lead analysis coordinator from the European Bioinformatics Institute.

Rather than being inert, the portions of DNA that do not code for genes contain about 4 million so-called gene switches — regulatory sequences where proteins called transcription factors bind to control when our genes turn on and off and how much protein they make, not only affecting all the cells and organs in our body, but doing so at different points in our lifetime. Somewhere amidst that 80% of DNA, for example, lie the instructions that coax an uncommitted cell in a growing embryo to form a brain neuron, or direct a cell in the pancreas to churn out insulin after a meal, or guide a skin cell to bud off and replace a predecessor that has sloughed off.

“What we learned from ENCODE is how complicated the human genome is, and the incredible choreography that is going on with the immense number of switches that are choreographing how genes are used,” Eric Green, director of NHGRI, told reporters during a teleconference discussing the findings. “We are starting to answer fundamental questions like what are the working parts of the human genome, the parts list of the human genome and what those parts do.”

If the Human Genome Project established the letters of the human genome, ENCODE is providing the narrative of the genetic novel by fashioning strings of DNA into meaningful molecular words that together tell the story not just of how we become who we are, but how we get sick as well.

Ever since the human genome was mapped, scientists have been mining it for clues to the genetic triggers and ultimately the treatments for a variety of diseases — heart disease, diabetes, schizophrenia, autism, to name just a few. But hundreds of so-called genome-wide association studies (GWAS) that have compared the DNA of healthy individuals with that of people with specific diseases revealed that the relevant changes in DNA were occurring not in the genes themselves, but in the non-coding genetic black holes. Until now, researchers didn’t fully understand what these non-coding regions did; if variations in these areas were not part of a known gene, they couldn’t tell what impact, if any, the genetic change had.

ENCODE, which provides a map of those genetic switches, will now allow scientists to determine what exactly those variants do; it’s likely that their function in regulating and controlling key genes can now be traced and studied — and hopefully manipulated to treat whatever disease they contribute to. “We need to revisit the interpretation of those studies,” Dr. John Stamatoyannopoulos, associate professor of medicine and genome sciences at the University of Washington, said during the teleconference. “In many cases those studies concluded that 10 or 15 variants might be important for a particular disease. ENCODE data points to the fact that this is probably a significant underestimate, that there may be dozens, even hundreds of variants landing in switches so there is a tremendous amount of information still hidden within those studies that needs to be reanalyzed in the context of the new data.”

Eager to put their newfound scientific knowledge to work, scientists have already begun some of those studies. At the University of Washington, Stamatoyannopoulos and his colleagues found that gene changes identified by GWAS as involved in 17 different types of cancer seem to affect nearly two dozen transcription factors that help transcribe raw DNA into the RNA that is then translated into functional proteins. This common molecular thread may lead to new treatments that control the function of these transcription factors in not just one but all 17 cancers, including ovarian, colon and breast cancers.

“This indicates that many cancers may have a shared underlying genetic predisposition,” he told reporters. “So we can make connections between diseases and genome control circuitry to understand relationships where previously there was no evidence of any connection between the diseases.”

ENCODE may shed significant light on our most common chronic diseases, including diabetes, heart disease and hypertension, which result from a complex recipe of dysfunction, not just in single genes, but in a variety of hormones, enzymes and other metabolic factors. Changes in the way some genes are turned on or off may explain the bulk of these conditions, and ultimately make them more treatable. “By and large, we believe rare diseases may be caused by mutations in the protein [or gene-]coding region,” says Green, while the “more common, complicated diseases may be traced to genetic changes in the switches.”

In another example of ENCODE’s power, Birney says the genetic encyclopedia has also identified a new family of regulators that affect Crohn’s disease, an autoimmune disorder that causes the body’s immune cells to turn on intestinal cells. The finding could lead to novel, potentially more effective therapies. “I’ve had more clinical researchers come to my door in the past two years than in the previous 10,” Birney said. “It’s going to be really good fun producing lots of insights into disease over the next couple of years.”

Not only does ENCODE open doors to new therapies, it also furthers our basic understanding of human development. At the heart of many genetic researchers’ investigations is the desire to understand how each cell in our body, from those that make up our hair to those that reside in our toenails, can contain our entire genome yet still manage to look and function in such widely divergent ways. ENCODE’s scientists knew that certain regulatory mechanisms dictated when and where certain genes were expressed and in what amount in order to give rise to the diversity of cells and tissues that make up the human body, but even they were surprised by just how intricate the choreography turned out to be. “Most people are surprised that there is more DNA encoding regulatory control elements, or switch elements for genes, than for the genes themselves,” Michael Snyder, director of the Center for Genomics and Personalized Medicine at Stanford University and a member of the ENCODE team, told Healthland.

In keeping with the open-access model established by the Human Genome Project, ENCODE’s data is available in its entirety to researchers for free on the consortium’s website. The database will undoubtedly fuel a renewed interest in genome-based approaches to both diagnosing and treating disease. Despite initial excitement in the field, in the years since the genome was mapped, gene-guided treatments and gene-therapy approaches to treating disease have proven difficult to bring to the clinic; part of the challenge, geneticists now say, may have been related to the fact that they didn’t fully understand how to control the genes that were affected by disease.

“I am pretty sure this is the science for this century,” Birney said. “We are going to work out how we make humans, starting from the simple instruction manual.” And perhaps we’ll figure out how to make humans healthier as well.

Monday, September 17

China’s Millennials: Get Rich or Save the Planet?

Hazy skies above a sprawling Shanghai due to pollution.

As many countries around the world continue to grow and develop, they carry a heavy burden in the pursuit of wealth.

There is no serious doubt that the world is getting warmer and warmer, and there is no doubt either that many once-poor nations — especially China, India and Brazil — are getting richer and richer. Wealth is a very good thing, and every nation has a right to pursue it, but in the 21st century, that pursuit comes with a special moral burden that other industrial nations never faced.

Western Europe and the United States achieved their economic dominance on the back of a coal- and oil-powered industrial base, and when that infrastructure was just being built, policymakers had the luxury of being ignorant of the environmental consequences. The air in nineteenth-century London and twentieth-century Pittsburgh may have been filthy, but while it made people cough a bit, it seemed to cause little other harm — especially measured against all of the good industry could do.

Now we know better. Human health, of course, can be gravely affected by such uncontrolled emissions. But the health of the planet is suffering too. With 2012 on track to be the hottest year on record, sea levels rising, the poles melting, an iceless passage suddenly opening in the Arctic, and the Earth wracked by more frequent floods, droughts and storms, we are clearly creating a far sicker world than the one we inherited.

My own peer group — the college students of China — faces a special burden. As the generational vanguard of the most populous and fastest-growing nation on Earth, we are pulled by two very different imperatives: the desire to keep our industrial base growing and our consumer sector flourishing, and the equally compelling need to protect the planet in the process.

There’s no denying that my country’s growth has come at an environmental cost. China’s consumption of fossil fuels rose from 7.2 billion metric tons in 2009 to 8.3 billion in 2010 — a 15% increase in one year. We are the world’s largest energy consumer and second only to the U.S. in consumption of oil. The number of passenger cars per thousand people in China rose 55% — from 22 to 34 — between 2007 and 2011. While that places us far behind other industrialized countries in overall automobile ownership, the trend is unmistakable.
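(Those percentages check out, for what it's worth. Here is a trivial Python sketch recomputing them from the raw figures quoted above:)

```python
# Recompute the growth percentages from the raw figures above.
def pct_increase(old: float, new: float) -> float:
    """Percent increase from old to new."""
    return (new - old) / old * 100

fuel = pct_increase(7.2, 8.3)   # billion metric tons, 2009 -> 2010
cars = pct_increase(22, 34)     # passenger cars per 1,000 people, 2007 -> 2011
print(f"Fossil fuel consumption: +{fuel:.0f}%")   # -> +15%
print(f"Cars per thousand people: +{cars:.0f}%")  # -> +55%
```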

But this hardly makes us environmentally heedless — and we couldn’t ignore the problem even if we wanted to. In Shanghai, where I live, traffic jams often make highways impassable, and new mass transit systems have been built in response. The ease and cleanliness of subways and light rail argue for themselves. While we continue to produce and explore for more domestic sources of energy, we still must import a fair share of what we use, and the volatility of global oil prices — which reached $112 (U.S.) per barrel in 2011 — is not the kind of variable any growing economy wants to have to factor into its planning. Natural gas is currently responsible for only 3% of the energy generated in China. That’s not much, but the very fact that the number is so low makes it a significant area for growth. The government has already stepped up efforts to build more gas-fired power plants and improve transmission lines. Four of the world’s top ten wind turbine manufacturers are Chinese, and the Three Gorges Dam hydroelectric facility, which has been in operation since 2003, will finally crank up to full power this fall, further diminishing the country’s carbon footprint.

Chinese college students are rightly pleased with — and relieved by — all of these developments and will surely keep the country moving in that direction. There’s patriotism in that — as there would be in any nation that takes pride in its progress. But there’s a healthy sense of self-interest too. No one wants to live in a sickly world — least of all the people who have many decades of living left to do. Unlike all of the other generations that came before us since the dawn of the industrial age, we have the unique opportunity to leave the world cleaner than we found it. It’s not an opportunity we plan to squander.

Wednesday, September 12

The Agents of Outrage

The deadly attacks on U.S. diplomatic outposts in Egypt and Libya raise a question for me: did the Arab Spring make the Middle East more dangerous? What do you think?

The violence looked spontaneous; it was anything but. Instead it was the product of a sequence of provocations, some mysterious, some obvious. It seemed to start in the U.S., then became magnified in Egypt and was brought to a deadly and sorrowful climax in Libya—all on the 11th anniversary of 9/11. The cast of characters in this tragedy included a shadowy filmmaker, a sinister pastor in Florida, an Egyptian-American Islamophobe, an Egyptian TV host, politically powerful Islamist extremist groups and, just possibly, an al-Qaeda affiliate in Libya. The instigators and executors didn’t work in concert; they probably didn’t even know they were in cahoots. Indeed, some of them would sooner die than knowingly help the others’ causes. Nonetheless, the death of Ambassador Chris Stevens and three other Americans at the U.S. consulate in Benghazi was the result of a collective effort, with grievous consequences.

As the Obama Administration struggles to contain the fallout of the killings—and even to piece together exactly what happened—there’s an increasing apprehension that this attack may herald a new genre of Middle East crisis. The Arab Spring replaced the harsh order of hated dictators with a flowering of neophyte democracies. But these governments—with weak mandates, ever shifting loyalties and poor security forces—have made the region a more chaotic and unstable place, a place more susceptible than ever to rogue provocateurs fomenting violent upheavals, usually in the name of faith.

Collectively, these hatemongers form a global industry of outrage, working feverishly to give and take offense, frequently over religion, and to ignite the combustible mix of ignorance and suspicion that exists almost as much in the U.S. as in the Arab world. Add to this combination the presence of opportunistic jihadist groups seeking to capitalize on any mayhem, and you can begin to connect the dots between a tawdry little film and the deaths of four American diplomats.

Start with the filmmaker behind Innocence of Muslims, a purported biopic of the Prophet Muhammad that, according to some accounts, sparked the demonstrations in Cairo and Benghazi. He goes by the name Sam Bacile, but almost nothing is known about him. Or even whether he exists. Some reports suggest the name is a pseudonym.

There have been other films about the Prophet, but since Islamic traditions forbid any depiction of Muhammad, Muslim filmmakers tend to focus instead on his contemporaneous followers and foes. In the 1977 film The Message, for instance, Muhammad remains always off camera and is never heard, but other historical figures (including his uncle Hamza, played by Anthony Quinn) address him.

The film made by Bacile makes no such concessions to Muslim sensibilities. Indeed, showing Muhammad is the film’s only innovation. The accusations it makes about him are rehashed from old Islamophobic tropes; the script is clunky and the acting high-school-ish. The movie was apparently made last year, and although the filmmaker claimed to have spent $5 million on it, the production values suggest a much more modest budget. Before going into hiding in the wake of the violence in Cairo and Benghazi, Bacile (or someone pretending to be him) defiantly told the Associated Press that he regards Islam as “a cancer, period.”

The film was screened in Hollywood early this year but made no waves whatsoever. Bacile then posted a 14-min. series of clips on YouTube in July; that too got no traction. But it caught the attention of Morris Sadek, an Egyptian-American Copt in Washington, D.C., known for incendiary anti-Muslim statements and blog posts. In early September, Sadek stitched together clips of the film and posted them on an Arabic-language blog. He also sent a link to the post in a mass e-mail. In the meantime, the film had attracted a singularly unattractive fan: Terry Jones, pastor of a church in Gainesville, Fla., who is notorious for burning the Koran and performing other Islamophobic stunts. He promoted the film online and added fuel to the flames by posting his own YouTube video, calling for the “trial” of the Prophet, for fraud and other supposed crimes. Jones’ video features an effigy wearing a demon mask and hanging from a noose.

Soon after that, the thread was picked up in Egypt by a TV host every bit as inflammatory and opportunistic as Jones: Sheik Khaled Abdallah of the Islamist satellite-TV station al-Nas. Supported by unknown backers, the channel traffics in demagoguery and hatemongering. Abdallah is its star. In previous broadcasts, he has called the revolutionaries of the Arab Spring “worthless kids” and condemned newspapers that don’t support his views. But he reserves his harshest criticism for the country’s Coptic Christians, who make up about a tenth of the population.

For Abdallah, the fact that a Copt was promoting an anti-Muhammad film endorsed by the Koran-burning pastor was too much. On his Sept. 8 show, he broadcast some of the clips, now dubbed in Arabic. In one scene that was aired, “Muhammad” declares a donkey the “first Muslim animal” and asks the creature if it likes the ladies. Abdallah’s show, complete with the offensive video, was also posted on YouTube, and it has attracted over 300,000 views.

Abdallah’s show was a dog whistle to the Salafists, a fundamentalist Islamic movement that makes up the second largest faction in the Egyptian parliament. For months, organized Salafist groups had been protesting in small numbers in front of the U.S. embassy in Cairo, calling for the release of Omar Abdel Rahman, the blind sheik currently in a North Carolina prison, convicted of plotting a series of bombings and assassinations in the 1990s. They were joined on Sept. 11 by prominent leaders like Nader Bakar of the Salafist Nour Party and Mohammed al-Zawahiri, brother of Ayman al-Zawahiri, Osama bin Laden’s longtime deputy and now head of al-Qaeda.

The leaders had left by the time the mob attacked the embassy and took down the U.S. flag, while Egyptian security forces, hopelessly outnumbered, mostly just watched. The crowd eventually dispersed. Afterward, some Salafist leaders said the flag was snatched by members of a soccer-hooligan group known as the Ahli Ultras.

Not far from Egypt’s western border, in the Libyan city of Benghazi, on the anniversary of the 2001 attacks at the World Trade Center, the Muhammad movie had provoked another mob of several hundred mostly Salafist protesters to gather at the U.S. consulate. Many witnesses have since fingered a group known as Ansar al-Shari‘a for organizing the protests; the group denies it.

Ambassador Stevens, visiting from Tripoli, was an unlikely target. He had worked closely with the leaders of the uprising against Muammar Gaddafi and was well liked by most Libyans. But some reports now suggest that lurking amid the mob was a more malevolent force: members of the local chapter of al-Qaeda.

Only the previous day, Ayman al-Zawahiri had issued a new videotaped statement from his hideout, confirming the death of his Libyan deputy Abu Yahya al-Libi in a June U.S. drone strike and calling for him to be avenged. Reports from Benghazi say armed jihadists infiltrated the protesting crowds. An al-Qaeda-affiliated group known as the Imprisoned Omar Abdul Rahman Brigades is suspected to have carried out the attack. The White House was still scrambling a day after the attack to piece together what happened and whether it could have been prevented. A senior Administration official said the Benghazi attack was “complex” and “well organized” but would not comment on reports that it was planned in advance by militants using the protest as a diversion.

The terrorists struck twice: one set of grenades forced consulate staff to flee the main building while a second targeted the building to which they were evacuated. The attack did not appear spontaneous or amateurish. Stevens, foreign service officer Sean Smith and two others were killed. The ambassador was declared dead from smoke inhalation.

If Muslims responded violently to every online insult to their faith, there would be riots in Cairo and Benghazi every day of the year. The Internet is full of malefactors who constantly say, write or broadcast appalling things about Islam. (And there are plenty of Muslim Web nuts who vilify other belief systems.) It is the outrage machine, manned by people like Bacile, Jones and Abdallah, that pushes matters into anger overdrive. They know the outcome of their efforts will be violence and subversion. These men are enabled by media—mainstream and fringe alike—that give them air to bloviate and a political culture that makes little effort to take away their oxygen.

Before the Arab Spring, this chain of events would likely have been stopped early. Dictators like Egypt’s Hosni Mubarak and Libya’s Gaddafi either blocked Internet access to prevent their people from seeing inflammatory material (among other things) or used their security agencies to crack down on protests long before they could reach critical mass.

But democratically elected governments don’t have recourse to such draconian methods. Still unused to power, they are unsure how to deal with angry demonstrations, especially when they are mounted by powerful religious or political groups. The tendency has been to look the other way and hope the demonstrators run out of steam.

It doesn’t always work. The Salafists in Libya were emboldened by the failure of the government in Tripoli to crack down on them when they recently desecrated Sufi shrines. The Minister of the Interior (he has since resigned) said he didn’t want to risk the lives of his security forces in order to apprehend the culprits. “The Libyan authorities have been irresponsibly lazy in confronting this threat,” says Tom Malinowski, Washington director of Human Rights Watch. “They have a choice to make. Are they going to be a country connected to the outside world, or are they going to allow a small number of people in their midst to make that impossible?”

At least Libya’s President Mohamed el-Magariaf swiftly apologized to all Americans for the attack on the consulate and promised to hunt down those responsible. Twenty-four hours after the attack on the embassy in Cairo, Egypt’s President Mohamed Morsy had not issued a similar statement. When he finally did, he seemed less concerned with what had happened at the embassy than with the affront to the Prophet, which he condemned “in the strongest terms.” The Muslim Brotherhood, on its Twitter feed, condemned the Benghazi attack but made no mention of the one in Cairo.

The Egyptian government’s almost insouciant response, hardly in keeping with the country’s status as the second largest recipient of U.S. aid, will rankle both President Obama and his domestic critics. In the hours after the attacks in Cairo and Benghazi, Republicans piled on the President, questioning the wisdom of his outreach to Islamist political forces like the Brotherhood. Even political allies were moved to wonder whether Egypt could really be a reliable friend.

Morsy’s silence has been interpreted by Egyptian analysts as a reluctance to prod the Salafists, whose help he may need to get anything done in parliament. But other political figures were equally pusillanimous. Nobel Peace Prize laureate Mohamed ElBaradei, a prominent liberal secular leader, tweeted, “Humanity can only live in harmony when sacred beliefs and the prophets are respected.” That kind of timidity empowers not only the Salafists but also instigators like Abdallah and his American counterparts.

For an understanding of what can happen when the industry of outrage is allowed to function without check, look at Pakistan, where hatemongers continually stoke anger not only against faraway foreigners but just as frequently—and with more deadly results—against their own people. Minorities like the Ahmadiyya sect are an easy target for extremist TV hosts like Aamir Liaquat Hussain, a former Minister of Religious Affairs. On his show broadcast by Geo TV in 2008, guest scholars declared the Ahmadiyyas “deserving to be murdered for blasphemy.” Soon after, two members of the sect were killed. Hussain was forced to apologize and leave Geo but has since returned to the station.

Other Pakistani provocateurs target the Shi‘ite community, which makes up 10% to 20% of the population. Militant groups with links to political parties as well as the country’s all-powerful military are frequently behind violent attacks against Shi‘ites. Criticism of such groups is often denounced by extremist preachers as blasphemy, which is punishable by death under Pakistani law.

When Salman Taseer, the governor of the country’s largest province and an outspoken critic of the blasphemy law, was killed by his bodyguard last year, the murderer was declared a hero by many. Munir Ahmed Shakir, the influential imam of Karachi’s giant Sultan Mosque, is just one of many who have pronounced as “non-Muslims” all those seeking to amend the blasphemy laws.

The new normal in Egypt and Libya is not as perilous as in Pakistan. Not yet. But as the fledgling democracies of the Middle East struggle to cope with the genies unleashed by the Arab Spring, you can count on the industry of outrage to work overtime to drag the Middle East in that direction.

Tuesday, September 4

Is U.S. Economic Growth a Thing of the Past?


The United States and economic growth have long gone hand in hand. The country’s history has been accompanied by steady economic progress, and since the end of World War II the U.S. economy has averaged GDP growth of more than 3% per year.

Of course that kind of growth doesn’t just grow on trees. The development of the United States has coincided with the most technologically impressive period in human history. The U.S. Constitution was ratified in the midst of an industrial revolution in England that would soon spread throughout the world, and since that time the human race has witnessed such revolutionary inventions as electric light, indoor plumbing, the automobile, air travel, modern medicine, mass telecommunications, the computer, and the Internet.

But is there reason to think that kind of technological advancement — and the resultant economic growth — will continue indefinitely? That’s a question that Robert J. Gordon, an economist at Northwestern University, posed in a recent working paper. Gordon argues that most of the economic growth in America has been prompted by three separate industrial revolutions: the first occurred between 1750 and 1830 and brought us steam engines, cotton spinning and railroads; the second, between 1870 and 1900, brought electricity, running water, and the internal combustion engine; and the third, between 1960 and the end of the 20th century, brought computerization and the Internet.

Though each of these revolutions bestowed unique and wonderful gifts upon the human race, the economic effects varied greatly, according to Gordon. Most significantly, he argues, the latest technological developments will simply not be able to sustain rapid economic growth for as long as the first two. Writes Gordon:
“The computer and Internet revolution (IR#3) began around 1960 and reached its climax in the dot.com era of the late 1990s, but its main impact on productivity has withered away in the past eight years. Many of the inventions that replaced tedious and repetitive clerical labor by computers happened a long time ago, in the 1970s and 1980s. Invention since 2000 has centered on entertainment and communication devices that are smaller, smarter and more capable, but do not fundamentally change labor productivity or the standard of living in the way that electric light, motor cars, or indoor plumbing changed it.”
Gordon poses a colorful rhetorical question that he hopes makes his point about the relative importance of the second industrial revolution as compared to the inventions of the past decade:
“A thought experiment helps to illustrate the fundamental importance of the inventions of IR #2 compared to the subset of IR #3 inventions that have occurred since 2002. You are required to make a choice between option A and option B. With option A you are allowed to keep 2002 electronic technology, including your Windows ’98 laptop accessing Amazon, and you can keep running water and indoor toilets; but you can’t use anything invented since 2002.
Option B is that you get everything invented in the past decade right up to Facebook, Twitter, and the iPad, but you have to give up running water and indoor toilets. You have to haul the water into your dwelling and carry out the waste. Even at 3 am on a rainy night, your only toilet option is a wet and perhaps muddy walk to the outhouse. Which option do you choose?”
One need not ponder this question long to realize the fundamental importance of such inventions as indoor plumbing. It also illustrates the triviality of some of the advancements of the past decade. Indeed, the average growth rate of labor productivity in the U.S. has slowed significantly since 2004, giving credence to the idea that the productivity dividends of the Internet have already been paid, and that the technological advancement of the past decade has centered on distracting baubles like social media rather than inventions that actually propel an economy forward.
At the same time, couldn’t one argue that you’d prefer the invention of, say, animal husbandry or the written word to indoor plumbing or electricity? In other words, aren’t fundamental technological advancements that come earlier necessarily more profound, because they are the building blocks upon which successive inventions are created? Gordon ignores the idea that technology begets itself — that the tools we have today make it more likely that invention will move at a faster pace than it has in the past. And though labor productivity growth has slowed since 2004, it is probably too early to conclude that all the benefits of the Internet revolution have been realized. There are plenty of reasons to believe that the Internet is still a young phenomenon that will continue to change the world in ways we cannot anticipate. (When only 33% of the world has access to an invention, is it reasonable to say that its full impact has been felt?)

Gordon ends his paper by listing six headwinds that he believes will, when combined with the effects of diminishing technological advancement, reduce our average yearly economic growth to a depressingly low 0.2%:
  • Demographics: The population is aging, and the one-time economic benefit of women entering the workforce has already been realized;
  • The plateau of educational attainment: The U.S. is slipping in international measures of educational success, and post-secondary education is becoming increasingly difficult for many to afford;
  • Rising income inequality is restraining growth because there are fewer people with disposable income;
  • Globalization is forcing low-skilled but high-paying jobs abroad;
  • Any efforts to cope with global warming will slow the economy down;
  • Both consumers and the government are overly indebted, and paying down that debt will slow growth.
All six of these headwinds are widely considered to be real threats to the American economy. And if Gordon is right about the Internet ultimately being a productivity dud, the U.S. may indeed settle into the kind of slow growth that characterized much of the world before the first industrial revolution began in the 18th century.
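To see how stark the gap between those two futures is, here is a minimal compound-growth sketch. The 3% postwar average and Gordon's 0.2% projection come from the article; the 50-year horizon and the GDP index base of 100 are arbitrary choices for illustration.

```python
# Compound growth at the postwar ~3% average vs. Gordon's projected 0.2%.
def grow(start: float, rate: float, years: int) -> float:
    """Compound `start` at `rate` per year for `years` years."""
    return start * (1 + rate) ** years

BASE, YEARS = 100, 50   # index GDP to 100 today; look 50 years out (illustrative)
for rate in (0.03, 0.002):
    print(f"{rate:.1%}/yr -> index {grow(BASE, rate, YEARS):.0f} after {YEARS} years")
# 3.0%/yr -> index 438 (output more than quadruples)
# 0.2%/yr -> index 111 (output grows barely 10%)
```

At 3% a year, output doubles roughly every 24 years; at 0.2%, a doubling would take about 350 years.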