Pathways Documents

background documents and work in progress

This is a public committee.

in the news

    Julie Smith David
    "Accounting for Innovation" in Inside Higher Ed
    Posted July 31, 2012 by Julie Smith David

    Comments:

    • Robert E Jensen

      From the CFO Journal's Morning Ledger on September 12, 2013

      What would Eno do? 
      In 1974, English musician and composer Brian Eno and Peter Schmidt, a visual artist, authored, packaged and sold Oblique Strategies, a pack of cards designed to help artists find inspiration and battle creative roadblocks. Each card in the deck held a koan-like suggestion—“go outside and shut the door,” for example—and the idea was that an artist would draw upon the deck in the face of a creative crisis. Musicians have most famously tapped Oblique Strategies over the years, but they are not the only ones. In this week’s Foreign Policy, Jeffrey Lewis shuffles the deck to help explain, of all things, President Obama’s Syria strategy. “Honour thy error as a hidden intention,” best explains how Secretary of State John Kerry’s verbal stumbling led to a breakthrough (possibly) on Syria’s chemical weapons, Lewis writes. All of this is just a long way of saying that if Oblique Strategies can work for President Obama and a Berlin-era David Bowie, it can certainly work for executives decades removed from their time at Harvard Business School. Imagine a world where Hewlett Packard CEO Meg Whitman drew the card, “Would anyone want it?”…where Google CEO Larry Page turned over “Voice your suspicions”…where Apple CEO Tim Cook pulled “Make a sudden destructive unpredictable action; incorporate.”

    • Robert E Jensen

      "The Disruption Machine What the gospel of innovation gets wrong," by Jill Lepore, The New Yorker, July 23, 2014 ---
      http://www.newyorker.com/magazine/2014/06/23/the-disruption-machine?currentPage=all&utm_source=howtogeek&utm_medium=email&utm_campaign=newsletter

      . . .

      Porter was interested in how companies succeed. The scholar who in some respects became his successor, Clayton M. Christensen, entered a doctoral program at the Harvard Business School in 1989 and joined the faculty in 1992. Christensen was interested in why companies fail. In his 1997 book, “The Innovator’s Dilemma,” he argued that, very often, it isn’t because their executives made bad decisions but because they made good decisions, the same kind of good decisions that had made those companies successful for decades. (The “innovator’s dilemma” is that “doing the right thing is the wrong thing.”) As Christensen saw it, the problem was the velocity of history, and it wasn’t so much a problem as a missed opportunity, like a plane that takes off without you, except that you didn’t even know there was a plane, and had wandered onto the airfield, which you thought was a meadow, and the plane ran you over during takeoff. Manufacturers of mainframe computers made good decisions about making and selling mainframe computers and devising important refinements to them in their R. & D. departments—“sustaining innovations,” Christensen called them—but, busy pleasing their mainframe customers, one tinker at a time, they missed what an entirely untapped customer wanted, personal computers, the market for which was created by what Christensen called “disruptive innovation”: the selling of a cheaper, poorer-quality product that initially reaches less profitable customers but eventually takes over and devours an entire industry.

      Ever since “The Innovator’s Dilemma,” everyone is either disrupting or being disrupted. There are disruption consultants, disruption conferences, and disruption seminars. This fall, the University of Southern California is opening a new program: “The degree is in disruption,” the university announced. “Disrupt or be disrupted,” the venture capitalist Josh Linkner warns in a new book, “The Road to Reinvention,” in which he argues that “fickle consumer trends, friction-free markets, and political unrest,” along with “dizzying speed, exponential complexity, and mind-numbing technology advances,” mean that the time has come to panic as you’ve never panicked before. Larry Downes and Paul Nunes, who blog for Forbes, insist that we have entered a new and even scarier stage: “big bang disruption.” “This isn’t disruptive innovation,” they warn. “It’s devastating innovation.”

      Things you own or use that are now considered to be the product of disruptive innovation include your smartphone and many of its apps, which have disrupted businesses from travel agencies and record stores to mapmaking and taxi dispatch. Much more disruption, we are told, lies ahead. Christensen has co-written books urging disruptive innovation in higher education (“The Innovative University”), public schools (“Disrupting Class”), and health care (“The Innovator’s Prescription”). His acolytes and imitators, including no small number of hucksters, have called for the disruption of more or less everything else. If the company you work for has a chief innovation officer, it’s because of the long arm of “The Innovator’s Dilemma.” If your city’s public-school district has adopted an Innovation Agenda, which has disrupted the education of every kid in the city, you live in the shadow of “The Innovator’s Dilemma.” If you saw the episode of the HBO sitcom “Silicon Valley” in which the characters attend a conference called TechCrunch Disrupt 2014 (which is a real thing), and a guy from the stage, a Paul Rudd look-alike, shouts, “Let me hear it, DISSS-RUPPTTT!,” you have heard the voice of Clay Christensen, echoing across the valley.

      Last month, days after the Times’ publisher, Arthur Sulzberger, Jr., fired Jill Abramson, the paper’s executive editor, the Times’ 2014 Innovation Report was leaked. It includes graphs inspired by Christensen’s “Innovator’s Dilemma,” along with a lengthy, glowing summary of the book’s key arguments. The report explains, “Disruption is a predictable pattern across many industries in which fledgling companies use new technology to offer cheaper and inferior alternatives to products sold by established players (think Toyota taking on Detroit decades ago). Today, a pack of news startups are hoping to ‘disrupt’ our industry by attacking the strongest incumbent—The New York Times.”

      A pack of attacking startups sounds something like a pack of ravenous hyenas, but, generally, the rhetoric of disruption—a language of panic, fear, asymmetry, and disorder—calls on the rhetoric of another kind of conflict, in which an upstart refuses to play by the established rules of engagement, and blows things up. Don’t think of Toyota taking on Detroit. Startups are ruthless and leaderless and unrestrained, and they seem so tiny and powerless, until you realize, but only after it’s too late, that they’re devastatingly dangerous: Bang! Ka-boom! Think of it this way: the Times is a nation-state; BuzzFeed is stateless. Disruptive innovation is competitive strategy for an age seized by terror.

      Every age has a theory of rising and falling, of growth and decay, of bloom and wilt: a theory of nature. Every age also has a theory about the past and the present, of what was and what is, a notion of time: a theory of history. Theories of history used to be supernatural: the divine ruled time; the hand of God, a special providence, lay behind the fall of each sparrow. If the present differed from the past, it was usually worse: supernatural theories of history tend to involve decline, a fall from grace, the loss of God’s favor, corruption. Beginning in the eighteenth century, as the intellectual historian Dorothy Ross once pointed out, theories of history became secular; then they started something new—historicism, the idea “that all events in historical time can be explained by prior events in historical time.” Things began looking up. First, there was that, then there was this, and this is better than that. The eighteenth century embraced the idea of progress; the nineteenth century had evolution; the twentieth century had growth and then innovation. Our era has disruption, which, despite its futurism, is atavistic. It’s a theory of history founded on a profound anxiety about financial collapse, an apocalyptic fear of global devastation, and shaky evidence.

      Most big ideas have loud critics. Not disruption. Disruptive innovation as the explanation for how change happens has been subject to little serious criticism, partly because it’s headlong, while critical inquiry is unhurried; partly because disrupters ridicule doubters by charging them with fogyism, as if to criticize a theory of change were identical to decrying change; and partly because, in its modern usage, innovation is the idea of progress jammed into a criticism-proof jack-in-the-box.

      The idea of progress—the notion that human history is the history of human betterment—dominated the world view of the West between the Enlightenment and the First World War. It had critics from the start, and, in the last century, even people who cherish the idea of progress, and point to improvements like the eradication of contagious diseases and the education of girls, have been hard-pressed to hold on to it while reckoning with two World Wars, the Holocaust and Hiroshima, genocide and global warming. Replacing “progress” with “innovation” skirts the question of whether a novelty is an improvement: the world may not be getting better and better but our devices are getting newer and newer.

      The word “innovate”—to make new—used to have chiefly negative connotations: it signified excessive novelty, without purpose or end. Edmund Burke called the French Revolution a “revolt of innovation”; Federalists declared themselves to be “enemies to innovation.” George Washington, on his deathbed, was said to have uttered these words: “Beware of innovation in politics.” Noah Webster warned in his dictionary, in 1828, “It is often dangerous to innovate on the customs of a nation.”

      The redemption of innovation began in 1939, when the economist Joseph Schumpeter, in his landmark study of business cycles, used the word to mean bringing new products to market, a usage that spread slowly, and only in the specialized literatures of economics and business. (In 1942, Schumpeter theorized about “creative destruction”; Christensen, retrofitting, believes that Schumpeter was really describing disruptive innovation.) “Innovation” began to seep beyond specialized literatures in the nineteen-nineties, and gained ubiquity only after 9/11. One measure: between 2011 and 2014, Time, the Times Magazine, The New Yorker, Forbes, and even Better Homes and Gardens published special “innovation” issues—the modern equivalents of what, a century ago, were known as “sketches of men of progress.”

      The idea of innovation is the idea of progress stripped of the aspirations of the Enlightenment, scrubbed clean of the horrors of the twentieth century, and relieved of its critics. Disruptive innovation goes further, holding out the hope of salvation against the very damnation it describes: disrupt, and you will be saved.

      Disruptive innovation as a theory of change is meant to serve both as a chronicle of the past (this has happened) and as a model for the future (it will keep happening). The strength of a prediction made from a model depends on the quality of the historical evidence and on the reliability of the methods used to gather and interpret it. Historical analysis proceeds from certain conditions regarding proof. None of these conditions have been met.

      . . .

      Disruptive innovation is a theory about why businesses fail. It’s not more than that. It doesn’t explain change. It’s not a law of nature. It’s an artifact of history, an idea, forged in time; it’s the manufacture of a moment of upsetting and edgy uncertainty. Transfixed by change, it’s blind to continuity. It makes a very poor prophet.

      The upstarts who work at startups don’t often stay at any one place for very long. (Three out of four startups fail. More than nine out of ten never earn a return.) They work a year here, a few months there—zany hours everywhere. They wear jeans and sneakers and ride scooters and share offices and sprawl on couches like Great Danes. Their coffee machines look like dollhouse-size factories.

      They are told that they should be reckless and ruthless. Their investors, if they’re like Josh Linkner, tell them that the world is a terrifying place, moving at a devastating pace. “Today I run a venture capital firm and back the next generation of innovators who are, as I was throughout my earlier career, dead-focused on eating your lunch,” Linkner writes. His job appears to be to convince a generation of people who want to do good and do well to learn, instead, remorselessness. Forget rules, obligations, your conscience, loyalty, a sense of the commonweal. If you start a business and it succeeds, Linkner advises, sell it and take the cash. Don’t look back. Never pause. Disrupt or be disrupted.

      But they do pause and they do look back, and they wonder. Meanwhile, they tweet, they post, they tumble in and out of love, they ponder. They send one another sly messages, touching the screens of sleek, soundless machines with a worshipful tenderness. They swap novels: David Foster Wallace, Chimamanda Ngozi Adichie, Zadie Smith. “Steppenwolf” is still available in print, five dollars cheaper as an e-book. He’s a wolf, he’s a man. The rest is unreadable. So, as ever, is the future.