Archive for the ‘Information & technology’ Category

2010: The Year the Cloud Rolled In

December 16th, 2010

It’s pretty commonly understood that corporate computing has gone through three principal phases: centralized computing first, then the PC Era, then the age of the network. We could easily fit Enterprise 2.0, cloud computing, and lots of other recent developments into the network phase, but I think we’d be missing something if we did so.

It became clear to me as I watched the digital economy in 2010 that we had passed a tipping point and moved into a new age of technology use. For lack of a better term, let’s call this the Cloud Era. But this age is about much more than the migration of infrastructure, data, and applications up into the ether (even though, as I and many others have argued, this is a huge and exciting development). The Cloud Era is about the cloud, and also about three other fundamental trends in computing that became unmistakable in 2010. These are:

Social Computing (or Web 2.0, Enterprise 2.0, social business, or whatever else we might want to call it). Emergent social software platforms are here to stay; they’ll be part of the fabric of our personal and professional lives from now on. Over the past few years, some very clever technologists have rolled out varieties of digital connective tissue that satisfy our deep urges to communicate, form networks, express ourselves, keep in touch with others, play games, and get work done.

Some of these have been stunningly successful. Facebook has more than 500 million active users. Twenty-five billion tweets were sent in 2010. Groupon, a two-year-old company, turned down a $6 billion acquisition offer. And so on. These examples show how much we want to follow E.M. Forster’s century-old advice: “Only connect!… Live in fragments no longer.”

Technology Delight. We now expect the hardware and software we use to be appealing, intuitive, and powerful, all at once. Remember when you needed to be a master of Boolean logic in order to find what you were looking for on the Web? When most user interfaces were about as welcoming as a tax form? When user experiences called to mind Job’s experiences?

You might remember them because they were the norm until pretty recently, but advances like Windows 7, the Google search box, and the iPad are rapidly changing our expectations. We used to anticipate a fair amount of frustration when we used technology. Now we expect at least a bit of delight. This is a big shift, and one that should make us happy. In fact, our technologies might make us too happy. Because they’re both appealing and social, they’re addictive. The panel I moderated at the 2010 Boston Book Festival left me a little shaken. William Powers, Nick Carr, and Eric Haseltine all agreed that today’s technologies are so compelling that we can get lost in them for hours, drifting in The Shallows instead of thinking deeply. Technology delight is a fantastic development, but we should keep in mind the dangers of too much of a good thing.

Scientific Organizations. We sit on truly astonishing amounts of data, and have massive computing horsepower with which to analyze it. This means we can more rigorously test more ideas and hypotheses than has ever before been possible. We don't have to rely so heavily on conventional wisdom, lore, abstract theories, or HiPPOs (the Highest Paid People in the Organization). Instead, we can rely on the tools of the scientific method — data, analysis, rigorous thinking and hypothesis testing, experimentation, simulation, and so on — to understand the state of the world, link cause to effect, understand what will work, and make better decisions.
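To make this concrete, here is a minimal sketch, in Python with entirely made-up numbers, of the kind of test a scientific organization might run instead of deferring to the HiPPO: comparing two versions of a web page by the conversions each produced, and letting a standard two-proportion z-test say whether the difference is real.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical experiment: the HiPPO's favorite page (A) vs. a challenger (B)
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p lets the data, not rank, decide
```

The point isn’t this particular formula; it’s that the decision rule is explicit, repeatable, and arguable on the evidence rather than on seniority.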

2010 saw computers that play Jeopardy! well and cars that drive themselves down busy city streets. These are triumphs of the scientific method. I’m not worried that computers are going to become self-aware and rise up against us, but I would be worried about a company whose leaders thought that their deep experience and keen intuition were better guides than any box programmed by pencil-necked geeks. Step by step, in domain after domain, the geeks are inheriting the Earth.

None of these trends is just a blip, and none is close to running out of steam. And as I look around companies and talk to their leaders, I see that most organizations are only just beginning to understand and exploit them. This implies a great deal of both opportunity and risk in the years ahead, with tons of innovation and more intense competition. Entering the Cloud Era should be a central part of the strategy for companies that don’t want to be left behind in 2011 and beyond.

Ray Ozzie’s Message to All Industries

October 27th, 2010

Microsoft recently announced that Ray Ozzie, its Chief Software Architect and a sage of the high tech world, is leaving the company. On his way out the door, he wrote a public memo titled “Dawn of a New Day”. I believe the scenario he presents in the memo is important enough to place it alongside other famous, game-changing documents, such as Bill Gates’ “The Internet Tidal Wave” memo from 1995, which set the direction for Microsoft for the next 15 years.

In Ozzie’s memo, he argues that the computer industries are at another of their periodic major inflection points. A transition is underway from a digital world centered around the PC and client-server computing to one based on what Ozzie calls “Continuous services and connected devices.” As he writes, “… slowly but surely, our lives, businesses and society are in the process of a wholesale reconfiguration in the way we perceive and apply technology… For each of us who can clearly envision the end-game, the opportunity is to recognize both the inevitability and value inherent in the big shift ahead, and to do what it takes to lead our customers into this new world.”

With this post, I don't want to join the cottage industry of people making predictions about the future of the high tech industry. Instead, I want to focus on what Ozzie's "new day" means for the rest of the economy — the companies that consume digital technology rather than produce it.

To oversimplify just a bit, companies that consume technology (in other words, all firms) are facing an increasingly stark choice between two digital models. The first is the status quo: people with PCs (and mobile phones if they’re road warriors), client-server applications, servers in the data center, and a largish IT staff to troubleshoot, maintain, and extend this whole infrastructure. The costs associated with this model are well understood, as are the levels of satisfaction it delivers to users and to the executives writing the checks. The former are pretty high; the latter are usually dismal.

The dawning alternative for the enterprise is people using a range of low-cost connected devices, with applications and servers in the cloud (the public one, for all but the largest and most heavily regulated organizations). The costs of this model are harder to pin down; it might be more expensive now (especially when switching costs are included), but it’s definitely going to get cheaper. The satisfaction it delivers, however, is pretty clear, and pretty high. As Ozzie writes, "…so many more people [are] using technology to improve their lives, businesses and societies, in so many ways. New apps, services & scenarios in communications, collaboration & productivity, commerce, education, health care, emergency management, human services, transportation, the environment, security – the list goes on, and on, and on."

So Ozzie’s new day isn’t just of interest to high tech and new media companies. It’s also dawning for the industries that make up more than 90% of the US economy. Technology has been shaking up these sectors for some time now, and I predict this trend is only going to accelerate. I’m with Ozzie: “Organizations worldwide, in every industry, are now stepping back and re-thinking the basics; questioning their most fundamental structural tenets. Doing so is necessary for their long-term growth and survival.”

What are you seeing? Do the organizations you know best believe that a new day of digital opportunities is dawning? Or are they too fully committed to the old paradigm to make the change?

(disclosure: I’ve been a paid speaker at Microsoft events in the past, but have no current financial relationship with the company. Microsoft is not a sponsor of any of my research or my center at MIT.)

How Your Smartphone Will Transform Your Elevator Pitch

August 20th, 2010

Listening to good entrepreneurs make their pitch is great fun. How well, or poorly, they align their passion and persuasiveness to the product details reveals a lot. Are they pushing an idea or telling a story? Is it all about their own charisma or is the innovative idea the real hero? Are we having a conversation or am I being sold? How will they get me to “get it”?

All these entrepreneurial issues resurfaced during the recent Fortune technology conference in Aspen. There was no shortage of either articulate entrepreneurs or provocative ideas. So as Osman Rashid, who co-founded Chegg.com, described his new digital textbook startup Kno during a coffee break, I peppered him with questions. His idea was undeniably clever, but aspects of his business model weren’t clear to me. He had his elevator pitch answers down pat, but I wanted to learn more.

Unprompted by me, Osman whipped out his smartphone and handed it over. Seconds later, I was watching a decent video clip illustrating his product’s features and functionality. I could tap to hear testimonials. I could tap to play with a simulation of the software. In a matter of moments, the device had transformed Osman from an entrepreneur I was having a conversation with into a guide and narrator of an interactive experience. My focus and attention alternated between what he said and what appeared onscreen. Sometimes he’d take, touch, and hand back the device; other times, I’d point to something onscreen and ask another question.

The object — and our interaction with it — became an intimate part of our conversation. We couldn't have discussed either Kno or his answers to my questions the same way without it. An idle part of me wondered how cool it would be if our conversation (and my questions) could be recorded and time-stamped along with what was appearing onscreen. Osman refused to allow his smartphone to decay into a sales tool or product pitch — although those elements were baked into the material — and instead used the device as a medium to both reinforce his conversation points and invite new questions and comments from me.

I can say without hesitation that this felt technically and interpersonally different from “laptop-on-the-table” presentations I’d experienced 1,000 times. We were standing up, drinking coffee, chatting, and taking turns holding, viewing, and manipulating this device. The kinesthetics, eye contact, questions, and interruptions revolved as much around the device as us. We would have been worse off without it.

Not ten days later, I was at a Venture Cafe gathering by my MIT office. It’s a place where entrepreneurs and VCs alike come to socialize and impress each other. I struck up a casual conversation with an aspiring biotech entrepreneur. Not two minutes into our talk, he took out his iPhone to animate a technical point about his company’s planned product. He handed it over to me. I thought it fascinating and asked if he might forward that animation to my email. He could and did.

Again, the nature of the questions and our conversation made his iPhone a focal point of our interaction. I wouldn’t have learned nearly as much about his company, or him, if we had just been talking. The “hand-it-over” nature of the iPhone made it feel more like a value-added conversation rather than a scaled-down presentation.

I looked around. No one else was interacting this way.

Elevator pitches are important. The ability to boil down the essence of your innovation into a tasty forty-second sound bite remains essential. Only now, the pervasiveness, ubiquity, and visuality of mobile devices quantitatively and qualitatively change the ecology of interpersonal interaction. It’s no longer just about what you say and how you say it; it’s increasingly about what you hand over.

What do you hand over that transforms the conversation? What do you hand over that visually and interactively adds value to your spoken words? What do you hand over that complements and supplements your pitch? What do you hand over that invites and inspires the curiosity you want? What do you hand over that makes you more persuasive? These are the questions that will increasingly shape tomorrow’s rhetoric of innovation. The design challenge here is fantastic: how do we use mobile devices not to better connect us to digital networks, but to better connect us with customers, clients, and prospects?

Of course, the technical beauty of these media is that — unlike the words one speaks — the imagery one sees onscreen can be emailed, Facebooked, forwarded, or LinkedIn as desired. "Hand-it-over" innovation pitches can be seamlessly slipped wherever your prospects desire. Indeed, an excellent measure of "hand-it-over" effectiveness is whether the person who you "hand-it-over" to actually asks you to send what they've been seeing and interacting with.

My professional bet is that "hand-it-over" innovation pitches will double smartphone and mobile device sales worldwide. Entrepreneurs, salespeople, and innovators alike will socialize with at least two devices in their backpacks and breast pockets — one for their personal/professional use and the other to "hand over" for interpersonal play.

“Hand-it-over” conversations seem destined to create new genres of salesmanship and interaction. It will become an innovation best practice. In fact, people will be surprised, and disappointed, if you don’t have anything to hand over.

No, Google Is Not Making You Stupid

August 13th, 2010

Nick Carr is right — or is he? Of course, Google and the Web have changed our reading habits and affected our attention spans. The changing nature of technology is driving us to consume a greater number of ideas today in less depth.

Throughout history, societies have evolved around new technologies — the plow, the printing press, the telephone. And as our societies have changed, so have we. We adapt to our surroundings. We evolve to fit the world just as we endeavor to keep changing it. That's the virtuous circle of human evolution.

My 11-year-old daughter spends half her day (warning: parental exaggeration) in front of screens, be they iPods, game consoles, laptops, or plasmas. She also plays basketball and soccer and is on her way to earning a black belt in karate.

And she reads more books than I ever did.

She knows more than I did at her age, has a better vocabulary, and is more in touch with society.

Yes, Google is helping to wire her brain in a different way. It doesn’t mean her ability to concentrate has been obliterated.

I'm in my mid-forties and spend as much time as my tween with digital media — most of it popping from one thing to the next, rarely diving deeper than a 10-minute video clip. And I fall asleep with a book — an actual printed one sometimes — every evening.

In complete scientific ignorance, I reject the idea that the Internet has fundamentally and forever impeded our ability to concentrate. We are simply adapting to the media of the moment. No, the desktop web is not made for deep immersion in content. It’s distant and impersonal. But as technologies continue to evolve, so will we.

So here’s my prediction: over the next five years, as digital content flows away from the desktop and into our hands through more personal digital platforms like Kindles, Nooks, iPads, and countless other reading-centric devices, we will experience yet another shift in attention. We are heading towards a deeper immersion in ideas — but of a very different kind than we experienced in the pre-Web days. We will take our new-found abilities to consume and contextualize multiple ideas and multiple forms of media and combine them with our long-held ability to dive deep into text-based content.

Already we can find some evidence of this. The iBooks and Kindle apps for the iPad are among the most popular downloads from iTunes. Digital books are selling like hotcakes. And publishers are just starting to experiment with new hybrid forms of deep content.

As the technology evolves, so will we. Who knows? The result could even be a superior state of attention.

(In the meantime, at least you made it to the bottom of this post.)

Paul Michelman is the Harvard Business Review Group’s Director of Product Development.

How You Lowered Your Information Standards

December 24th, 2009

In my last blog, I argued that people don’t care enough about their information environments to prevent overload. This week I am focusing on a related behavioral change that has important implications for companies that produce information products and services: As information grows in quantity, consumers of it are willing to accept lower quality. I call this willingness satisficing — being satisfied with sacrificing quality.

I am supported in this contention by one of The New York Times’s “The Year in Ideas” items that appeared last Sunday. Called “Good Enough Is the New Great,” it describes our society’s increasing willingness to accept lo-fi representations of reality — poor-quality video on our PCs, poor-quality photographs on our mobile phones, poor-quality music on our MP3 players, etc. There are some exceptions — the rise of Blu-ray discs and HD television, for example — but these seem more interesting to people over 30 than to younger folk.

I see a number of manifestations of this phenomenon. Let me describe a few:

The preference for online learning. If you talk with most teachers, they’ll probably tell you they prefer the high-bandwidth communications and deeper relationships of face-to-face learning. But many students seem to prefer the convenience of online learning. At Babson, our fastest-growing MBA programs have a substantial online component.

The rise of the webcast. Webcasts, in comparison to face-to-face meetings and conferences, provide a much lower-quality experience. Yet there's no doubt that they are thriving — and I suspect that economic limits on travel aren't the only reason.

The blog and the tweet. Blogs and tweets cover topics less thoroughly than traditional articles, but these forms of “information grazing” are undeniably popular. Similarly, many seem to prefer headlines to extensive news stories.

Skype. Its audio quality seems to be getting much better, but many of my Skype experiences have involved poor fidelity. Yet it’s (mostly) free, and that’s enough for it to thrive. Video Skype calls help, too, of course.

The decline of high-quality knowledge management. People seem to prefer open and participative knowledge environments to highly “curated” ones.

Mobile phones. They are convenient and undeniably mobile, but calls on these cellular radios are seldom esthetically pleasing. Can you hear me now?

This phenomenon isn't new — it explained the rise of the newspaper USA Today. But it seems to be accelerating and raises a number of questions for product and service developers. Maybe Blu-ray won't ultimately take off, for example. Maybe it doesn't really matter that AT&T's cellular network leads to more dropped calls. Maybe Cisco shouldn't be investing so much in hi-def telepresence offerings. Maybe engineers should give up on better sound and video compression algorithms. Maybe The New York Times won’t ultimately be able to continue printing all the news that’s fit to print.

I was pleased last week when some people commented that they were fighting information overload, and I’d love to hear your views about info-satisficing. Do you prefer a fast-food information diet, or a slow food one? Are you settling for lower-quality information, or maintaining your info-standards?

Why We Don’t Care About Information Overload

December 9th, 2009

I gave a presentation this week on decision-making, and someone in the audience asked me if I thought information overload was an impediment to effective decision-making. “Information overload…yes, I remember that concept. But no one cares about it anymore,” I replied. In fact, nobody ever did.

But why not? We’ve been reading articles in the press about information overload being the bane of productivity for almost twenty years. (Here’s a link to a fairly recent article in Harvard Business Review on the topic called “Death by Information Overload” and a related blog.) And there is no doubt that the information load has only increased — day after day, year after year. Spam filters have helped a bit, but we all still get a lot of stuff we don't want. Twitter, Facebook, LinkedIn, text messages, email ads — everything we do only adds to the pile.

So if information overload is such a problem, why don’t we do something about it? We could if we wanted to. How many of us bother to tune our spam filters? How many of us turn off the little evanescent window in Outlook that tells us we have a new email? Who signs off of social media because there’s just too much junk? Who turns off their BlackBerry or iPhone in meetings to ensure no distractions? Nobody, that's who — or very few souls anyway.

Why? First, there is the everlasting hope of something new and exciting. Our work and home lives can be pretty boring, and we’re always hoping that something will come across the ether that will liven things up. If I turn up the filtering on the spam filter or turn off the smartphone, I might miss out on an email promising a new job, a text message offering a new relationship, an RSS feed with a new news item, and so forth. Every new communication offers the frisson of a possible life-changing information event, though it seldom delivers on the promise.

Second, there's a lot of informational inertia. When was the last time you sat down to redefine the structure of your email folders or readjust your RSS feed portfolio? Whatever our information environment is today, it will likely be the same next month or next year. That's why companies like it when we sign up for ongoing email broadcasts — we are unlikely to take the time to unsubscribe.

The third reason, which is related to the second, is that we undervalue our own attention. In my book (with John Beck) The Attention Economy, we called attention “the most important resource in business,” but few people treat it that way. We open junk mail, we watch junk television, we read junk email. It would take an investment of attention to save our attention, and most people just aren’t willing to invest.

So the next time you hear someone talking or read someone writing about information overload, save your own attention and tune that person out. Nobody’s ever going to do anything about this so-called problem, so don’t overload your own brain by wrestling with the issue.

How Much is Information Overload Costing Your Company?

October 1st, 2009

A blog post I wrote a year ago posed a question: “What’s so bad about information overload?”

An article I wrote for the September issue of Harvard Business Review answered that question:

A lot.

The article, “Death by Information Overload,” describes some of the ways information overload may be causing you harm: increased stress, impaired cognition, “information addiction.” (One study found that 11% of email users had tried to hide, from a spouse or family member, the fact that they were checking their BlackBerries.) The article clearly struck a chord, and people responded with their own personal tales of overload woe.

But most of us already know from experience that the abundance of information we enjoy today comes at a price. Less apparent is the tremendous hidden cost it imposes on the organization as a whole.

The possible link between information overload and suicides among employees at France Telecom may be spurious. But recent research indicates that information overload can have a negative effect on such activities as organizational decision making, innovation, and productivity. In one study, for example, people took an average of nearly 25 minutes to return to a work task after an email interruption. Another study found that time lost to handling unnecessary email and recovering from information interruptions cost Intel nearly $1 billion a year. An article in the October issue of HBR found that forcing knowledge workers to take weekly breaks from email and other work distractions improved performance.
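Figures like these can be sanity-checked with simple arithmetic. Here is a back-of-envelope sketch in Python; every input is an illustrative assumption of mine, not a number taken from the research cited above:

```python
def interruption_cost(employees, interruptions_per_day, recovery_minutes,
                      loaded_hourly_cost, workdays_per_year=230):
    """Estimated annual cost of time lost re-focusing after interruptions."""
    # Hours each person loses per day just recovering their train of thought
    hours_lost_per_person_per_day = interruptions_per_day * recovery_minutes / 60
    annual_hours_lost = employees * hours_lost_per_person_per_day * workdays_per_year
    return annual_hours_lost * loaded_hourly_cost

# Illustrative assumptions: a 10,000-person company, four significant
# interruptions a day, 25 minutes of recovery each, $60/hour loaded cost
cost = interruption_cost(employees=10_000, interruptions_per_day=4,
                         recovery_minutes=25, loaded_hourly_cost=60)
print(f"${cost:,.0f} per year")
```

Even with these made-up inputs, the estimate runs to hundreds of millions of dollars a year for a single large company, which is why the Intel figure above is less surprising than it first appears.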

Yet surprisingly few companies even acknowledge the problem, much less make any attempt to do something about it. A handful of firms have taken small steps:

  • Morgan Stanley is experimenting with ways (described in my article) to reduce the burden of email on employees.
  • Nielsen Media Research recently removed the “reply to all” button from the company’s email system.
  • A number of companies and academic researchers have formed the Information Overload Research Group, with the aim of mitigating the impact of information overload on employees and their companies.

Do you know of — or work at — a company that understands the cost of information overload to the organization as a whole? Are you aware of any innovative approaches companies are taking to tackle this problem?
