Posts Tagged wearable-technology

Apple and wearable tech: When being late to the game works in your favor


Next Tuesday, Apple CEO Tim Cook will make the most important announcements of his tenure: the first set of major new products that won’t be largely credited to his legendary predecessor Steve Jobs.

It’s a hard act to follow. For at least the last ten years of his life, no person on the planet was more closely associated with innovation in computing technology than Jobs. Ever since his death, many have continually scrutinized Apple for signs that it is falling off from its prior greatness, in part because Apple under Cook hasn’t introduced a new product category for the company, as Jobs did with the iPod, iPhone, and iPad. That run without a major new product category looks set to end on Tuesday with a long-anticipated smartwatch (I was among those anticipating it), as well as the new iPhone 6.

As was the case in the run-up to the original iPhone, a few things make this announcement very different from Apple’s usual product launches:

1. Though everyone expects the iWatch to exist, it hasn’t appeared on the internet yet; the iPhone 6, by comparison, has been assembled and started up from replacement parts in many, many leaks.

2. Apple seems to be the last company on the planet to launch a smartwatch, following Pebble, LG, Motorola, Metawatch, Basis, Fitbit, Garmin, Sony, and Samsung (6 of them in the last year).

The latter point is the most interesting (the first just shows how serious about secrecy Apple can be when it wants to), largely because it raises the question: why is everyone so interested in Apple’s watch when you can buy dozens of others in the store today? Isn’t Apple late to the party? You bet they are. And they wouldn’t have it any other way.

For big innovations, the kind that add tens of billions of dollars in annual revenue (as the iPhone and iPad have), Apple is almost always last to introduce their version. MP3 players were around for at least 5 years before the iPod. Smartphones were rolled out more than 8 years before the iPhone. And tablets were first shown by Microsoft in 2001, 9 years before the iPad. Heck, the original Macintosh wasn’t even Apple’s first graphical computer (that would be the Lisa), let alone the first one on the market. In each case, despite the wait, Apple’s version of the product was the first to show the platform’s real potential.

Why does Apple do this? Because it lets other companies do some of its R&D for it, lets it learn from their in-market failures, and then lets it home in on the ideal use cases that actually make the product worth people’s time. Ironically, Apple’s approach mirrors the smartest thing Bill Gates has ever said: that the tech industry overestimates how different the world will look in two years and underestimates how different it can be in ten. That mismatch leads to unrealistic product plans in which cutting-edge technologies are expected to revolutionize everything on a short timeline. Apple understands it’s smarter to plan for about 10 years down the road (and maybe launch your products then), starting from the moment the very first product of its kind comes to market.

Back to the smartwatch category. Guess when the first commercially available smartwatch with internet connectivity was released? That’s right. 2004.

As ever, Apple will be right on time to be fashionably late, and the new race to copy their approach will begin.

Unlocking the power of digital ethnography


Unlocking the power of digital ethnography, by Antedote’s Anne Lacey, explores the multiple dimensions of digital ethnography and the potential it has as a research tool. The article can be found on Core77.

Check out the excerpt below for a short preview and be sure to read on here.

“To gain new insights and opportunities, we need to think about and approach research differently. Digital ethnography can fuel new ideas and research approaches, as my colleagues at antedote and I have seen in the years since we designed and built a mobile and online tool for studies from the ground up. Although digital ethnography has become an umbrella term for a great many online qualitative research tools, we use it specifically to mean a lengthy study (a week or two to several months) with consumers via computer and/or mobile phone, comprised of a blend of observation, live experience-alongs, interviews and user-generated content. Though these elements are common to it, each study has custom elements, premised on one big idea: using cutting-edge technology to restore some of the original intent and benefits of ethnography.”

— Anne Lacey for Core77

This Is How to Build an Interface for the Ultimate Smartwatch – from WIRED.com

Today we’ll be over at WIRED.com with a different take on what will propel wearable technology, specifically the smartwatch, into the future, written by antedote’s Pete Mortenson:

The fashion-will-fix-smartwatches narrative is a really compelling story. It’s also completely wrong — or, at minimum, flies in the face of decades of study about how new technologies get adopted. As documented by Everett Rogers in Diffusion of Innovations, no fundamentally new product type succeeds solely based on the fact that it’s attractive; it succeeds because it does something genuinely useful at a price point low enough that people don’t consider it a luxury. And then it becomes normal and even attractive because it was first useful…

For more on the future of useful, practical, and innovative wearable technology, check out what Pete has to say on WIRED today and leave a comment.