Information technology, gadgets, social media, libraries, design, marketing, higher ed, data visualization, educational technology, mobility, innovation, strategy, trends and futures. . . 

Posts suspended for a bit while I settle into a new job. . . 

Entries in Hardware (3)

Monday
Feb 10, 2014

Screen Size Is What Matters

As I wrote here, what differentiates devices for me at any given moment is not manufacturer or operating system, but screen size: 

Nearly any contemporary smartphone will get the job done; the hardware has converged in features and capabilities, and the several mobile OSes and app ecosystems are nearly identical.

What's important to me at the moment is the size of the screen --

[Image: Samsung Galaxy Note 3]

  • smartphone, for use any- and everywhere;
  • tablet, hanging around the house, traveling, and certain work settings;
  • laptop, traveling and certain work settings;
  • laptop connected to huge monitor, office and home office.

Every device is connected to the network, all data lives on the network and is synchronized across devices (Google, Dropbox), and the core apps -- Gmail, Evernote, etc. -- function pretty much the same on every device. 

In this piece in Walt Mossberg's Re/code (he left The Wall Street Journal to start it), Andreessen Horowitz's Zal Bilimoria puts this perspective like so: 

Our Love Affair With the Tablet Is Over

February 6, 2014

Back in 2011, I was having an all-consuming love affair with tablets. At the time, I was the first-ever head of mobile at Netflix. I saw tablets in my sleep, running apps that would control homes, entertain billions and dutifully chug away at work. Tablets, I was convinced, were a third device category, a tweener that would fill the vacuum between a phone and a laptop. I knew that was asking a lot — at the time, however, I didn’t know just how much.

[Image: iPad]

I wasn’t the only one swooning in the presence of the iPad and its imitators. Everyone was getting in on the love fest. The typically sober analysts over at Gartner were going ballistic with their shipment predictions for the iPad, and a flurry of soon-to-be-launched Android tablets. Amazon (Kindle Fire), Barnes & Noble (Nook Tablet), HP (TouchPad running webOS) and even BlackBerry (PlayBook) all rushed into the market to take on Apple, which commanded 70 percent of the tablet market one year after Steve Jobs unveiled the first iPad. On the software side, startups like Flipboard, tech giants like Adobe and even large enterprises like Genentech were quickly assembling teams to take advantage of this new platform.

Now — three years and 225 million tablets later — I’m starting to see how misplaced that passion was.

The tablet couldn’t possibly shoulder all the expectations people had for it. Not a replacement for your laptop or phone — but kinda. Something you kick back with in the living room, fire up at work and also carry with you everywhere — sort of. Yes, tablets have sold in large numbers, but rather than being a constant companion, like we envisioned, most tablets today sit idle on coffee tables and nightstands. Simply put, our love for them is dying.

Article continues at link. 

Now that my smartphone is a so-called phablet -- I have a Samsung Galaxy Note 3, with a 5.7" screen -- I hardly ever use my tablet. 

 

Monday
Jun 10, 2013

Personal Computers are Appliances

Here's another take on my earlier post (http://www.william-garrity.com/blog/2013/4/13/pc-sales-plunge.html), from the "Editor's Desk" section of the July 2013 PCWorld (what follows is the online version, which differs slightly from the print text) -- 

PCs aren't dead, they're microwaves

Brad Chacos

Last week's news wasn't generous to PCs. In fact, half the Internet was ready to eulogize our beloved black boxes after market research showed that computer shipments fell by double-digit percentages in the first quarter. Stick a fork in 'em, the common wisdom declared. PCs are done.

But nothing could be further from the truth. PCs aren't dead—they're microwaves. But not for much longer.

Hear me out.

From marvelous to meh

Right up until the early ’90s, computers were a luxury, an oddity even. If your childhood chum had a 386, you were at his house every day, churning out ASCII art on a dot-matrix printer and playing asynchronous PBEM games or MUDs. Good times! Today, however, everyone in every neighborhood has a PC, just as everyone in every neighborhood has a stove, a refrigerator, and a microwave.

Our wondrous electronic windows into the world have evolved into ho-hum appliances—indispensable, yet unexciting.

Is it any surprise that shoppers treat these black holes of non-brilliance as appliances? The PC landscape has been devoid of any real hardware innovation for as long as memory serves.

Curious, I performed a quick, completely unscientific poll, asking about 20 nontechie friends, grandmothers, aunts, social-media acquaintances, and convenience-store employees the reason for their most recent computer purchase, whenever that may have been. The answers were unanimous across the board: They all bought their new computers when their previous computer broke.

And you do the very same thing with a stove, refrigerator, or microwave. You buy a new one when you absolutely have to, and not a moment sooner.

It's sad, really. (Do you realize how many microscopic, cutting-edge transistors are packed onto every single computer chip? Billions.) But it's not surprising. A whole range of factors have coalesced into a perfect storm, all helping to turn PCs into a commodity appliance.

Article continues at link. 

All that matters to me is a (fast) network connection, a (good) browser (I prefer Chrome), and a screen that's as large as the situation allows -- large-screen smartphone for the pocket, ultrabook as laptop, ultrabook attached to a large monitor at home. It really doesn't matter what OS the machine uses, how fast its processor is, or how much disk storage it has.

 

Saturday
Apr 13, 2013

PC Sales Plunge?

There's been a lot of press lately about the decline in sales of PCs. Even assuming that "personal computer" means a traditional laptop or desktop -- I would extend the definition to include

  • tablets of any size, and
  • smartphones, not to mention
  • netbooks and ultrabooks --

I doubt that the decline is limited to any one cause, such as

  • the burgeoning sales of tablets,
  • the greater proportion of mobile phones that are smartphones,
  • the market's rejection (or not) of Windows 8,
  • the growth of Chromebooks and other consumer-oriented thin client-based devices,
  • the collapse of netbook sales,
  • the ascending popularity of Apple generally,
  • the ascendancy of cloud computing --

for example, see http://mashable.com/2013/04/12/windows-8-pc-sales-woes/,
http://www.zdnet.com/whos-killing-the-pc-blame-the-cloud-7000013954/, among other reports. 

Rather, I think what's really driving this phenomenon is akin to what's reported in this story -- "The real reason for the PC sales plunge: The era of 'good enough' computing," by Simon Bisson, April 11, on http://www.zdnet.com -- computers only need to be so good. 

IDC's PC sales numbers show a dramatic fall, but they're not the whole story.

Before we blame one thing we need to take a much more nuanced view and look at the last decade of the IT world. After all, nothing is ever as simple as it seems.

We're at an interesting inflexion point in the IT industry where innovation is moving away from desktop PC hardware into software and into the server and up to the cloud.

The truth is quite simple: PCs are lasting longer, they're not getting measurably faster, and software is getting better. Why do you need to buy a new PC when you can get better performance with a software upgrade on your old hardware? [Emphasis mine.]

If I was to put a finger on the point where everything changed, where Windows stopped being the driver for PC sales, I’d have to point at Windows Vista.

That was the point where Microsoft and the PC OEMs stopped trusting each other. Microsoft made a bet on PC hardware and capabilities, and the PC industry pulled the rug out from under it, forcing the mess that was Vista Basic on users as they tried to sell cheap PCs with old graphics hardware.

That meant Microsoft had to change. It couldn't make that same bet on hardware anymore. It didn't trust OEMs to deliver on the promises the silicon vendors were making (and if we look at the initial Windows 8 hardware, it's pretty clear it was right to make that decision). So it made the software better instead.

New releases of Windows would need fewer resources, offer better performance, and (particularly important to mobile users) use less power.

So we shouldn't have been surprised when Windows 7 came along, bringing all that better performance on the same hardware. There wasn't a reason to buy a new PC for a new Windows any more.

We could just buy a cheap upgrade and get more life from our PCs. My Vista-era desktop systems got a performance bump because the software got better, taking advantage of the older hardware. I didn't need new PCs, I didn't even need a new graphics card.

I only bought my current PC last year because a hardware failure fried the Vista machine's motherboard. If I hadn't had a hardware failure I suspect I'd still be using that PC today.

The new machine has the same hard disks, even the same graphics card, using the same multi-monitor setup as that original Vista-era machine. It wasn't any faster, but it got another performance bump when I upgraded it to Windows 8 last summer. We even saw significant improvements on XP-era test hardware.

So yes, that means Windows 8 is one thing that's to blame for a slow-down in PC sales. You don't need a new PC to see a benefit from it, especially when you're getting a 10 percent speed bump over Windows 7 running on Vista-era hardware, and an extra hour or so battery life on a three year old laptop.

A cheap upgrade download and your old PC gets a new lease of life. Why do you need to spend several hundred pounds or dollars for extra performance when it comes with an operating system upgrade for a fraction of the cost?

So if our software gets better on older hardware, so what about all that new hardware? 

See full story at link.