25 Jul 13

Make stuff that just works, or go home.

“This is what customers pay us for – to sweat all these details so it’s easy and pleasant for them to use our computers. We’re supposed to be really good at this. That doesn’t mean we don’t listen to customers, but it’s hard for them to tell you what they want when they’ve never seen anything remotely like it.” – Steve Jobs

The job of the software developer and the hardware engineer is to make experiences. They deliver these experiences for users by building things – software, devices, and the marriage of the two. Sounds easy enough. The hard part is making those things into experiences that “just work”.

Long ago, I started a blog post about my (now returned) Leap Motion controller, and stalled out on it. Listen – I don’t want to pile on a small company trying to make their own way. I think that Leap Motion has done some things very well. They made a good-looking device, with nice packaging and even a pretty darn good (low-friction) application store. But when the rubber meets the road, the device… just didn’t work how I expected it to. I’ve got a more general post I’m mulling over for the next few weeks, where I’ll talk about that, as well as Kinect for Windows/Kinect in general.

Where the Leap Motion controller falls down, though, is actually in use. There are promises that their marketing materials make that simply don’t work in real life.

A few years ago, my youngest daughter asked for a toy (for either Christmas or her birthday) that she had seen during the little bit of TV she watches that had ads. Upon opening it, she seemed underwhelmed. When I asked her what was wrong, she noted, “It’s smaller than I thought it would be, and this thing doesn’t really work.” She was pointing at a pretend water faucet, if I recall correctly.

There are few things more disappointing – for a grownup or a child – than when something we spend our money on, or get as a gift, doesn’t work the way we’re led to believe it should – or even worse, the way we’ve been told it will.

As I was boxing up my Leap Motion, a colleague came by and asked me to take a quick look at their computer. Unfortunately, whether it’s a colleague, a parent, or a friend, that’s rarely something fun. Indeed, something had happened when they rebooted their computer, and it had somehow lost the trust relationship with the domain (and as a result, my colleague couldn’t log on). If that sounds gross and confusing to you, it should. It’s Windows showing you how the sausage is made. There’s an easy fix: unplug the machine from the network, and Windows logs on using cached credentials. From there, you can do a bunch of things (including jamming the metaphoric sausage back into the casing) – but it will almost always let you fix the problem at hand. This hack is how Windows has worked for years when this scenario happens. My question? Why doesn’t Windows just say, “You may be able to log on to your computer and fix the problem if you unplug the network connection.”
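For the curious, that broken trust relationship can be spotted from the command line with Windows’ `nltest /sc_query:<DOMAIN>` tool. Here’s a minimal sketch of how a helper script might classify that output – the sample strings and the substring heuristic are my own illustrative assumptions, not the actual logic Windows uses:

```python
# Sketch: classify the output of `nltest /sc_query:<DOMAIN>` to decide whether
# a machine's secure channel to the domain is healthy. The sample strings
# below are illustrative, not captured from a real machine.

def secure_channel_ok(nltest_output: str) -> bool:
    """Return True if the secure-channel query reports success."""
    # nltest prints "The command completed successfully" when the channel is
    # healthy; a broken trust typically surfaces as an error status such as
    # 0xC000018D (STATUS_TRUSTED_RELATIONSHIP_FAILURE).
    return "completed successfully" in nltest_output.lower()

healthy = "Trusted DC Name \\\\DC01.example.com\nThe command completed successfully"
broken = "I_NetLogonControl failed: Status = 5 0x5 ERROR_ACCESS_DENIED"

print(secure_channel_ok(healthy))  # True
print(secure_channel_ok(broken))   # False
```

A script like this could tell the user what’s wrong in plain language – which is really the point of the paragraph above.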

Shame on me for not thinking of that solution 15 years ago and submitting a bug on it – and driving it to a fix.

I’ve been looking at a new book from Addison-Wesley called Learning iOS Design. The timing of the book may not be great, given the iOS 7 design refresh, but a lot of the principles still apply. It’s a good read so far, and most importantly, is really thoughtful in terms of sincerity of design. As I first thumbed through the book, I happened to land almost immediately on page 146. On that page is a section entitled:

The Moment of Uncertainty

It reads: “A crucial instant occurs every time a user provides a bit of input to a piece of technology … if the designers have done their part to create a graceful experience, then the instant will pass without notice by the user. The technology responds, the user’s expectation is met, and he continues getting done whatever it is he’s using the app to get done.”

This text, and much of this book, resonates with me. Over the last few years, I’ve begun looking around at the technology in my house, my workplace, and my life as a whole. Hardware, software, apps, appliances, everything. If something makes my life easier, I keep it. If it makes my life more tedious than if the item wasn’t there, it gets tossed. (I’m looking at you, Sony alarm clock on my nightstand.)

Imagine a device or an application. As it is designed, we can consider the potential discomfort (let’s just call it pain) of using this thing. The goal of a good designer is to eat the pain for the user. Imagine a horizontal slider where the pain of an interface can be felt alternately by the user or by the designer. Iterations, refinement, and tossed prototypes all go into building an end result that pulls the slider away from the user and toward the designer. The work the designer performs up front leads to a more efficient, predictable, and enjoyable experience for the end user.

Arthur C. Clarke said, “Any sufficiently advanced technology is indistinguishable from magic.” The trick is to make magic where people can’t see the strings, and can’t see the cards up your sleeve.

I’ve mentioned this video before, but I think two of Jonathan Ive’s quotes in it are really relevant here:

“A lot of what we seem to be doing in a product like that (the iPhone) is getting design out of the way.”

and secondarily:

“It’s really important in a product to have a sense of the hierarchy of what’s important and what’s not important by removing those things that are all vying for your attention.”

I’ve said on Twitter before that it’s wrong that we’ve let people tell themselves that they’re stupid when the computer doesn’t work the way they expect it to. When you turn on the hot water and it comes out cold, does that make you stupid? No. So why do we put up with this from computers, devices, and software/apps?

I believe that, in the end, the technologies that win will be those that just work for users. Those that are predictable, easy, and enjoyable to use will win with consumers. Devices and software that are not innately focused on the needs of the end user will not survive; they will be discarded in favor of solutions that are.


10 Apr 13

Windows XP – Hitting the Wall

Just under one year from now, on April 8, 2014, Windows XP leaves Extended Support.

There are three key questions I’ve been asked a lot during the past week, related to this milestone:

  1. What even happens when Windows XP leaves Extended Support?
  2. Will Microsoft balk, and continue to support Windows XP after that date?
  3. What will happen to systems running Windows XP after that date?

All important questions.

The first question can be exceedingly complex to answer. But for all intents and purposes, the end of Extended Support means that you will receive absolutely no updates – including security updates – after that date. There are some paid support options for Windows XP after 4/8/2014; however, as we understand it, they will be very tightly time-limited, very expensive, and implemented with a contractual, date-driven expectation that the organization retires its remaining Windows XP desktops. There’s no “get out of jail” card, let alone a “get out of jail free” card. If you have Windows XP desktops today, you have work to do, and it will cost you money to migrate away.

If you want to look for yourself, you can go to Microsoft’s downloads site and look – but Windows XP still receives patches for both Windows itself and Internet Explorer (generally versions 6, 7, and 8 all get patched) every month. From April 2012 to April 2013, every month saw security updates to either Windows XP or IE on it – and 8 of the 13 months saw both. Many of these are not pretty vulnerabilities, and if left unpatched, they could leave targeted organizations exceedingly vulnerable after that date.
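To make that cadence concrete, here is a small tally in the spirit of the counts above. The per-month flags are hypothetical placeholders shaped to match the figures in the text (13 months, every month at least one of XP or IE patched, 8 months with both) – they are not the actual security-bulletin history:

```python
# Illustrative tally of the patch cadence described above. The per-month
# flags are hypothetical placeholders, not the real bulletin data; they are
# shaped only to match the counts given in the text.

months = [
    # (month, xp_patched, ie_patched)
    ("2012-04", True, True),  ("2012-05", True, False), ("2012-06", True, True),
    ("2012-07", True, True),  ("2012-08", True, False), ("2012-09", False, True),
    ("2012-10", True, True),  ("2012-11", True, True),  ("2012-12", True, True),
    ("2013-01", False, True), ("2013-02", True, True),  ("2013-03", True, False),
    ("2013-04", True, True),
]

either = sum(1 for _, xp, ie in months if xp or ie)
both = sum(1 for _, xp, ie in months if xp and ie)
print(f"{either} months patched either, {both} patched both")
# prints "13 months patched either, 8 patched both"
```

Every one of those monthly updates stops arriving after 4/8/2014 – that is the whole problem.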

This leads us to the second question. In a game of chicken, will Microsoft turn and offer support after 4/8/2014?

Why are you asking? Seriously. Why? I was on the team that shipped Windows XP. I wish that, like a work of art, Windows XP could be timeless and run forever. But it can’t (honestly, that theme is starting to get rather long in the tooth too). It’s a piece of machinery – and machinery needs maintenance (and after a time, it usually needs replacement). Windows 2000 received its last patch the month before it left Extended Support. So, while 4/8/2014 is technically a Patch Tuesday, and Microsoft might give you one last free cup of joe, I’d put a good wager down that if you want patches after that day, you’d better plan your migration, get on the phone to Microsoft relatively soon, get a paid support contract in place, and be prepared to pay for the privilege of support while you migrate away.

Companies running Windows XP today – especially in any sort of mission-critical or infrastructure scenario, and especially if connected to the Internet – need a plan to migrate to a supported operating system.

At a security startup I worked at not that long ago, it shocked me how many of our prospects had Windows 2000, Windows NT, or even older versions of NT or 9x in production (and often connected to networks or the Internet). Even more terrifying, many of these were mission-critical systems.

And this segues us to the third question. What happens to systems still running Windows XP after 4/8/2014? You can quote Clint Eastwood’s “Dirty Harry” character: “Do I feel lucky? Well, do you?” It’s not a good bet to make. Again, we’ve seen some nasty bugs patched in IE 6, 7, and 8, and in Windows XP itself, over the last year. While one would hope an OS 12 years on would be battle-hardened to the point of being bulletproof, that is not the case. Windows XP isn’t bulletproof. It’s weary. It’s ready to be retired. Organizations with critical infrastructure roles still running Windows XP will have giant targets on them after next April, and no way to defend those systems.

A common thread I’ve also seen is a belief that a wave of Windows XP migrations over the next 12 months will mean something, economically. It really isn’t likely to. While we will likely see a good chunk of organizations move away from Windows XP over the next year, doing so may mean finding budget to replace 5+ year-old PCs and to patch, update, or replace Windows, Java, and Web applications so they can run on newer operating systems. Most of the easy lifting has already been done. The customers remaining are likely extremely hard cases, extremely “financially challenged”, or both. It may be unfortunate, but this time next year (and likely the year after that, and years after that), there will still be Windows XP systems out there, some of them running in highly critical infrastructure. Dangerous, but unfortunately, likely to be the case.


27 Mar 13

The Stigma of Mac Shaming

I recall hearing a story of a co-worker at Microsoft, a technical assistant to an executive, who had a Mac. It wouldn’t normally be a big deal, except that because he worked directly for an executive, this Mac was seen in many meetings across campus – its distinct aluminum body and fruity ghost shining through the lid a constant reminder that this was one less PC sold (even if it ran Windows through Boot Camp or virtualization software). Throughout most of Microsoft, there was a strange culture of “eww, a Mac”. Bring a Mac or an iPod to work, feel like an outcast. This was my first exposure to Mac Shaming.

I left Microsoft in 2004, to work at Winternals in Austin (where I had the last PC I ever really loved – a Toshiba Tecra A6). In 2006, on the day Apple announced Boot Camp, I placed an order for a white Intel iMac. This was just over three months before Winternals was acquired by Microsoft (but SHH… I wasn’t supposed to know that yet). This was my first Mac. Ever.

Even though I had worked at Microsoft for over 7 years, and was still writing for Microsoft’s TechNet Magazine as a monthly Contributing Editor, I was frustrated. My main Windows PC at home was an HP Windows XP Media Center PC. Words cannot express my frustration with this PC. It “worked” as I originally received it – but almost every time it was updated, something broke. All I wanted was a computer that worked like an appliance. I was tired of pulling and pushing software and hardware to try to get it to work reliably. I saw Windows Vista on the horizon and… saw little hope of coming to terms with using Windows much at home. It was a perfect storm – me being extremely underwhelmed with Windows Vista, and the Mac supporting Windows so I could dual-boot into Windows as I needed to in order to write. And so it began.

Writing on the Mac was fine – I used Word, and it worked well enough. Running Windows was fine (I always used VMware Fusion), and eventually I came to terms with most of the quirks of the Mac. I still try to cut and paste with the Ctrl key sometimes, but I’m getting better.

A year later, I flipped from a horrible Windows CE “smartish” phone from HTC on the day that Apple dropped the price of the original iPhone to $399. Through two startups – one a Windows security startup, the other a Web startup – I used two 15″ MacBook Pros as my primary work computer: first the old stamped MBP, then the early unibody.

For the last two years, I’ve brought an iPad with me to most of the conferences I’ve gone to – even Build 2011, Build 2012, and the SharePoint Conference in 2012. There’s a reason for that. Most PCs can’t get you on a wireless network and keep you connected all day, writing, without needing to plug in (time to plug in, or plugs to use, being a rarity at conferences). Every time I whipped out my iPad and its keyboard stand with the Apple Bluetooth keyboard, people would look at me curiously. But quite often, as I’d look around, I’d see many journalists or analysts in the crowd also using Macs or iPads. The truth is, tons of journalists use Macs. Tons of analysts and journalists who cover Microsoft even use Macs – many as their primary device. But there still seems to be this weird ethos that you should use Windows as your primary device if you’re going to talk about Windows. If you are a journalist and you come to a Microsoft meeting or conference with a Mac, there’s all but guaranteed to be a bit of an awkward conversation if you bring it out.

I’m intimately familiar with Windows. I know it quite well. Perhaps a little too well. Windows 8 and I? We’re kind of going in different directions right now. I’m not a big fan of touch. I’m a big fan of a kick-ass desktop experience that works with me.

Last week, my ThinkPad died. This was a week after my iMac had suffered the same fate, and I had recovered it through Time Machine. Both died of failed Seagate hard drives. I believe there was something deeper going on with the ThinkPad, as it was crashing regularly. While it was running Windows 8, I believe it was the hardware failing, not the operating system, that led to this pain. In general, I had come to terms with Windows 8. Because my ThinkPad was touch, it didn’t work great for me, but it worked alright – though I really wasn’t using the “WinRT side” of Windows 8 at all; I had every app I used daily pinned to the Taskbar instead. Even with the Logitech t650, I struggled with the WinRT side of Windows 8.

So here, let me break this awkward silence. I bought another Mac to use as my primary writing machine: a 13″ Retina MacBook Pro. Shun me. Look down upon me. Shake your head in disbelief. Welcome to Mac shaming. The machine is beautiful, and has a build quality that is really unmatched by any other OEM. A colleague has a new Lenovo Yoga, and I have to admit, it is a very interesting machine – likely one of the few out there that I’d really consider – but it’s just not for me. I also need a great keyboard, and the list of Windows 8 slates that compromise the keyboard in order to be tablets is long. I had contemplated getting a Mac for myself for some time. I still have a Windows 8 slate (the Samsung), and will likely end up virtualizing the workloads I really need in order to evaluate things.

My first impression is that, as an iPad power user (I use iOS gestures a lot), it’s frighteningly eerie how powerful that makes one on an MBP with Mountain Lion and fullscreen apps. But I’ll talk about that later.

I went through a bit of a dilemma about whether to even post this, due to the backlash I expect. Post your thoughts below. All I request? I invoke Wheaton’s Law at this point.


17 Oct 12

VDI? OMG.

For two days last week, I was at the annual Chicago installment of our Microsoft Licensing Boot Camp. I’ve been to several of our camps to help present a couple of the topics. I’ve also noticed something unusual (and somewhat frightening) occurring.

What I’ve seen is the growth of – or at least growth of the interest of – Virtual Desktop Infrastructure (VDI). In VDI, the desktop operating system that a user interacts with is virtualized (and often remotely located) rather than being a desktop PC or even a laptop with Windows that the user runs locally. The theory is that by virtualizing, you can centralize deployment, management and servicing, spin VMs up or down as you need them, and sometimes use layering technologies to make this management more efficient. In an environment where you task users with buying/bringing their own work PC, VDI also gives you a way to secure the user’s work environment by providing a common image to all users, secured through RDP.

I say theory because, barring dramatic improvements in how Windows handles state separation (user/app/OS), layering technologies are fraught with peril. Perhaps some of Citrix’s offerings, or other companies I haven’t seen, have unwound the Windows state problem and really enabled efficient virtualization that isn’t just N VMs for N users. As I’ve never seen otherwise, though, I’m inclined to believe that VDI – and virtualization as a whole – saves you money on hardware but does not save you nearly as much in terms of deployment, management, and servicing as you might think. With client VDI in particular, you had 8 physical systems horizontally – now you have 8 virtual systems stacked vertically. Hope you’ve chosen a good hypervisor and a clustered server to run it on, so those virtual desktops have high availability.
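The “stacked vertically” point can be made with a back-of-envelope calculation. Every figure below is an illustrative assumption of mine (not vendor sizing guidance), but it shows why per-user VMs eat server RAM in a way that shared Remote Desktop sessions don’t:

```python
# Back-of-envelope RAM comparison for the "N VMs for N users" problem:
# a full desktop OS per user (VDI) versus shared sessions on one host.
# All figures are illustrative assumptions, not vendor sizing guidance.

users = 100
vdi_vm_ram_gb = 2.0       # assumed: a full desktop OS instance per user
rds_base_ram_gb = 8.0     # assumed: one shared session-host OS
rds_session_ram_gb = 0.5  # assumed: incremental RAM per remote session

vdi_total_gb = users * vdi_vm_ram_gb
rds_total_gb = rds_base_ram_gb + users * rds_session_ram_gb

print(f"VDI: {vdi_total_gb:.0f} GB, Remote Desktop: {rds_total_gb:.0f} GB")
# prints "VDI: 200 GB, Remote Desktop: 58 GB"
```

Under these assumptions the VDI farm needs several times the memory of a session host for the same users – before you’ve bought the hypervisor, the cluster, or the licensing.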

VDI has this certain ring to it. If you’ve been in IT, you know the sound. It’s the sound of a technology your CIO asked you to investigate because he heard from another CIO on the golf course, “Wait. You haven’t deployed VDI yet?” Yes, it’s a bright shiny object (BSO) with untold perils if you don’t license it properly.

In NYC, when we asked who was looking at doing VDI, two – maybe three – people raised their hands. In Chicago, it was easily 85% of the room either looking at it or doing it now. In NYC, an attendee quietly asked me, “Why would someone ever do VDI instead of Remote Desktop?” Logical question, given Remote Desktop’s easily understood – and enforced – licensing, highly scalable architecture (far more users in far less space, RAM, and processor utilization), fault tolerance, etc. I quietly replied back, “I have no idea.” In Chicago, when we had wrapped, an attendee walked up and basically asked me the same thing. He wanted me to help him understand why people are so in love with VDI. I told him, much like in NYC, “I don’t understand it either.”

VDI isn’t cheap. It’s definitely not free. While you can theoretically remove Windows desktops as the client endpoint and use an RDP dumb terminal (or an iPad), you face licensing complexities as a direct result of doing so.

Microsoft is a better chess player than you are when it comes to licensing. Depending on what you access a Windows VDI system from (using RDP from a user-owned Windows laptop, for example), sometimes you may have, or may not have, properly licensed the client system to ever connect. There’s no magical licensing to prevent you from doing the wrong thing – only the potential penalty of an audit for not having done so correctly. What I’m saying is, there are some huge licensing qualifications that you have to work through in order to implement VDI with the Windows desktop, and not understanding them before you ever look into implementing VDI is kind of like asking Felix Baumgartner to jump from his capsule without ever doing any sort of testing. You could very easily wind up hurting yourself.

As to using an iPad as a VDI client, I’m really confused as to who (if anyone, actually) does this. Accessing Win32 applications from an iPad is akin to torture. It’s a sub-10″ screen, with touch only, no mouse, and a soft keyboard. What kind of tasks are you asking users to perform with this? Either move the task to a proper task-optimized Web app or iOS app, or give them a proper desktop or laptop system on which to perform their task. I may well dive into this topic in a future post. Sure to generate some conversation.

Are you using VDI? Do you understand the licensing of Windows, Office, and every other software component you’re using? Do you disagree with me that VDI is just a BSO (and believe that you’re saving tons of money with it)? Let me know what you think.


07 Feb 11

Hey kids, let’s go to Dubuque!

When you travel somewhere, especially somewhere new, somewhere eclectic – do you ever buy your airline ticket, hop on the plane, and eagerly look forward to planning your activities once you arrive?

No. No, you don’t. You plan a trip, buy tickets, and get everything lined up long before you go. It’s been my contention for some time that buying a new computing device – smartphone, tablet/slate, or other – is just like taking a trip. Also, unlike years ago, when a new computer was guaranteed to come with Windows and run all the old apps that for some reason we hang on to like hoarders on a TV show, today’s new devices come with a Baskin-Robbins assortment of operating systems – none of which will run Windows applications as-is (and that’s fine, as long as enough other apps are actually available for the device being considered).

With all due respect to the people of Dubuque, I call the act of buying a device without regard to how you’ll actually use it “taking a trip to Dubuque”. I have been to Dubuque once, briefly while moving cross-country, but I can’t speak with authority as to the activities that avail themselves there (I’m sure there are some fun and interesting things to do). But having come from a similarly small town in Montana with a less catchy name, Dubuque works better as a destination that you’re going to want to plan for before you arrive, or you might be a little bored.

I was a fan of Microsoft’s Tablet PC platform when it first came on the scene – in fact, my main computer at Microsoft for almost two years was a Motion Computing “slate” device (not a convertible, though I did order a Motion USB keyboard too). Unfortunately, my experience was that handwriting recognition, though handy, wasn’t perfect – and with my horrible handwriting, it resulted in an archived database of my handwriting, not anything searchable or digitally usable. In essence, OneNote and a few drawing applications (I didn’t have Photoshop, but surely it would be useful as well) were the only real applications that took advantage of the Tablet PC platform. That hasn’t changed much. Today the main reason you’d buy a Tablet PC running Windows 7 is for pen input, not broad consumer scenarios (Motion Computing, which still makes great hardware, has become solely focused on medical and services verticals for exactly this reason). Though Windows 7 actually does have full multi-touch gesture support, most people don’t even know this – as witnessed during a recent webinar we had at work, where people asked when Microsoft would introduce a version of Windows with touch support (they already do!) – and few applications make the most of it. I haven’t tried using Microsoft Office 2010 with a touch-focused PC, but I can’t imagine it being a great fit. Office, to date, is written to be driven via a mouse (or a stylus acting as a proxy for a mouse). Touch requires a very different user interface design.

The iPad was successful from day 1 because it took advantage of the entire stable of iPhone applications, simply doubling their resolution (to varying success), and used that to cantilever into motivating developers to build iPad-optimized applications. No Android slate has established anywhere near the same market, most likely because of this aspect – when you get the device, what do you do with it? Sure, you’ll browse the web and check email. What else? How many consumers really want to pay $800+, plus data plans, for a device that can just check email and browse the web? That’s not very viable. Today, HP announced new, pretty good-looking all-in-one TouchSmart devices. Though one section of that article mentions them being consumer focused, the article ends with a fizzle, stating the systems are “designed with the ‘hospitality, retail, and health care’ industries in mind”. Yes, that’s right. Without a stable of consumer-focused multi-touch applications, devices like this, as great as they may sound at first glance, become just simple all-in-one PCs for most, and touch-based only when damned into a career within a vertical industry with one or more in-house applications written just for touch, which they’ll run day in and day out until the device is retired.

It’s quite unfortunate how touch hasn’t taken off in Windows. ISVs don’t write apps because there aren’t enough touch-based Windows computers and no way to monetize with the ease the Apple App Store has enabled, and yet people don’t buy touch-based Windows PCs for the same reason they don’t buy 3D TVs – it’s a trip to Dubuque. Like most consumers, I’m not going to buy a ticket there until we’ve got some clear plans for what we’re going to do on the trip.