09
Apr 14

Startups and Getting Things Done

A year or so ago, a good friend from Microsoft told me he was leaving the company and pondering a few ideas about what to do next. His ideas shared one common trait: he wanted to improve how people get things done, a desire I've highlighted in some blog posts before.

Working with a partner, he brainstormed a few ideas, and they homed in on the following use case:

When I post a job to a job board, my inbox gets inundated with resumes. The process of reviewing these is manual and painful, and makes me feel like I'm stuck in the 1990s. Isn't there any way to simplify this?

Their answer to that question is here at Jobvention.com, and I think it is pretty impressive (personally, I love apps that streamline tasks that are needlessly complex).

Put simply, Jobvention gives a hiring manager an easy workflow for processing job candidates. It links the job posting (from sites like Craigslist) and incoming resumes (currently from Gmail and Google Apps for Business) together within Jobvention. It also supports bulk upload of resumes, so you can easily process them through Jobvention, and bulk download of any resumes stored in the system for later reference.

When your Gmail inbox is synchronized with Jobvention, the app identifies messages matching the posting, links each one to the job posting, and processes the attached resumes, displaying them directly as text within the app. As a result, you don't have to download them and read them on your desktop (though you still can).
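
To make that workflow concrete, here is a toy sketch (mine, not Jobvention's) of the kind of inbox matching a service like this automates: it uses Python's standard imaplib and email modules to find messages whose subject mentions a job posting and list the resume attachments they carry. The IMAP host, account, credentials, and posting title are all hypothetical placeholders, and nothing below reflects Jobvention's actual implementation.

```python
import email
import imaplib

# Hypothetical placeholders; nothing here reflects Jobvention's actual implementation.
IMAP_HOST = "imap.gmail.com"
ACCOUNT = "hiring@example.com"
PASSWORD = "app-specific-password"
POSTING_TITLE = "Senior Developer - Austin"  # the job posting to match against

def resumes_for_posting():
    """Return (sender, attachment filename) pairs for messages matching the posting."""
    matches = []
    conn = imaplib.IMAP4_SSL(IMAP_HOST)
    conn.login(ACCOUNT, PASSWORD)
    conn.select("INBOX")
    # Find messages whose subject mentions the posting title.
    _, data = conn.search(None, "SUBJECT", f'"{POSTING_TITLE}"')
    for num in data[0].split():
        _, msg_data = conn.fetch(num, "(RFC822)")
        msg = email.message_from_bytes(msg_data[0][1])
        for part in msg.walk():
            filename = part.get_filename()
            if filename and filename.lower().endswith((".pdf", ".doc", ".docx")):
                matches.append((msg.get("From"), filename))
    conn.logout()
    return matches

if __name__ == "__main__":
    for sender, resume in resumes_for_posting():
        print(f"{sender}: {resume}")
```

Jobvention's value, of course, is doing all of this (plus extracting the resume text and tracking candidate state) without you having to write or maintain anything like the above.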

From there, candidates can easily be categorized as those you want to thank for their resume but decline, hold for later, or follow up with, according to how well they align with the job posting. Jobvention lets hiring managers send custom email to categorized candidates, whether to thank them or engage them for follow-up, and shows the email conversation history directly within Jobvention (but does not store the messages). See below for a screenshot of Jobvention in action.

[Screenshot: the Jobvention candidate pipeline]

Candidate resumes can also be kept for later reference if they might be a good fit for another posting down the line.

Today, Jobvention provides the key workflow stages needed to rapidly process potential job candidates. The service is currently free, and the team is iterating pretty rapidly as they continue to refine it. Personally, I wish I'd had Jobvention when I was hiring developers in Austin long ago, and I look forward to seeing how the team moves it forward.


07
Apr 14

The end is near here!

Imagine I handed you a Twinkie (or your favorite shelf-stable food item), and asked you to hold on to it for almost 13 years, and then eat it.

Aw, c’mon. Why the revulsion?

It’s been hard for me to watch the excited countdown to the demise of Windows XP. Though I did help ship Windows Server 2003 as well, no one product (or service) that I’ve ever worked on became so popular, for so long – by any stretch of the imagination – as Windows XP did.

Yet here we are, reading articles about which countries and companies are now shelling out $M to get support coverage for Windows XP for the next 1, 2, or 3 years (getting financially more painful as the year count goes up). It's important to note that this is no "get out of jail free" card. Nope. This is just life support for an OS with a terminal case of zero-days. These organizations still have to plan and execute a migration to a newer version of Windows that isn't on borrowed time.

Why didn't these governments and companies execute an XP evacuation plan? That's a very good question. Putting the question of blame aside for a second, there's a bigger issue to consider.

Go back and think of that Twinkie. Contrary to popular opinion, Twinkies don't last forever (most sources say it's about 25 days). Regardless, you get the idea: for most normal things, even shelf-stable isn't shelf-stable forever. Heck, even most MREs need to be stored at a reasonable temperature, and they will taste suboptimal after 5 or more years.

While I can perhaps excuse consumers who decide to hang on to an operating system past its expiration date, I have a harder time understanding how organizations and governments with any long-term focus sat by and let XP sour on them. It would be one thing if XP systems were all standalone and not connected to the Internet. Perhaps then we could turn a blind eye to it. But that's not usually the case; XP systems in business environments, which lack most of the security protections delivered later in Windows Vista, 7, and 8.x, are largely defenseless, and will be standing there waiting to get pwned as the vulnerabilities stack up after tomorrow. In my mind, the most dangerous thing is security vendors claiming to be able to protect the OS after April 8. In most cases, that's an all but impossible feat, and it instills a false sense of confidence in XP users and administrators.

The key concern I have is that people are looking at Windows XP as if software dying is a new thing, or something unusual. It isn't. In fact, tomorrow the entire spectrum of Office 2003 software (the Office productivity suite, SharePoint, Exchange, and more) also leaves support and could have its own set of security compromises down the road. But as I said, this isn't the first time software has entered an unsupportable realm, and it won't be the last. It's just a unique combination as we get the perfect storm of XP's pervasiveness, the ubiquity of the Internet, and the increasing willingness of bad people to do bad things to computers for money. Windows Server 2003 (and 2003 R2) is next, coming up in July of 2015.

People across the board seem to have this odd belief that when they buy a perpetual license to software, it can be used forever (versus Office 365, which people more clearly understand as a subscription that expires if not paid in an ongoing manner). But no software, even if “perpetually licensed”, is actually perpetual. Like that Twinkie I’ve mentioned a few times, even good software goes bad. As an industry, we need to start getting customers throughout the world to understand that, and get more organizations to begin planning software deployments as an ongoing lifecycle, rather than a one-time expense that is ignored until it goes terminal.


12
Mar 14

The trouble with DaaS

I recently read a blog post entitled DaaS is a Non-Starter, discussing how Desktop as a Service (DaaS) is, as the title says, a non-starter. I’ll have to admit, I agree. I’m a bit of a naysayer about DaaS, just as I have long been about VDI itself.

In talking with a colleague the other day, as well as customers at a recent licensing boot camp, it sure seems like VDI, like "enterprise social", is a burger with a whole lot of bun and not as much meat as you might hope for (given your investment). The promise, as I understand it, is that by centralizing your desktops, you get better manageability. To a degree, I believe that to be true. To a huge degree, I don't. It really comes down to how standardized you make your desktops, how centrally you manage user document storage, and how much sway your users have (are they admins, and can they install their own Win32 apps?).

With VDI, the problem is, well… money. First, you have server hardware and software costs; second, you have the appropriate storage and networking to actually execute a VDI implementation; and third, you have to spend the money to hire people who can glue it all together into an end-user experience that isn't horrible. It feels to me that a lot of businesses fall in love with VDI (true client OS-based VDI) without taking the complete cost into account.

With DaaS, you pay a certain amount per month, and your users can access a standardized desktop image hosted on a service provider’s server and infrastructure – which is created and managed by them. The OS here is actually usually Windows Server, not a Windows desktop OS – I’ll discuss that in a second. But as far as infrastructure, using DaaS from a service provider means you usually don’t have to invest the cash in corporate standard Windows desktops or laptops (or Windows Server hardware if you’re trying VDI on-premises), or the high-end networking and storage, or the people to glue that architecture together. Your users, in turn, get (theoretically) the benefits of VDI, regardless of what device they come at it with (a personally owned PC, tablet, whatever).

However, as with any *aaS, you're then at the mercy of your DaaS purveyor. In turn, you're also at the mercy of their licensing limitations where Windows is concerned. This is why most of them run Windows Server; it's the only version of Windows that can generally be made available by hosting providers, and Windows desktop OSs can't be. You also have to live within the constraints of their DaaS implementation (hardware/software availability, infrastructure, performance, architecture, etc.). To date, most DaaS offerings I've seen have focused on "get up and running fast!", not "we'll work with you to make sure your business needs are solved!".

Andre’s blog post, mentioned at the beginning of my post here, really hit the nail on the head. In particular, he mentioned good points about enterprise applications, access to files and folders the user needs, adequate bandwidth for real-world use, and DaaS vs. VDI.

To me, the main point is that with DaaS, your service provider, not you, gets to call a lot of the shots, and few providers consider the end-to-end user workflow necessary for your business.

Your users need to get tasks done, wherever they are. Fine. Can they get access to their applications that live on premises, through VDI in the cloud, from a tablet at the airport? How about their files? Does your DaaS require a secondary logon, or does it support SSO from their tablet or other non-company owned/managed device? How fat of a pipe is necessary for your users before they get frustrated? How close can your DaaS come to on-premises functionality, as if the user were sitting at an actual PC with an actual keyboard and mouse (or touch)?

On Twitter, I mentioned to Andre that Microsoft’s own entry into the DaaS space would surely change the game. I don’t know anything (officially or unofficially) here, but it has been long suspected that Microsoft has planned their own DaaS offering.

When you combine the technologies available in Windows Server 2012 R2, Windows Azure, and Office 365, the scenario for a Microsoft DaaS actually starts to become pretty amazing. There are implementation costs to get all of this deployed, mind you – including licensing and deployment/migration. That isn’t free. But it might be worth it if DaaS sounds compelling and I’m right about Microsoft’s approach.

Microsoft’s changes to Active Directory in Server 2012 R2 (AD FS, the Web Application Proxy [WAP]) mean that users can get to AD from wherever they are, and Office 365 and third party services (including a Microsoft DaaS) can have seamless SSO.

Workplace Join can provide that SSO experience, even from a Windows 7, iOS, or Samsung Knox device, and the business can control which assets and applications the user can connect to, even if those assets are inside the firewall and the user is not (through WAP, mentioned previously) or are available through another third party.

Work Folders enables user devices to synchronize files and folders that are stored on-premises in Windows file shares. This could conceptually be extended to work with a Microsoft (or third-party) DaaS as well, and I have to think OneDrive for Business could be made to work too, given the right VDI/DaaS model.

In a DaaS, applications the user needs could be provided through App-V, RemoteApp running from an on-premises Remote Desktop server (a bit of redundancy, I know), or again, published out through WAP so users could connect to them as if the DaaS servers were on-premises.

When you add in Office 365, it continues building out the solution, since users can again be authenticated using their AD credentials, and OneDrive for Business can provide synchronization to their work PCs and DaaS, or access on their personally owned device.

Performance is of course a key bottleneck here, assuming all of the above pieces are in place, and work as advertised (and beyond). Microsoft’s RemoteFX technology has been advancing in terms of offering a desktop-like experience regardless of the device (and is now supported by Microsoft’s recently acquired RDP clients for OS X, iOS, and Android). While Remote Desktop requires a relatively robust connection to the servers, it degrades relatively gracefully, and can be tuned down for connections with bandwidth/latency issues.

All in all, while I’m still a doubter about VDI, and I think there’s a lot of duct tape you’d need to put in place for a DaaS to be the practical solution to user productivity that many vendors are trying to sell it as, there is promise here, and given the right vendor, things could get interesting.


07
Mar 14

Henry Ford on watches

“As a lad he became expert as an amateur watchmaker. Disliking farm work because, “considering the results, there was too much work on the place,” he became an apprentice mechanic in Detroit, and repaired watches in a jewelry shop at night. He flirted with the idea of entering the watch manufacturing business on a large scale, “but I did not because I figured out that watches were not universal necessities.” His apprenticeship over, he served with the local representative of the Westinghouse Company, setting up and repairing their road engines.”

– Excerpt From Automotive Giants of America (iBooks)

Given the constant rumormongering about the iWatch, reading this (from a book written in 1926) amused me.


05
Mar 14

Considering CarPlay

Late last week, some buzz began building that Apple, alongside automaker partners, would formally reveal the first results of their "iOS in the Car" initiative. Much as rumors had suggested, the end result, now dubbed CarPlay, was demonstrated (or at least shown in a promo video) by initial partners Ferrari, Mercedes-Benz, and Volvo. If you only have time to watch one of them, watch the video of the Ferrari. Though it is an ad-hoc demo, the Ferrari video isn't painfully overproduced like the Mercedes-Benz video unfortunately is, and isn't just a concept video like the Volvo's.

The three that were shown are interesting for a variety of reasons (though it is also notable that all three are premium brands). The Ferrari and Volvo videos demonstrate touch-based navigation, and the Mercedes-Benz video uses what (I believe) is their knob-based COMAND system. While CarPlay is navigable using all of them, using the COMAND knob to control the iOS-based experience feels somewhat contrived or forced, like using an old iPod click wheel to navigate a modern iPhone. It just looks painful (to me that's a M-B issue, not an Apple issue).

Outside of the initial three auto manufacturers, Apple has said that Honda, Hyundai, and Jaguar will also have models in 2014 with CarPlay functionality.

So what exactly is CarPlay?

As I initially looked at CarPlay, it looked like a distinct animal in the Apple ecosystem. But the more I thought about it, the more familiar it looked. Apple pushing their UX out into a new realm, on a device whose final interface they don't own… It's sort of Apple TV, for the car. In fact, pondering what the infrastructure might look like, I kept getting flashbacks to Windows Media Center Extenders, which were remote thin clients that rendered a Windows Media Center UI over a wired or wireless connection.

Apple's CarPlay involves a cable-based connection (this seems to be a requirement at this point; I'll talk about it a bit later) which is used to remotely display several key functions of your compatible iPhone (5s, 5c, 5) on the head unit of your car. That is, the display is that of your auto head unit – but for CarPlay features, your iPhone looks to be what's actually running the app, and the head unit is simply a dumb terminal rendering it. All data is transmitted through your phone, not some in-car LTE/4G connection, and all of the apps reside, and are updated, on your phone, not on the head unit. CarPlay seems to be navigable regardless of the type of touch support your screen has (if it has touch), but it also works with buttons, and again, with knob-based navigation like COMAND.

Apple seems to be requiring two key triggers for CarPlay – 1) a voice command button on the steering wheel, and 2) an entry point into CarPlay itself, generally a button on the head unit (quite easy to see if you watch the Ferrari video, labeled APPLE CARPLAY). Of course, these touches are in addition to integrating the required Apple Lightning cable to tether it all together.

In short, Apple hasn't done a complete end run around the OEM – the automaker can still have their own UI for their own in-car functions, and then Apple's distinct CarPlay UI (very familiar to anyone who has used iOS 7) is there when you're "in CarPlay", if you will. It seems to me that CarPlay can best be thought of as a remote display for your iPhone, designed to fit the display of your car's entertainment system. Some have said that "CarPlay systems" are running QNX – perhaps some are. The head unit manufacturer doesn't really appear to be important here. The main point of all of this is that the OEM doesn't appear to have to do massive work to make it functional; it really looks to primarily involve integrating the remote display functionality and the I/O to the phone.

In fact, the UI of the Ferrari as demonstrated doesn't look to be that different from head units in previous versions of the FF (from what I can see). Also, if you watch the Apple employee towards the end, you can see her press the FF "app", exiting out to the FF's own user interface, which is distinctly different from the CarPlay UI. The CarPlay UI, in contrast, is remarkably consistent across the three examples shown so far. While the automakers all have their own unique touches and controls for the rest of the vehicle, the distinct things that the phone is, frankly, better at are done through the CarPlay UI.

The built-in iPhone apps supported with CarPlay at this point appear to be:

  • Phone
  • Messages
  • Maps
  • Music
  • Podcasts

The obvious scenarios here are making/receiving phone calls or sending/receiving SMS/iMessages with your phone’s native contact list, and navigation. Quick tasks. Not surfing or searching the Web while you’re driving. Yay! The Maps app has an interesting touch that the Apple employee chose to highlight in the Ferrari video, where maps you’ve been sent in messages are displayed in the list of potential destinations you can choose from. Obviously the CarPlay solution enables Apple’s turn-by-turn maps. If you’re an Apple Maps fan, that’s great news (I’m quite happy with them at this point, personally). If you like using Google Maps or another mapping/messaging or VOIP solution, it looks like you’re out of luck at this point.

In addition to touch, button, or knob-based navigation, Siri is omnipresent in CarPlay, and the system can use voice as your primary input mechanism (triggered through a voice command button on the steering wheel); Siri also reads text messages out loud to you and lets you respond to them. I use that Siri feature pretty often, myself.

The Music and Podcasts apps seem like obvious ones to make available, especially now that iTunes Radio is available (although most people either love or hate the Podcasts app). Just as importantly, Apple is making a handful of third-party applications available at this point. Notably:

  • Spotify
  • iHeartRadio
  • Stitcher

Though Apple’s CarPlay site does call out the Beats Music app as well, I noticed it was missing in the Ferrari demo.

Overall, I like Apple’s direction with this. Of course, as I said on Twitter, I’m so vested in the walled garden, I don’t necessarily care that it doesn’t integrate in with handsets from other platforms. That said, I do think most OEMs will be looking at alternatives and implementing one or more of them simultaneously (hopefully implementing all of them that they choose to in a somewhat consistent manner).

Personally, I see quite a few positives to CarPlay:

  • If you have an iPhone, it takes advantage of the device that is already your personal hub, instead of trying to reinvent it
  • It separates the things the manufacturer may either be good at or may want to control from the CarPlay UX. In short, Apple gets their own UX, presented reliably
  • It uses your existing data connection, not yet another one for the car
  • It uses one cable connection. No WiFi or BLE connectivity, and charges while it works
  • I trust Apple to build a lower-distraction (Siri-centric) UI than most automakers
  • It can be updated by Apple, independent of the car head unit
  • Apple can push new apps to it independent of the manufacturer
  • Apple Maps may suck in some people’s perspective (not mine), but it isn’t nearly as bad as some in-dash nav systems (watch some of Brian’s car reviews if you don’t believe me), and doesn’t require shelling out for shiny-media based updates!

Of course, there are some criticisms I or others have already mentioned on Twitter or in reviews:

  • It requires, and uses, iOS 7. Don’t like the iOS 7 UI? You’re probably not going to be a fan
  • It requires a cable connection. Not WiFi or BLE. This is a good/bad thing. I think in time, we’ll see considerate design of integrated phone slots or the like – push the phone in, flat, to dock it. The cables look hacky, but likely enable the security, performance, low latency, and integrated charging that are a better experience overall (also discourages you from picking the phone up while driving)
  • Apple Maps. If you don’t like it, you don’t like it. I do, but lots of people still seem to like deriding it
  • It is yet another Apple walled garden (like Apple TV, or iOS as a whole). Apple controls the UI of CarPlay, how it works, and what apps and content are or are not available. Just like Apple TV is at present. The fact that it is not an open platform or open spec also bothers some.

Overall, I really am excited by what CarPlay represents. I’ve never seen an in-car entertainment system I really loved. While I don’t think I really love any of the three head units I’ve seen so far, I do relish the idea of being able to use the device I like to use already, and having an app experience I’m already familiar with. Now I just need to have it hit some lower-priced vehicles I actually want to buy.

Speaking of that: Apple has said that, beyond the makers above, the following manufacturers have also signed on to work with CarPlay:

BMW Group (which includes Mini and Rolls-Royce), Chevrolet, Ford, Kia, Land Rover, Mitsubishi, Nissan, Opel, PSA Peugeot Citroën, Subaru, Suzuki, and Toyota.

As a VW fan, I was disheartened to not see VW on the list. Frankly I wouldn’t be terribly surprised to see a higher-end VW marque opt into it before too long (Porsche, Audi, or Bentley seem like obvious ones to me – but we’ll see). Also absent? Tesla. But I wouldn’t be surprised to see that show up in time as well.

It's an interesting start. I look forward to seeing how Google, Microsoft, and others continue to evolve their own automotive stories over the coming years – but I think one thing is for sure: the era of the phone as the hub of the car (and beyond) is just beginning.


03
Mar 14

Here’s a fun game… guess the executive

No Googling – that’s cheating. Tell me the executive (a former CEO) and the company. I’ve paraphrased a couple of parts that would give it away.

“<he’s> been the primary architect of a failed transformation of <the company> from its core <redacted> heritage to some expansive consumer-centric organization, which we think employees, <partners>, and investors have found to varying degrees to be somewhat incomprehensible.”


Answer key: The above is a redacted quote about Jacques Nasser, the ousted CEO of Ford.


17
Jan 14

Running Windows XP after April? A couple of suggestions for you

Yesterday on Twitter, I said the following:

Suggestion… If you have an XP system that you ABSOLUTELY must run after April, I’d remove all JREs, as well as Acrobat Reader and Flash.

This was inspired by an inquiry from a customer about Windows XP support that arrived earlier in the day.

As a result of that tweet, three things have happened.

  1. Many people replied "unplug it from the network!" [1]
  2. Several people asked me why I suggested doing these steps.
  3. I've begun working on a more comprehensive set of recommendations, to be available shortly. [2]

First off… Yes, it'd be ideal if we could just retire all of these XP systems on a dime. But that's not going to happen. If it were easy (or free), businesses and consumers wouldn't have waited until the last second to retire these systems. But there's a reason why they haven't. Medical/dental practices have practice management or other proprietary software that isn't tested/supported on anything newer; there's custom point-of-sale software from vendors that disappeared, were acquired, or simply never brought that version of their software forward… There's a multitude of reasons, and these systems aren't all going to disappear or be shut off by April. It's not going to happen. It's unfortunate, but there are a lot of Windows XP systems that will still be used for many years in many places, much as we'd all rather that not happen. There's no silver bullet for that. Hence, my off-the-cuff recommendations over Twitter.

Second, there’s a reason why I called out these three pieces of software. If you aren’t familiar with the history, I’d encourage you to go Bing (or Google, or…) the three following searches:

  1. zero day java vulnerability
  2. zero day Flash vulnerability
  3. zero day Acrobat vulnerability

Now if you looked carefully, each one of those, at least on Bing, returned well over 1M results, many (most?) of them from the last three years. In telling me that these XP systems should be disconnected from the Web, many people missed the point I was making.

PCs don't get infected from the inside out. They get infected from the outside in. When Microsoft had the "Security Push" over ten years ago that forced us to reconsider how we designed, built, and tested software, it involved stopping where we were and completely rethinking how Windows was built. Threat models replaced ridiculous statements like, "We have the very best xx encryption, so we're 'secure'". While Windows XP may be more porous than Vista and later are (because the company was able to implement foundational security even more deeply, engineer protections deeply into IE, and implement primordial UAC, for example), Windows XP SP2 and later are far less of a threat vector than XP SP1 and earlier were. So if you're a bad guy and you want to get bad things to happen on a PC today, who do you go after? It isn't Windows binaries themselves, or even IE. You go next for the application runtimes that are nearly as pervasive: Java, Flash, and Acrobat. Arguably, Acrobat may or may not be a runtime, depending on your POV. But the threat is still there, especially if you haven't been maintaining these as they've been updated over the last few years.

As hard as Adobe and Oracle may try to keep them patched, these three codebases have significant vulnerabilities that are found far too often. Those vulnerabilities, if not patched by vendors and updated by system owners incredibly quickly, become the primary vector for infecting both Windows and OS X systems by executing shellcode.

After April, Windows XP is expected to get no updates. Got that? NO UPDATES. NONE. Nada. Zippo. Zilch. You may still get antivirus updates from Microsoft and third parties, but at that point you honestly have a rotting wooden boat. I say this in the nicest way possible. I was on the team shipping Windows XP, and it saddens me to throw it under the bus, but I don't think people get the threat here. Antivirus simply cannot protect you from every kind of attack. Windows XP and the versions of IE it runs (6-8) have still regularly received patches almost every month for the past several years. So Windows XP isn't "war hardened"; it is brittle. And after April, you won't even get those patches trying to spackle over newly found vulnerabilities in the OS and IE. Instead, these will become exploit vectors ready to be hit by shellcode coming in off of the Internet (or even the local network) and turned into opportunistic infections.

Disclaimer: This is absolutely NOT a guarantee that systems won’t get infected, and you should NOT remove these or any piece of Microsoft or third-party software if a business-critical application actually depends on them or if you do not understand the dependencies of the applications in use on a particular PC or set of PCs! 

So what is a business or consumer to do? Jettison, baby. Jettison. If you can't retire the entire Windows XP system, retire every single piece of software on that system that you can, beginning with the three I mentioned above. Those are key connection points of any system to the Web/Internet; remove them and there is a good likelihood of lessening the infection vector. But the broader recommendation is to make jetsam of any software on those XP systems that you really don't need. Think of this as not traveling to a country where a specific disease is breaking out until the threat has passed. In the same vein, I'd say blocking Web browsers and removing email clients come in a close second, since they're such a great vector for social engineering-based infections today.
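
If you want a quick inventory of which XP machines still have these runtimes before you start jettisoning, here is a minimal sketch, assuming Python is available on the box (Python 3.4 was the last release to support XP). It only reads the Windows uninstall registry key with the standard winreg module and flags anything whose display name looks like a JRE, Flash, or Acrobat/Reader install; the name patterns are my own rough guesses, and, per the disclaimer above, verify application dependencies before you actually remove anything.

```python
import winreg

# Uninstall entries for installed software (32-bit view on a 32-bit Windows XP system).
UNINSTALL_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

# Rough name patterns for the three runtimes called out above (illustrative only).
SUSPECT_NAMES = ("java", "j2se", "flash player", "adobe reader", "acrobat")

def installed_programs():
    """Yield DisplayName strings found under the Uninstall registry key."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL_KEY) as root:
        subkey_count, _, _ = winreg.QueryInfoKey(root)
        for i in range(subkey_count):
            name = winreg.EnumKey(root, i)
            try:
                with winreg.OpenKey(root, name) as subkey:
                    display_name, _ = winreg.QueryValueEx(subkey, "DisplayName")
                    yield display_name
            except OSError:
                # Many uninstall entries have no DisplayName; skip them.
                continue

if __name__ == "__main__":
    for program in installed_programs():
        if any(pattern in program.lower() for pattern in SUSPECT_NAMES):
            print("Review/remove candidate:", program)
```

The actual removal is better done through Add or Remove Programs or whatever management tooling you already have; the point of the sketch is just to surface what is installed so nothing on the jettison list gets missed.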

Finally, as I mentioned earlier, I am working on an even more comprehensive set of recommendations, to be published for work in our next issue, which should be live on the Web during the last week of January. My first recommendation would of course be to retire your Windows XP systems as soon as possible, if at all possible. But I hope that this set of recommendations, while absolutely not a guarantee, can help some people as they move away, or finally consider how to move away, from Windows XP.

Footnotes

  1. Or unplug the power, or blow it up with explosives, or…
  2. These recommendations will be included in the next issue of Update.

14
Jan 14

What did I learn from Nest?

So today Google announced that they will pay US$3.2B for Nest Labs. Surely the intention here is to have the staff of Nest help Google with home automation, the larger Internet of Things (IoT) direction, and user interfaces. All three of these are, frankly, trouble spots for Google, and if they nurture the Nest team and let them thrive, it’ll be a good addition to Google. Otherwise, they will have wound up paying a premium to buy out a good company and lose the employees as soon as they can run.

In 2012, just after I received it, I wrote about my experience with the first generation Nest thermostat. When asked on Monday evening how I liked my Nest, I said:

It hasn’t exactly changed my life, but it has saved on energy costs, and it’s not hideous like most thermostats.

As I noted on Twitter as well, today's news makes me sad. I bought Nest because it felt like they truly cared about thoughtful design. I also got the feeling from the beginning that they genuinely cared about privacy.

Last year, I wrote the following about the dangers in relying on software (and hardware) that relies upon subscriptions:

Google exemplifies another side of this, where you can’t really be certain how long they will continue to offer a service. Whether it’s discontinuing consumer-grade services like Reader, or discontinuing the free level of Apps for Business, before subscribing to Google’s services an organization should generally not only raise questions around privacy and security, but just consider the long-term viability of the service. “Will Google keep this service alive in the future?” Perhaps that sounds cynical – but I believe it’s a legitimate concern. If you’re moving yourself or your business to a subscription service (heck, even a free one), you owe it to yourself to try and ascertain how long you’ve got before you can’t even count on that service anymore.

Unfortunately, my words feel prophetic now. If I’d known two years ago what I know today, maybe I’d have wavered more and decided against the Nest. Maybe not.

As I look back at Nest, it helps me frame the logic I’ll personally use when considering future IoT purchases. Ideally from now on, I’d like to consider instead:

  1. Buying devices with open APIs or open firmware. If the APIs or firmware of Nest were opened up, the devices could have had alternative apps built against them by the open-source community (to generally poor, but possible, effect). This is about as likely to happen now as Nest sharing their windfall with early adopters like myself.
  2. Buying devices with standards-based I/O (Bluetooth 4.0, Wi-Fi) and apps that can work without a Web point of contact. While a thermostat is a unique device that does clamor for a display, I think that most devices on the IoT should really have a limited, if any, display and rely on Web or smart phone apps over Wi-Fi or BT 4.0 in order to be configurable. Much like point 1, this would mean some way out if the company shutters its Web API.
  3. Buying devices from larger companies. Most of the major thermostat manufacturers are making smarter thermostats now, although aesthetically, most are still crap.
  4. Buying “dumb” alternatives. A minimalist programmable or simple non-programmable thermostat again.

In short, it’ll probably be a while before I spend money – especially premium money – on another IoT device.

Peter Bright wrote a great piece the other day on why "smart devices" are a disaster waiting to happen. Long story short, hardware purveyors suck at creating devices that stand any sort of chance of being updated. In many ways, the unfortunate practice we've seen with Android phones will likely become the norm with lots of embedded devices (in cars or major appliances). What seems so cool and awesome the day you buy a new piece of technology becomes frustrating as all hell when it won't work with your new phone, or requires a paid subscription for what used to be free.

In talking with a colleague today, I found myself taking almost a Luddite’s perspective on smart devices and the IoT. It isn’t that these devices, done right, can’t make our lives easier. It’s that we always must be wary of who we’re buying them from, whether they truly make our life easier or not, and what future they have. I’ve never been a huge believer in smart devices, but if designed considerately, I think they can be beneficial. As for me, I think the main thing I learned from Nest is to always consider the worst possible outcome of the startup I buy hardware from (yes, to me, Google was just shy of the worst possible outcome, which would have been seeing it shut down).

While I had hopes that Apple would buy Nest, as I noted on Twitter, that idea probably never really made sense. Nest made custom hardware and custom (non-Apple, of course) software that had far more to do with Google's software realm than Apple's. I also think that while the thermostat is a use case that lots of people "just get", I'm not sure the device fits well in Apple's world. While the simple UI of the Nest is very Apple-like, it doesn't seem like a war Apple would choose to fight. I think when it comes to home automation, Apple will stand back and let Bluetooth 4.0 interconnected home devices take the helm in the smart home, with iOS playing the role of conductor. I also had hopes that Nest could be bold and push the envelope of home automation beyond the hacky do-it-yourself approaches that existed for years before the Nest arrived, but I'm doubtful whether the Nest team will succeed with that at Google. I guess time will tell. It pains me to see Nest become part of Google, but I have to congratulate the Nest team on pushing the envelope as they did, and I hope for their sake and Google's that they can continue to push that envelope successfully from within Google.


05
Jan 14

Bimodal tablets (Windows and Android). Remember them when they’re gone. Again.

I hope these rumors are wrong, but for some odd reason, the Web is full of rumors that this year's CES will bring a glut of bimodal tablets: devices designed to run Windows 8.1, but also featuring an integrated instance of Android. But why?

For years, Microsoft and Intel were seemingly the best of partners. While Microsoft had fleeting dalliances with other processor architectures, they always came back to Intel. There were clear lines in the sand:

  1. Intel made processors
  2. Microsoft made software
  3. Their mutual partners (ODMs and OEMs) made complete systems.

When Microsoft announced the Surface tablets, they crossed a line. Their partners (Intel and the device manufacturers) were stuck in an odd place: continue partnering just with Microsoft (now a competitor to manufacturers, and a direct purveyor of consumer devices with ARM processors), or find alternative counterpoints to ensure that they weren't stuck in the event that Microsoft harmed their market.

For device manufacturers, this has meant what we might have thought unthinkable 3 years ago, with key manufacturers (now believing that their former partner is also a competitor) building Android and Chrome OS devices. For Intel, it has meant looking even more broadly at what other operating systems they should ensure compatibility with and evangelize (predominantly Android).

While the Windows Store has grown in terms of app count, there are still some holes, and there isn’t really a gravitational pull of apps leading users to the platform. Yet.

So some OEMs, and seemingly Intel, have collaborated on this effort to glue together Windows 8.1 and Android on a single device, with the hopes that the two OSs combined in some way equate to “consumer value”. However, there’s really no clear sign that the consumer benefits from this approach, and in fact they really lose, as they’ve now got a Windows device with precious storage space consumed by an Android install of dubious value. If the consumer really wanted an Android device, they’re in the opposite conundrum.

Really, the OEMs and Intel have to be going into this strategy without any concern for consumers. It’s just about moving devices, and trying to ensure an ecosystem is there when they can’t (or don’t want to) bet on one platform exclusively. The end result is a device that instead of doing task A well, or task B well, does a really middling job with both of them, and results in a device that the user regrets buying (or worse, regrets being given).

BIOS manufacturers and OEMs have gone down this road several times before, usually trying to put Linux either in firmware or on disk as a rapid-boot dual-use environment to "get online faster" or watch movies without waiting for Windows to boot/unhibernate. To my knowledge, the modes the OEMs provided on these devices were rarely actually used. Users hate rebooting, they get confused by where their Web bookmarks are (or aren't) when they need them, etc.

These kinds of approaches rarely solve problems for users; in fact, they usually create problems instead, and are a huge nightmare in terms of management. Non-technical users are generally horrible about maintaining one OS. Give them two on a single device? This will turn out quite well, don’t you think? In the end, these devices, unless executed flawlessly, are damaging to both the Windows and Android ecosystems, the OEMs, and Intel. Any bad experiences will likely result in returns, or exchanges for iPads.


29
Dec 13

My predictions for wearables in 2014

It’s the season for predictions, so I thought I’d offer you my predictions about wearables in 2014.

  1. Wearables will continue to be nerd porn in 2014 (in other words, when you say “wearable devices”, most normal people will respond, “what?”)
  2. Many wearable devices will be proposed by vendors.
  3. Too many of those will actually make it to market.
  4. A few of those will be useful.
  5. A handful of those will be aesthetically pleasing.
  6. A minute number (possibly 0) of those will actually be usable.