05
Apr 13

A few of my favorite movies (or: Hollywood – death by 1,000 paper cuts)

Over a year ago, I wrote a missive about how idiotic Hollywood’s “Rental Window” was. Luckily, things have changed, and now you can easily stream a movie from almost any streaming service the same day it comes out on Blu-ray and DVD.

I kid. You know that’s never going to happen. I recall as a kid, when we wanted to rent a movie on VHS, we had to go to the video store and see whether they a) carried it at all and b) had a copy on the shelf (or were all rented out). If you really wanted a movie, say for a child’s birthday party or similar, you might have to hit several video stores before you could find a copy (and hopefully it wasn’t the copy with the spot on the tape that was bent or demagnetized). TRACKING!!! SOMEBODY ADJUST THE TRACKING!

But I digress. My point is that more than 30 years have passed since that era began, yet in some ways we haven’t moved at all. I was out with some friends the other day and mentioned a few of my favorite movies. They asked for a list, and I thought – sure, and in fact I’ll add links to streamable versions where I can.

But trying to find movies to stream is chaos. There are dozens of streaming services, each with a different catalog. And some movies – either because they’re too old, or because the studio is holding them hostage for a later day – aren’t available on streaming at all, only on shiny media.

Enter Can I Stream It?, a service that attempts to tell you where you can stream (or rent) movies. It’s an IMDB of movie availability, and also has apps available for several platforms.

Here are some of my favorite drama or suspense movies:

Here are some of my favorite comedy movies (I accept no liability if you watch them and hate them):

30
Mar 13

The mythology of MinWin, MinKernel, and BaseFS

Beginning with Windows NT Embedded, an effort was started to refactor the Windows operating system in such a way that embedded device manufacturers could tweak it to include only the parts they needed to run their application or workload. Though it wasn’t completely decomposed/recomposable, it was a huge step in the right direction. By removing components not necessary to run a certain workload (say a point-of-sale terminal or a control system for manufacturing), the security footprint, as well as the overall footprint of the OS (sometimes CPU, but more often HDD/RAM and video), could be constrained, resulting in a device that was better at what it was supposed to do, while minimizing security and cost risks. Take a look at the claims on that Windows NT Embedded page – 9MB for a standalone OS runtime that didn’t include the Windows shell or networking.

This work kept progressing, and the next major release, Windows XP Embedded, took the effort – referred to as componentization – even further. It required the XP Embedded (XPE) team to go out across Windows and evangelize componentization, and to convince teams to help them break down their dependencies up and down the stack. You would be shocked to learn how intertwined the pieces of Windows were with each other. Nobody had, for the life of NT, ensured that components at the top levels of Windows had logically defined or clarified their dependencies down the stack. As a result, if you added one component of the Windows shell, you could wind up bringing in a gigantic stack of components. Add a certain runtime, and you needed to add a control panel (.cpl) file, because… well, it made sense at the time to put that library inside the .cpl file instead of another DLL.

At that same time during Windows XP (Windows Whistler) development, I was on the setup and deployment team, which was separate from Windows Embedded (but would later merge with it). I was the second program manager (I took the specs from the customers to the developers) for WinPE, also known as the Windows Preinstallation Environment. If you’ve never heard of Windows PE, imagine a tiny OS runtime, about 120MB, that could run in 96MB or less of RAM. It featured TCP/IP networking, NTFS and FAT disk access, and cmd.exe as its shell. Most importantly, WinPE booted from CD, and was intended to replace MS-DOS for the pre-installation needs of Windows, whether you needed to kick off a scripted install of setup or throw down an image to disk. WinPE was created primarily because the new Itanium platform had no deployment OS to help OEMs deploy Windows to their hardware. While Windows on Itanium died, WinPE became the underpinning of effectively every deployment and recovery tool in Windows.

I mention WinPE for one reason – I personally lived the XPE team’s pain of trying to understand dependencies. Not long after we revealed WinPE to OEMs, they came back with wish lists of things they wanted added – including a Web browser. Eventually, I was able to talk them down from a full Web browser, because what they really wanted was HTML Applications (HTA). I also hacked together Windows Script Host support and ADO connectivity to SQL Server. You cannot imagine the work I did to find the dependencies for those three components – which had not been defined individually in a way I could just grab and reuse. Finding someone who knew IE internals to the level I needed was almost impossible.
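
To give a rough sense of what that dependency spelunking looks like, here’s a minimal sketch – in Python, using the third-party pefile library, not the internal tooling we actually had – that walks a binary’s static import table to see what it drags in. The starting DLL and paths are illustrative assumptions on my part:

```python
# A minimal sketch of naive dependency walking over PE import tables.
# The starting DLL and the System32 path are hypothetical examples,
# not the actual components discussed above.
import os
import pefile  # third-party: pip install pefile

SYSTEM_DIR = r"C:\Windows\System32"

def direct_imports(path):
    """Return the DLL names a PE binary imports directly."""
    pe = pefile.PE(path)
    try:
        return [entry.dll.decode().lower()
                for entry in getattr(pe, "DIRECTORY_ENTRY_IMPORT", [])]
    finally:
        pe.close()

def walk_dependencies(root_binary):
    """Breadth-first walk of static import dependencies."""
    seen, queue = set(), [root_binary.lower()]
    while queue:
        name = queue.pop(0)
        if name in seen:
            continue
        seen.add(name)
        full_path = os.path.join(SYSTEM_DIR, name)
        if not os.path.exists(full_path):
            continue  # API sets, SxS redirection, etc. aren't resolved here
        for dep in direct_imports(full_path):
            if dep not in seen:
                queue.append(dep)
    return seen

if __name__ == "__main__":
    for dll in sorted(walk_dependencies("mshtml.dll")):
        print(dll)
```

A walk like this only sees static imports – it misses delay-loads, COM activation, and anything pulled in at runtime via LoadLibrary – which is exactly why mapping real Windows components was so painful, and why finding a human who knew the internals was so often the only answer.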

So what does this have to do with MinWin? Not long after Windows XP wrapped, there was a goal to come up with a minimal version of Windows (MinWin) that could serve as the centerpiece of every version of Windows. Whether you were building Windows Server, a Windows client release, or even WinPE, MinWin was the idea of a bootable version of Windows that featured the core components of the OS. I guess you could think of it as the quark of Windows. A common misstatement I’ve heard is that “MinWin is the minimal Windows kernel”, or anything else relating just to the kernel. It is more than just the kernel. MinWin, like the development of Windows on ARM that also occurred while I was at Microsoft, was never officially discussed outside the company, other than through a few leaks. As a result, there was always much confusion about it. Most importantly, MinWin was never “a thing”. You would never be able to buy MinWin by itself; it was always intended to be the core of every version of Windows. In essence, it wasn’t about making a thing, it was about refactoring what we already had to make it more nimble – both to continue growing embedded scenarios for Windows and to enable products like today’s Windows Server Core, as well as the current version of WinPE.

I encourage you to keep that in mind as you read this piece on MinKernel and BaseFS. While I have not talked with anyone inside Microsoft about either of these projects, I can only assume that they are the continuation of the refactoring of Windows. The Windows kernel was, for a long time, much more tightly regulated than most of Windows. However, it still grew “big-boned” over time, as things like Win32, networking, video, .NET, WinSxS, and now WinRT required the kernel to be constantly updated – in a completely monolithic form. That would be fine if every device required everything that was in the kernel. But not every device does. Windows Phone 8 likely includes at least some aspects of Win32, even though it will never run a full legacy Win32 application. So here again, my assumption is that the MinKernel project isn’t anything new per se; rather, it is a modular, rather than monolithic, approach to building the Windows kernel. The result would be a layer of native-mode pieces that can be combined as needed for numerous platforms (Windows Phone 8+, Windows 8+, Windows RT+, Windows Azure, and even potentially the next-generation Xbox and other hardware we aren’t aware of yet). The same could exist in the BaseFS realm. NTFS includes a ton of baggage, much of which is used by only a small number of customers. Many of the NTFS features I’m alluding to as baggage didn’t even make it into the first public iteration of ReFS, which attempts to replicate much – but not all – of the functionality of NTFS. So what could BaseFS be? Likely a minimal NTFS, a minimal ReFS, or perhaps a shared driver for both, implementing a minimal set of filesystem functionality.

One can imagine this kind of refactoring resulting not only in a well-factored and malleable kernel and filesystem driver, but also in the kind of thin working set that might be ideal for running Windows Azure or Windows Hyper-V Server, where the role of the server is solely to be a host. I’m guessing here, and time will tell if I’m right or if I’ve missed some greater scenario. But by and large, much as MinWin was never a tangible thing, I’m not convinced that BaseFS or MinKernel will ever be something that even power users get to see directly.


28
Mar 13

Theology… theology… theology…

Feedback on yesterday’s post, both here and on Twitter, seemed to generally be relatively uniform. Not so much divisive, but more along the lines of, “You think you’ve got it bad? Try bringing a Windows PC to a Mac environment.”

You all bring up a fair point. Personally, I find it amusing that I know of not one, but two technology journalists who at one time or another covered the religion beat on a local newspaper. Why is that amusing? Because technology isn’t really that different.

Think about it; in Windows, Apple, and Linux, we’ve got all the makings of three religions that can never be at peace with each other.

Each has its fundamental belief system and its theological figureheads, and at least two of them – in Redmond and Cupertino – have a central place to which nerds of the respective tech cult frequently flock.

Most significantly, though, each frequently brings with its theological belief system an intolerance of the others. Each adopts gross generalizations about “how the other two-thirds live”. We’ve all heard it.

When it comes to non-tech theology, I have my own belief system. But you know what? When it comes to religion, politics, or technology, I’m a big believer that Wheaton’s Law still applies. Don’t be a dick to other people just because they do something that doesn’t mirror the choices you make.

Every technology (heck, every belief system) has pros and cons. Many of the pros one side will hold up are viewed by the other side(s) as cons. We don’t all have to agree on what technology is best. But can you imagine where we could get if we all took a step back and observed the world from the perspective of people who aren’t fanbois of our respective belief system (religion, politics, or technology)? I think that could really take us beyond the angry comment-troll realm to a world where we could actually move forward as a species.


27
Mar 13

The Stigma of Mac Shaming

I recall hearing a story about a co-worker at Microsoft who had a Mac. It wouldn’t normally have been a big deal, except that he was a technical assistant working directly for an executive. As a result, this Mac was seen in many meetings across campus – its distinct aluminum body and fruity ghost shining through the lid a constant reminder that this was one less PC sold (even if it ran Windows through Boot Camp or virtualization software). Throughout most of Microsoft, there was a strange culture of “eww, a Mac”. Bring a Mac or an iPod to work, and you felt like an outcast. This was my first exposure to Mac Shaming.

I left Microsoft in 2004, to work at Winternals in Austin (where I had the last PC I ever really loved – a Toshiba Tecra A6). In 2006, on the day Apple announced Boot Camp, I placed an order for a white Intel iMac. This was just over three months before Winternals was acquired by Microsoft (but SHH… I wasn’t supposed to know that yet). This was my first Mac. Ever.

Even though I had worked at Microsoft for over 7 years, and was still writing for Microsoft’s TechNet Magazine as a monthly Contributing Editor, I was frustrated. My main Windows PC at home was an HP Windows XP Media Center PC. Words cannot express my frustration with this PC. It “worked” as I originally received it – but almost every time it was updated, something broke. All I wanted was a computer that worked like an appliance. I was tired of pulling and pushing software and hardware to try to get it to work reliably. I saw Windows Vista on the horizon and… I saw little hope of coming to terms with using Windows much at home. It was a perfect storm – me being extremely underwhelmed with Windows Vista, and the Mac supporting Windows so I could dual-boot into Windows when I needed to in order to write. And so it began.

Writing on the Mac was fine – I used Word, and it worked well enough. Running Windows was fine (I always used VMware Fusion), and eventually I came to terms with most of the quirks of the Mac. I still try to cut and paste with the Ctrl key sometimes, but I’m getting better.

A year later, I flipped from a horrible Windows CE “smartish” phone from HTC to the iPhone, on the day that Apple dropped the price of the original iPhone to $399. Through two startups – one a Windows security startup, the other a Web startup – I used two 15″ MacBook Pros as my primary work computer: first the old stamped MBP, then the early unibody.

For the last two years, I’ve brought an iPad with me to most of the conferences I’ve gone to – even Build 2011, Build 2012, and the SharePoint Conference in 2012. There’s a reason for that. Most PCs can’t get you on a wireless network and keep you connected all day, writing, without needing to plug in (time to plug in, or plugs to use, being a rarity at conferences). Every time I whipped out my iPad and its keyboard stand with the Apple Bluetooth keyboard, people would look at me curiously. But quite often, as I’d look around, I’d see many journalists or analysts in the crowd also using Macs or iPads. The truth is, tons of journalists use Macs. Tons of analysts and journalists who cover Microsoft use Macs – many as their primary device. But there still seems to be this weird ethos that you should use Windows as your primary device if you’re going to talk about Windows. If you’re a journalist and you come to a Microsoft meeting or conference with a Mac, you’re all but guaranteed a bit of an awkward conversation when you bring it out.

I’m intimately familiar with Windows. I know it quite well. Perhaps a little too well. Windows 8 and I? We’re kind of going in different directions right now. I’m not a big fan of touch. I’m a big fan of a kick-ass desktop experience that works with me.

Last week, my ThinkPad died. This was a week after my iMac had suffered the same fate and I had recovered it through Time Machine. Both died of a failed Seagate HDD. I believe there is something deeper going on with the ThinkPad, as it was crashing regularly. While it was running Windows 8, I believe it was the hardware failing, not the operating system, that led to this pain. In general, I had come to terms with Windows 8. Because my ThinkPad wasn’t touch, Windows 8 didn’t work great for me, but it worked alright – though I really wasn’t using the “WinRT side” of Windows 8 at all; I had every app I used daily pinned to the Taskbar instead. Even with the Logitech t650, I struggled with the WinRT side of Windows 8.

So here, let me break this awkward silence. I bought another Mac to use as my primary writing machine: a 13″ Retina MacBook Pro. Shun me. Look down upon me. Shake your head in disbelief. Welcome to Mac shaming. The machine is beautiful, and has a build quality that is really unmatched by any other OEM. A colleague has a new Lenovo Yoga, and I have to admit it is a very interesting machine – likely one of the few out there that I’d really consider – but it’s just not for me. I also need a great keyboard, and the list of Windows 8 slates that compromise on the keyboard in order to be tablets is long. I had contemplated getting a Mac for myself for some time. I still have a Windows 8 slate (the Samsung), and will likely end up virtualizing the workloads I really need in order to evaluate things.

My first impression: as an iPad power user (I use iOS gestures a lot), it’s frighteningly eerie how powerful that makes you on an MBP with Mountain Lion and fullscreen apps. But I’ll talk about that later.

I went through a bit of a dilemma about whether to even post this, due to the backlash I expect. Post your thoughts below. All I request? I invoke Wheaton’s Law at this point.


25
Mar 13

The care and feeding of software

App hoarding. The dark, unspoken secret. We’ve all done it. I logged on to a Windows 8 tablet I hadn’t used for quite some time, and I was so ashamed of myself. So much junk, so many free apps I downloaded, tried, and abandoned. Only recently have I begun steadfastly maintaining a “two screen” limit on iOS to try and keep the applications on my devices solely to those that I use regularly.

This isn’t new, mind you. Enterprises have been doing this for years. Sometimes the “application” is an Excel spreadsheet. Sometimes it’s an old database application, or some other piece of old code, owned by a developer who long since ran from the organization.

For a long time – much like Microsoft and comprehensive security ahead of the Windows security push – customers could turn a blind eye to application proliferation. Like feral rabbits, one application leads to many, and if you don’t manage them or cut them back, they get out of control. Unfortunately, many enterprise applications are born out of short-term necessity, without a great deal of design forethought. Just as unfortunately, in most organizations nobody goes around every year and does an “application census” to figure out which applications are dead, abandoned, unused, or worst of all – insecure or unsecurable.
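
As a thought experiment, here’s a minimal sketch of what the per-machine half of such a census could look like on a Windows box: a few lines of Python reading the standard registry Uninstall keys. It’s illustrative only – a real census would aggregate results across machines and cross-reference them against support lifecycles:

```python
# A minimal per-machine "application census" sketch for Windows,
# reading the standard Uninstall registry keys (64-bit and 32-bit views).
import winreg

UNINSTALL_KEYS = [
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall",
    r"SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall",
]

def installed_applications():
    """Yield (name, version, publisher) for each registered application."""
    for key_path in UNINSTALL_KEYS:
        try:
            root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path)
        except OSError:
            continue  # key may not exist (e.g., 32-bit Windows)
        for i in range(winreg.QueryInfoKey(root)[0]):
            sub = winreg.OpenKey(root, winreg.EnumKey(root, i))
            try:
                name, _ = winreg.QueryValueEx(sub, "DisplayName")
            except OSError:
                continue  # entries without a DisplayName aren't user-visible apps
            version = publisher = ""
            try:
                version, _ = winreg.QueryValueEx(sub, "DisplayVersion")
            except OSError:
                pass
            try:
                publisher, _ = winreg.QueryValueEx(sub, "Publisher")
            except OSError:
                pass
            yield name, version, publisher

if __name__ == "__main__":
    for name, version, publisher in sorted(installed_applications()):
        print(f"{name}\t{version}\t{publisher}")
```

Even a simple list like this, reviewed once a year, would surface a lot of the dead wood I’m talking about.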

A colleague today was telling me about an antivirus application from a major vendor that relies on Java. Terrifying. But that’s nothing. Java is still supported (how well supported is arguable). If your organization is of any significant size, you’ve got applications built around ancient versions of Microsoft Office, SQL Server, or other products and platforms that are likely long past their expiration date. No updates, no patches, nothing. Yet your organization depends on them, and likely has no security mitigation or migration story in play. With the current crop of vulnerabilities we’ve seen recently in Java, Flash, and Acrobat Reader, I’ve been growing increasingly concerned with how dependent so many organizations are upon all three, yet how laissez-faire they seem to be about eliminating, or at least reducing, that dependency.

On a similar note, as someone who helped ship Windows XP, I love how well it has stood the test of time. But it was not engineered for today’s world – from an always-on connection to the Internet to the threat vectors being thrown at it and the software running on it.

It concerns me that so many organizations aren’t cognizant of what software (operating systems, platforms, and applications) is running in their organization. They talk big about the cloud, and about how it’s better that they run the software on their own premises. Yet they’re running old, unpatched software, often with known, never-to-be-patched vulnerabilities, and they have no plan to consolidate applications and remove dead, unsupported operating systems, platforms, and applications. It’s the equivalent of every enterprise having a bunch of storage units full of random crap you keep around because “someone might need it someday”.

Microsoft has been beating a drum about Windows XP – if you look at it closely, it sounds more like a marketing message. But whether you view it that way or not, and whether Windows 7, Windows 8, or something else entirely is in the cards for you, your business has barely one year to get off of Windows XP (April 8, 2014). We’ve heard from some customers that there are custom support options after that time, but they are priced on a per-desktop basis, and the adage “if you have to ask, you probably can’t afford it” appears to apply quite well. Windows XP (officially at death’s door) and Office 2003, still very widely used as well, both pass into the great beyond on that same day.

Whether it is Windows XP, Office 2003, more porous (hard or impossible to patch) platform components, or custom applications on top of them, it’s imperative that organizations start managing and monitoring – and deprecating/discontinuing – applications that rely on dead software to exist. They’re putting your organization at risk. For me, there are two takes to this – cut back the applications you already have, and more importantly, carefully regulate how you build and deploy new ones, with a keen eye on the support lifecycle – and the patchability/supportability – of the OS, runtimes, and applications that you build upon. Applications can seem quick and easy to build on a whim. But like a puppy, or perhaps even more like a parrot, applications aren’t free to build or maintain. They are a long-term commitment.


24
Mar 13

One release away from irrelevance

A few weeks ago on Twitter, I said something about Apple, and someone replied back something akin to, “Apple is only one release away from irrelevance.”

Ah, but you see… we all are. In terms of sustainability, if you believe “we get this version released, and we win”, you lose. Whether you have competitors today, or you have a market that is principally yours, if there is enough opportunity for you, there’s enough appeal for someone else to enter it too.

A book I recently read discussed the first-generation Ford Taurus. Started at the cusp of the 1980s, after a decade of largely mediocre vehicles from Ford, the Taurus (and a handful of other vehicles that arrived near the same time) changed the aesthetic experience we expected from cars. The book’s author notes that Ford had largely stopped using its blue oval insignia during the 1970s, perhaps out of concern that the vehicles didn’t live up to the quality values the blue oval should represent. Thing is, you very clearly get the picture that as the vehicle neared completion, the team “hit the wall”, in marathoning parlance. They shipped, congratulated each other, and moved on to other projects. Rather than turning around and immediately beginning work on the next model to iterate the design and own the market, they stalled out for nearly a decade, only to do the same massive run in order to get the next iteration of the vehicle out the door (documented in yet another book). But I digress.

Many people often ask who Microsoft’s biggest competitor is. It isn’t Oracle. It isn’t startups. It’s Microsoft. Every 2-5 years, Microsoft replaces (and sometimes displaces) its own shipped X-1 products with new versions. If those new versions don’t include enough features and value for customers to feel they are getting their money’s worth, they’ll stall out on older versions. We’ve seen this with Windows, where many businesses – and consumers – have stalled out on a 12-year-old OS because “it’s good enough”, or with Office 2003, because not only is it “good enough”, but the Ribbon (and its half-completed existence in Office 2007) scared away many customers. It’s gotten better in each iteration since – but the key question is always, “is there enough value in there to pull customers forward?”

I believe that the first thing you have to firmly grasp in technology – or really in business as a whole – is that nothing is forever.  You must figure out how to out-innovate yourself, to evolve and grow, even if it means jettisoning or submarining entire product lines – in order to create new ones that can take you forward again. Or disappear.

I’ve been rather surprised when I’ve said this, how defensive some people have gotten. Most people don’t like to ponder their own mortality. They sure don’t like to ponder the mortality of their employer or the platform that they build their business upon. But I think it is imperative that people begin doing exactly that.

There will come a day when we will likely talk about every tech giant of today in the past tense. Many may survive, but they will be shadows (red dwarves, as I said on Twitter last night) of themselves. Look at how many tech giants of the 1970s-1990s are gone today – or are largely services-driven organizations rather than technological innovators.

When that follower said that Apple was only one release away from irrelevance, I replied back with something similar to, “Almost every company is. It’s just a question of whether they realize it and act on it or not.”


23
Mar 13

The death of the pixel

It really didn’t hit me until recently. Something I’ve worked with for years is being forced to retire. Well, not really retire, but at least asked to take a seat in the background.

My daughters love it when I tell them stories about “When I was little…” – the stories always begin with that saying. They usually have a lot to do with technology, and how things have changed over the last 40 years. You know the drill – phones with self-coiling cords that were stuck to the wall, payphones, Disney Read-Along books (records and then tapes), etc. Good times.

Two days ago, I had been working with a Retina MacBook Pro earlier in the day, and then it was time to put my 8-year-old to bed. I told her about the Apple IIe my parents had bought when I was younger – the computer that I used through my first year of college.

My parents had even opted for the 80-column text card, but as I look back now, the things that stick out in my mind were using The Print Shop to create horribly pixelated banners and signs, and using AppleWorks to create documents – all the way through that first year of college. I told her all about the tiny, block-like dots that made up everything on the screen, and everything that we printed.

The pixel was an essential part of technology then. We were on the other end of the spectrum from today; that is, “how many pixels do you need to make something look kind of like the letter ‘o’?” I have to look back now and laugh a bit, because – while it was amazing to have computers at all – this early era of Apples and PCs is laughable from a user experience perspective. Like cars with tillers and no windscreen, these were good enough to work, for the time being.

With my iPhones, I’ve appreciated how amazing the pixel-dense “Retina” displays are. In particular, reading text is incredibly pleasant, as you can often forget you’re reading off of pixelated glass. But whether you’re consuming or creating content on that size of screen, it’s hard to get “immersed” in it.

Only as I used that Retina MacBook (a 13″) did I really realize how far we’ve come. Now it isn’t “how many pixels do you need to make it look like an ‘o’?”, it’s “how small do the pixels need to be so that you can’t see the pixels in the ‘o’?” Instead of looking like a bunch of dots creating the illusion of a letter on the screen, it’s the feeling of ink and a magical typewriter that delivers a WYSIWYG experience with digital ink on digital paper. Truly amazing.
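
To put rough numbers on that shift, here’s a quick back-of-the-envelope calculation in Python. The resolutions and diagonals below are my own approximate figures, for illustration only:

```python
# Rough pixel-density comparison across a few eras of screens I've used.
# Figures are approximate and purely illustrative.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a display of the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

displays = {
    "Apple II hi-res on a ~13\" monitor": (280, 192, 13.0),
    "Original iPhone (3.5\")":            (320, 480, 3.5),
    "iPhone 4 'Retina' (3.5\")":          (640, 960, 3.5),
    "13\" Retina MacBook Pro (13.3\")":   (2560, 1600, 13.3),
}

for name, (w, h, d) in displays.items():
    print(f"{name}: ~{ppi(w, h, d):.0f} ppi")
```

That works out to something like 26 pixels per inch then versus roughly 227 now – close to a tenfold jump in linear density, which is why the dots simply vanish.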


22
Mar 13

You’re only as safe as your last backup

This week, for the second time in a year, I lost the hard drive in my main computer, a 2010 ThinkPad W510 running Windows 8. I swear I was good to the computer – I don’t know why this second Seagate 500GB drive (yes, the first one was too!) decided to hit the floor. I’ve had so many hardware problems with this system – BSODs, weird display problems, and more, over the last year, that rather than try to jam it back together for one more gig with the band, I am putting my ThinkPad out to pasture, and have replaced it.

I’ll tell you what – when you have an HDD fail, Twitter is all aflutter with people offering posthumous advice on what you could have done to avoid data loss. SkyDrive, CrashPlan, Dropbox, Windows 8 backup utilities… Like free advice, everybody had wisdom to offer… Unfortunately, it was too late. The damage was done. While I didn’t lose the latest draft of my book (THANKS, SkyDrive!!!), I did lose an article draft I had been working on for some time. I’m not happy about that. Here’s how it happened.

On Wednesday morning, the date of my PC’s demise, I got up early, as I often have to do, to take my eldest to ice skating before school. The day before, I had checked out a key work file from our work file server (classic SMB Windows server file share, not SharePoint). Failure 1: I skipped a step, and pulled it locally, instead of archiving it to the server and making a copy. Our process is arcane and complex at times, but it works. The document was a rather complex outline for a lengthy piece around SharePoint Search.

While I was working at the skating rink, I wrote a good 1,000 words, getting to more than half of the article. Failure 2: I was working with the file on my desktop, not in my SkyDrive folder. Failure 3: I wasn’t on the Internet while I was at the skating rink – they have no free WiFi available. As I wrote the piece, I noticed that my system was behaving really erratically. Apps were hanging and whitescreening, only to eventually come back. Running Process Explorer, I couldn’t see anybody pegging the CPU, so I couldn’t find an obvious culprit to blame. Looking back, the warning signs of impending HDD failure were all there. I had a bunch of USB Flash Drives (UFDs) with me, so I could have, and should have, copied the file off. At the moment, I’m so terrified of HDD data loss that I’m saving things into synchronized folders all over the place, and backing up everything to everywhere.

When my daughter was done skating, we headed home, and my wife took her and her sister to school as I headed to the office. I logged on, and my computer failed to resume – it was hibernated, and tried starting – only to BSOD. After the BSOD, it just hung at the Windows 8 whirligig on the boot screen. Once put in any other machine, the drive simply clicks away, and fails to mount. Dead.

Fortunately, I had been using Windows 8’s File History to back up my files. Failure 4: Because I was using it with an external USB HDD, I was inconsistent about backing it up, and hadn’t done so in a week. Meaning my outline file was dead. Gone. MIA.

I have to look back at my criticism of Windows To Go and even renew it a bit. Creating content on the go, unless you have WiFi or 3G/4G connectivity back to SharePoint, SkyDrive, Dropbox, etc., is an invitation to lose work the way I did.

I often say that if you make a user opt in to a process, they never will. My new backup mechanism involves technologies that all happen in the background, automatically, and that don’t let me opt out, as I had done with Windows 8’s File History. Though nothing short of copying the file off before the HDD died on Wednesday could have saved that morning’s draft, I would at least have had the outline if I had backed it up more recently. But through a series of lazy step-skipping on my part, I hosed myself. I am disappoint.
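
For the curious, here’s a minimal sketch of that “no opt-in” idea: a tiny watcher that mirrors a working folder into a cloud-synced folder every 30 seconds, so there’s no step for me to lazily skip. The paths are hypothetical, and in practice you’d lean on proper sync and backup tools – this just illustrates the principle:

```python
# A minimal sketch of an always-on mirror: anything saved under WORK_DIR
# gets copied into a folder a sync client (SkyDrive, Dropbox, etc.) watches.
# Paths are hypothetical; run it at login and forget about it.
import os
import shutil
import time

WORK_DIR = os.path.expanduser(r"~\Documents\Drafts")        # where I actually write
MIRROR_DIR = os.path.expanduser(r"~\SkyDrive\DraftsMirror") # watched by the sync client

def mirror_changed_files(src_root, dst_root):
    """Copy any file that is newer in src_root than its counterpart in dst_root."""
    for dirpath, _dirnames, filenames in os.walk(src_root):
        rel = os.path.relpath(dirpath, src_root)
        target_dir = os.path.join(dst_root, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(target_dir, name)
            if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
                shutil.copy2(src, dst)  # copy2 preserves timestamps

if __name__ == "__main__":
    while True:
        mirror_changed_files(WORK_DIR, MIRROR_DIR)
        time.sleep(30)  # crude polling, but it's invisible to me - which is the point
```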

Given that I’ve had three HDDs die on me over the last year, and have lost at least a bit of data in every failure except my iMac’s (thanks to Time Machine), I still ponder why modern operating systems seem to have such inadequate or ineffective means of telling the user that their drive is failing and about to die.
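
The maddening part is that drives do report SMART health data; the OS just doesn’t surface it loudly. As an illustration – assuming you’ve installed the smartmontools package, since nothing like this ships in the box, and that your drive answers to a name like the one below – a few lines of Python can do the nagging:

```python
# A minimal sketch: ask smartctl (from smartmontools) for the drive's
# overall SMART health and complain loudly if it isn't PASSED.
# The device name is an assumption; adjust it for your system.
import subprocess
import sys

DEVICE = "/dev/sda"  # hypothetical; smartmontools on Windows accepts similar names

def smart_health(device):
    """Return the combined output of 'smartctl -H' for the device."""
    result = subprocess.run(
        ["smartctl", "-H", device],
        capture_output=True, text=True, check=False,
    )
    return result.stdout + result.stderr

if __name__ == "__main__":
    output = smart_health(DEVICE)
    if "PASSED" in output:
        print(f"{DEVICE}: SMART overall health: PASSED (no guarantee, as I learned)")
    else:
        print(f"{DEVICE}: SMART health check did NOT pass - back up NOW", file=sys.stderr)
```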


21
Mar 13

What’s your definition of Minimum Viable Product?

At lunch the other day, a friend and I were discussing the buzzword bingo of “development methodologies” (everybody’s got one).

In particular, we homed in on Minimum Viable Product (MVP) as an all-but-gibberish term, because it means something different to everyone.

How can you possibly define what is an MVP, when each one of us approaches MVP with predisposed biases of what is viable or not? One man’s MVP is another’s nightmare. Let me explain.

For Amazon, the original Kindle, with its flickering page turn, was an MVP. Amazon, famous for shipping… “cost-centric” products and services, was traditionally willing to leave some sharp edges in the product. For the Kindle, this meant flickering page turns were okay. It meant that Amazon Web Services (AWS) didn’t need a great portal or useful management tools. Until their hand was forced on all three by competitors. Amazon’s MVP includes all the features they believe it needs, whether or not they’re fully baked or usable, or whether the product still has metaphoric splinters coming off from where the saw blade of feature decisions cut it. This often works because Amazon’s core customer segment, like Walmart’s, tends to be value-driven rather than user-experience driven.

For Google, MVP means shipping minimal products that they either call “Beta” or that behave like a beta, tuning them, and re-releasing them. In many ways, this model works, as long as customers are realistic about what features they actually use. For Google Apps, this means applications that behave largely like Microsoft Office but include only a fraction of the functionality (enough to meet the needs of a broad category of users). Google traditionally pushed these products out early in order to evolve them over time. I believe that if any company of the three I mention here actually implements MVP as it is commonly understood, it is Google. Release, innovate, repeat. Google will sometimes put out products just to try them, and cull them later if the direction was wrong. If you’re careful about how often you do this, that’s fine. If you’re constantly tuning by turning off services that some segment of your customers depend on, it can cost you serious customer goodwill, as we recently saw with Google Reader (though I doubt that event will really harm Google in the long run). It has been interesting for me to watch Google build their own Nexus phones, where MVP obviously can’t work the same way. You can innovate hardware Release over Release (RoR), but you can’t ever improve a bad hardware compromise after the fact – you can only retouch the software inside. Google has learned this. I think Amazon learned it after the original Kindle, but even the Fire HD was marred a bit by hardware design choices, like a power button that made it too easy to turn the device off while reading. But Amazon is learning.

For Apple, I believe MVP means shipping products that make conscious choices about which features are even there. With the original iPhone, Apple was given grief because it wasn’t 3G (only to be berated years later because the 3GS, 4, and 4S continued to be just 3G). Apple doesn’t include NFC. They don’t have hardware or software to let you “bump” phones. They only recently added any sort of “wallet” functionality… The list goes on and on. Armchair pundits berate Apple because they are “late” (in the pundits’ eyes) with technology that others like Samsung have been trying to mainstream for one to three hardware/software cycles. Sometimes they are late. But sometimes they’re “on time”. When you look at something like 3G or 4G, it is critical that you get it working with all of the carriers you want to support, and all of their networks. If you don’t, users get ticked because the device doesn’t “just work”. During Windows XP, that was a core mantra of Jim Allchin’s – “It just works”. I have to believe that internally, Apple often follows the same mantra. So things like NFC or QR codes (now seemingly dying) – which, as much fun nerd porn as they are, aren’t consumer-usable or viable everywhere yet – aren’t in Apple’s hardware. To Apple, part of the M in MVP seems to be the hardware itself – include only the hardware that is absolutely necessary, nothing more – and unless the scenario can work ubiquitously, it gets shelved for a future derivation of the device. The software works similarly: Apple has been withholding some software (Messages, for example) from legacy OS X versions, enabling it only on the new version. Including new hardware and software only as the scenarios are perfected, and only in new devices or software, rather than throwing it in early and improving on it later, can in many ways be seen as a forcing function to encourage movement to a new device (as Siri was with the 4S).

I’ve seen lots of geeks complain that Apple is stalling out. They look at Apple TV, where Apple doesn’t have voice, doesn’t have an app ecosystem, doesn’t have this or that… Many people complain that they’re too slow. I believe quite the opposite: Apple, rather than falling for the “spaghetti on the wall” feature matrix we’ve seen Samsung fall for (just look at the Galaxy S4 and the features it touts), takes time – perhaps too much time, according to some people – to assess the direction of the market. Apple knows the whole board they are playing, while competitors don’t. To paraphrase Wayne Gretzky, they “skate to where the puck is going to be, not where it has been.” Most competitors seem more than happy to try to “out-feature” Apple with new devices, even when those features aren’t very usable or very functional in the real world. I think they’re losing sight of what their goal should be – building great experiences for their users – and instead believe their brass ring is “more features than Apple”. This results in a nerd porn arms race, adding features that aren’t ready for prime time, or that are usable by only a small percentage of users.

Looking back at the Amazon example I gave early on, I want you to think about something. That flicker on page turn… Would Apple have ever shipped that? Would Google? Would you?

I think that developing an MVP of hardware or software (or generally both, today) is quite complex, and requires the team making the decision to have a holistic view about what is most important to the entire team, to the customer, and to the long-term success of your product line and your company – features, quality, or date. What is viable to you? What’s the bare minimum? What would you rather leave on the cutting room floor? Finesse, finish, or features?

Given the choice, would you rather have a device with some rough edges but lots of value (it’s “cheap”, in many senses of the word)? A device that leads the market technically, but may not be completely finished either? A device that feels “old” to technophiles, but is usable by technophobes?

What does MVP mean to you?


19
Mar 13

Bill Hill and Homo Sapiens 2.0

I was working on another blog post and ran across an interview with Bill Hill from 2009. Bill reinvented himself many times in his career, from newspaperman to someone who fundamentally worked to change the way the world reads text on a digital screen. It harkens back to yesterday’s post, as well as to my post on the machines coming for your job. Specifically, at about 19 minutes in, this conversation comes up:

Interviewer: “In this economy…What’s the relationship between fear…and taking chances…?”

Bill Hill: “Well that’s just the whole point. I mean, it’s very easy to get kinda cozy, and do ordinary stuff.” and “You can’t allow yourself to get paralyzed.”

Bill never stopped moving, never stopped reinventing himself. Weeks before he passed, he and I had a conversation about eBooks – almost 13 years after I first met him, talking about that same subject. You can’t stop moving, and you can’t stop reinventing yourself.