05
Jan 17

Kaby Lake Haters…

There has been much written over the past year about Intel and the arrival of the end of Moore’s law – at least as we knew it.

Earlier today, a friend sent me a link to an Ars Technica piece discussing Kaby Lake, and what a letdown it was in terms of desktop CPU performance momentum.

I’m going to let you in on a little secret. The desktop CPU is dead. Don’t tell your friends who are big desktop gamers… they’ll never forgive you for crushing their dreams. But it is true. Gaming and VR will surely continue to have a place in the PC realm. But these aren’t mainstream scenarios – they aren’t the things that the broadest section of normals seek to have their personal computing devices do for them at home. They want a (personal) device that they can use for Web browsing, email, video, and perhaps productivity and some casual gaming. This device is most likely not plugged in all that often, and lives… around the house, not tethered to a dusty desk.

When I started working with Windows 25 years ago, it was an ongoing battle of wits, where Windows was constantly pushing the boundaries of what software asked from hardware. It practically felt like it was Microsoft’s goal… or… responsibility? to keep pushing the CPU requirements. This helped drive a virtuous cycle where Windows demanded a new computer, which demanded a new version of Windows, etc. I’m particularly inclined to recall a Christmas with my ex-wife’s aunt and uncle when we were newlyweds in 1996, when their big gift to the whole family was a new Gateway PC.

The irony of the PC was that, for a long time it wasn’t really a PC (Personal Computer). It was a Family Computer. That’s what my family had when we owned a //e. That’s what my ex-wife’s cousins got that Christmas. PCs were so expensive, they were a family purchase every few years, for the family to share.

Most of my generation may remember a single family phone line, which splintered into call waiting, multiple lines, and finally personal cell phones and the (near) death of the communal home phone line. The PC has splintered the same way: the arrival of cheap PCs, tablets, and most importantly, very full-featured smartphones has, for many people, replaced the desk-dependent PC. With the 15th anniversary of the release of Windows XP just behind us, I recall the Fast User Switching feature it delivered, and how it was, in a way, a nod to the future, where the devices around us would be truly personal – in terms of ownership, how they’re chosen/tweaked/replaced, and (sigh) managed or not.

I can’t speak explicitly to why Intel chose to insert Kaby Lake into their release cycle, much like I can’t speak to why Apple chose to release a MacBook Pro in 2016 that was based on Skylake (the prior year’s chip). I can’t speak to either one because I wasn’t included in any of the design meetings where the decisions behind both were made.

But I can pull out my handy-dandy Jump to Conclusions mat, and suggest the reasons why both of these things happened.

The market is changing underneath Intel, and underneath the Mac. I firmly believe that Apple invests in the Mac proportionally to the revenue the Mac returns – and if Apple chose to assign additional millions of dollars/year in Mac engineering R&D, it would not result in sales growth that reflects the investment. The scenario appears somewhat similar for the iPad lineup. However, Apple invests significantly in the iPhone, where R&D in equates to sales out. The device we’ll likely see in 2017 for the 10th anniversary of the iPhone will most likely reflect this. The Mac lineup, on the other hand, will be a bit of a cobbler’s child… getting hand-me-down technology.

Apple is building the device that addresses their mass market. But wait, why am I talking about Apple so much, in a blog post about Intel?

Intel is facing the real complexities of the end of Moore’s Law as we know it. There’s a finite end to how far you can take Intel’s “tick-tock” cycle to shrink CPUs using today’s technology.

More importantly, chip performance or “horsepower” isn’t the main thing most consumers shop for anymore. This is akin to people today who still love massive V-8 engines when most manufacturers are working on squeezing performance out of turbocharged V-6s or even inline-fours in order to achieve better fuel efficiency (and, more importantly for their federal compliance, fleet-wide efficiency). Consumers want battery life, quiet, and minimal heat. These are things that do not equate to monster innovations in raw CPU performance.

The move towards electric automobiles is of course another great analogy to take forward here – with ARM taking the role of electrics, and Intel being conventional fuels. Intel is working diligently to do performance limbo, and take their x64 architecture as low as it can go, to deliver great battery life while delivering good performance and great video playback. (See the earlier Ars Technica reference and the mention of 4K video in Kaby Lake.) Here Intel is focused on building distinct innovations into CPU releases that directly benefit the broadest section of the market, not just people looking for raw performance gains YoY.

In the consumer “PC” market (accepting the squishy, ever-evolving definitions of “tablet” and “PC”), any chance for Intel’s broader market strength will come from trying to compete with rapidly increasing performance from ARM chips. (Witness the Snapdragon 835 chip, which Microsoft plans to run x86 legacy Windows [Win32] applications on. We’ll see in time how that works out.)

For people who care more about speed and raw performance from their processors, the future isn’t likely full of roses. It’s likely to look more and more disappointing, much like the future for “motor heads” is. They’ll have to shop at the highest end of the market, or – long shot – hope for another vendor to address it with (likely expensive, niche) products.


08
Dec 16

Windows 10 on ARM. What does it mean?

Yesterday, when I heard the news from Microsoft’s WinHEC announcements stating, “Windows 10 is coming to ARM through a partnership with Qualcomm”, my brain went through a set of loops, trying to get what this really was, and what it really meant.

Sure, most of us have seen the leaks over the past few weeks about x86 on ARM, but I hadn’t seen enough to find much signal in the noise as to what this was.

But now that I’ve thought about it, most of it makes sense, and if we view the holistic Windows 10 brand as step 1, this is step 2 of blurring the line of what a Windows PC is.

Before we look forward, a bit of history is important. Windows RT was a complex equation to try and reduce – that is, why did it fail? The hardware was expensive, it wasn’t <ahem/> real Windows, it couldn’t run legacy applications at all, and the value proposition and branding were very confusing. Wait. Was I talking about Windows RT, or Windows on Itanium? Hah. Tricked you – it applies to both of them. But let’s let sleeping dogs be.

So if the lack of support for Windows legacy applications is a problem, and ARM processors are getting faster, how to best address this? Windows 10, the last version of Windows. Now available in a complex amalgam that will be ARM64 native, but run Win32 x86 applications through emulation.

Let’s take a look at a couple of things here, in terms of Q&A. I have received no briefing from Microsoft on this technology – I’m going to make some suppositions here.

Question 1: What is meant by x86 Win32 applications? Everything? How about 64-bit Win32 applications?

This is actually pretty straightforward. It is, as the name would imply, x86 Win32 applications. That means the majority of the legacy applications written during the lifetime of Windows (those capable of running on 32-bit Windows 10 on x86) should work when running on 64-bit Windows 10 on ARM. In general, unless there are some hardware shenanigans performed by the software, I assume that most applications will work. In many ways, I see this emulation behaving sort of like WOW64 – the layer that runs 32-bit Win32 applications on x64 (AMD64) Windows today – albeit with very different internals.
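To make that WOW64 comparison a bit more concrete: on x64 Windows today, a 32-bit process can ask the OS whether it is running inside the 32-bit-on-64-bit layer, and otherwise mostly doesn’t have to know or care. Here is a minimal sketch of that check, calling the long-standing IsWow64Process API from Python’s ctypes – purely my illustration of the analogy, not anything to do with how the x86-on-ARM layer is actually built:

  # Ask Windows whether this process is a 32-bit process running on 64-bit
  # Windows (i.e., inside WOW64). Run a 32-bit Python on x64 Windows to see True.
  import ctypes

  def running_under_wow64() -> bool:
      kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
      kernel32.GetCurrentProcess.restype = ctypes.c_void_p
      kernel32.IsWow64Process.argtypes = [ctypes.c_void_p, ctypes.POINTER(ctypes.c_int)]
      kernel32.IsWow64Process.restype = ctypes.c_int
      is_wow64 = ctypes.c_int(0)
      ok = kernel32.IsWow64Process(kernel32.GetCurrentProcess(), ctypes.byref(is_wow64))
      if not ok:
          raise ctypes.WinError(ctypes.get_last_error())
      return bool(is_wow64.value)

  if __name__ == "__main__":
      print("Running under WOW64:", running_under_wow64())

The application itself just runs; the OS quietly redirects it to the 32-bit system DLLs and registry views. Presumably the emulation layer on ARM will aim for that same kind of transparency.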

Question 2: Ah, so this is virtualization?

No, this is emulation. You’re tricking x86 Win32 applications into thinking they’re running on a (low-powered) x86 processor.

Question 3: Why only 32-bit?

See a few of the next answers for a crucial piece of this answer, but in short, to save space. You could arguably have it add support for Win64 (x64, 64-bit) Windows desktop applications, but this would mean additional bloat for the operating system, and offer rapidly diminishing returns. You’re asking a low-powered ARM processor to really run 64-bit applications and make the most of them? No. Get an x64 processor and don’t waste your money.

Question 4: What is the intent here?

As I said on Twitter this morning, “This is not the future of personal computing. This is a nod to the past.” I have written far more words than justified on why Windows on ARM faced challenges. This is, in many ways, the much-needed feature to make it succeed. However, this feature is also a subtle admission of… the need for more. In order to drive Windows the platform forward on ARM, and help birth the forthcoming generations of UWP-optimal systems, there is a need to temper that future with the reality of the past – that businesses and consumers have an utterly wacky amount of time and money involved in legacy Windows desktop applications, and… something something, cold, dead hands. Thus, we will now see the beginning of x86 support on these ARM processors, and a unified brand of Windows that addresses “How do I get this?” For consumers, it will mean a lack of confusion. Buy this PC, and it will be a great tablet when you want a tablet, but it will also run all of that old stuff.

Question 5: Why not just use Project Centennial, and recompile these old desktop apps for UWP?

First, for this to succeed, it must be point-and-shoot. No repackaging. No certificate games. No weird PowerShell scripts. No recompilation. Take my ancient printer driver, and it just works. Take my old copy of MS Money that I shouldn’t be using. It just works. Etc. We’re talking old apps that should be out to pasture. On the consumer side, there is no source code to go back to, and no ISV in their right mind will spend time going back and doing the work to support something like this. On the business side, there’s likely nobody around who understands the code or wants to break it. Centennial is a great idea if you are an ISV or enterprise and you want to take your existing Win32 app and begin transmogrifying it into a UWP application through the non-trivial steps needed. But it’s certainly not always the best answer, and doesn’t do the same thing this will.

Question 6: Wait. So won’t I be able to get ransomware too, then?

I would have to assume the answer to that is… yes. However, it is important to note that Terry showed off Windows 10 Enterprise edition in yesterday’s demo. Why does that matter? Because there you have the option to use Device Guard to lock down the device, on these PCs that will ship with OEM Windows. That is one step, for orgs willing to pay for Enterprise. I also assume that there will be the option to turn off the Win32 layer through configuration and GPO.

Question 7: So this is like Virtual PC on PowerPC Macs?

Not exactly. That’s a fine example of emulation, but that was Windows stacked on top of the Mac OS. This looks to be, as it should be, a more side-by-side emulation. Run a UWP app, and all of your resources are running on the ARM side natively.  Run a legacy app, and all your resources are running on the x86 side. Again, the experience should be much like running 32-bit applications on 64-bit Windows, without directory tricks to do it. That’s certainly what I saw in Terry’s demo. Importantly, this means a couple of things. First, you service the whole thing together. This isn’t a VM, and doesn’t require additional steps to service it. Second, where Terry mentions “CHPE = Compiled Hybrid Portable Executable” here, unless I’m misunderstanding, he’s saying that Windows 10 on ARM is basically running fat binaries. It’s two, two, two OS’s in one.

Question 8: Wait. What does that mean?

Well, if I’m understanding their direction correctly, the build includes resources for A64 and x86 in one binary. Meaning that you only need to service one binary to service both… modes? of the OS. Significantly, this also means some on-disk bloat. You’re going to need to have more space for this to work, as you’ve basically got two installs of the OS glued together. This is also why you don’t have x64 support. Because if my theory above holds, adding Win64 would… do amazing things to your remaining disk space.
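If you want to see that seam yourself, every Windows binary declares its target architecture in the Machine field of its PE (Portable Executable) header. Here is a small sketch that reads that field – the offsets and constants come from the published PE/COFF format, and actual CHPE binaries are more exotic than this simple check captures, so treat it as illustration only:

  # Read the Machine field from a PE file's COFF header to see which
  # architecture a Windows binary targets. Per the PE/COFF format, e_lfanew
  # lives at offset 0x3C of the DOS header and points at the "PE\0\0"
  # signature, which the COFF header (Machine field first) follows.
  import struct
  import sys

  MACHINE_NAMES = {
      0x014C: "x86 (IMAGE_FILE_MACHINE_I386)",
      0x01C4: "ARM Thumb-2 (IMAGE_FILE_MACHINE_ARMNT)",
      0x8664: "x64 (IMAGE_FILE_MACHINE_AMD64)",
      0xAA64: "ARM64 (IMAGE_FILE_MACHINE_ARM64)",
  }

  def pe_machine(path: str) -> str:
      with open(path, "rb") as f:
          dos_header = f.read(0x40)
          if len(dos_header) < 0x40 or dos_header[:2] != b"MZ":
              return "not a PE file"
          (e_lfanew,) = struct.unpack_from("<I", dos_header, 0x3C)
          f.seek(e_lfanew)
          if f.read(4) != b"PE\x00\x00":
              return "not a PE file"
          (machine,) = struct.unpack("<H", f.read(2))
      return MACHINE_NAMES.get(machine, hex(machine))

  if __name__ == "__main__":
      for exe in sys.argv[1:]:
          print(exe, "->", pe_machine(exe))

Point it at binaries in System32 versus SysWOW64 on an x64 machine today and you can see the same kind of split that a Windows 10 on ARM image would presumably be carrying around for its x86 guests.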

Question 9: Ah, so UWP is dead?

Heck no. If anything, as I said earlier, this helps UWP in the long run, by reestablishing what Windows is. UWP is still what developers must target if they care about selling anything new, designing for touch, or reaching the collection of devices that Microsoft is driving UWP forward on. I also can envision that this functionality only works when a device is Continuum’d. That is, when you’re docked and ready to work at your desk. This is all about legacy, and your desktop.

Question 10: Ah, so Intel processors are dead?

LOLNO. This is an ARM processor running x86 software. No x64 support. Performance may wind up being fair, but an ARM system will hardly be your destination if you want to do hardcore gaming, data work, development, run VMs… and then there’s the server side, where ARM still has a huge uphill battle ahead of it. This will fill a hole for consumers and low-mid tier knowledge workers. If you cared that the new MBP didn’t have more than 16GB of RAM, well… I digress.

Question 11: Ah, so Windows Mobile is dead?

No. At least not yet. Windows Mobile won’t include this layer, which will likely mean that it also won’t require the storage space. In the long run, a Windows-based ARM64 phone could indeed run Windows 10, and finally blur the line as to what is a Windows phone and what is a Windows PC – and also make Continuum incredibly useful.

 


30
Nov 16

Tired Mac prose

Over the last several weeks, a Skylake full of ink has been spilled over this fall’s Apple crop. Actually, the press seems fascinated with three distinct topics:

  1. Insufficient magic in the 2016 MacBook Pros
  2. Apple “sticking it to pros” by offering limited RAM in the MBP
  3. Apple “sticking it to pros” by not updating the Mac Pro desktop since 2013.

Issue number 1: Beginning the day after the announcement, I had non-technical friends asking me, “what’s the deal with poor, old, beleaguered Apple?”

Okay, I’m exaggerating. That’s not really what they asked. But they were underwhelmed. Tell you what? I was too. I’m not sure what I was expecting, but I was expecting a bit more. The Touch Bar is interesting, but hardly world-changing. The presence of Touch ID is also interesting, and frankly, more relevant, especially for business users of Macs. (Dare I say it, “Mac-using pros”.) But most relevant, IMHO, is the fact that it is thinner and lighter (both also useful to pros who remove it from their desks). The move to USB-C is perhaps annoying today, but in time, will not be a big deal, and potentially very useful in terms of Thunderbolt 3 extensibility.

So is it earth-shattering? No. But it’ll do just fine at filling the backlog of orders that came after Apple had let the MacBook Pro lay dormant for a good long time.

Issue number 2: Apple only provides up to 16GB of RAM (and they didn’t go full Kaby Lake). Last thing first, it’s just late 2016. Nobody goes full Kaby Lake. But to suggest that Apple missed the boat by skipping a secondary tock is wacky. Apple rarely takes a bullet for the industry. We’ll see Kaby Lake and beyond come to the MBP. But it makes no sense to rush it this year.

Now we come to the real meat of the outrage. There’s this fascination – dare I suggest it is a feedback loop – with the idea that Apple completely doomed the MBP by not enabling more than 16GB of RAM in any of the new devices; that these devices are (paraphrasing) “unsuitable for pros”.

Please.

I offer you a challenge. Using Google, or any tool you’d like, find links to the following three things:

  1. The US$15 burger on the McDonald’s menu
  2. The Tesla convertible
  3. The page on Microsoft’s site where I can build or configure a Surface Pro 4 or Surface Book with more than 16GB of RAM.

Too tongue in cheek? Seriously though… The first two would exist if there were a large enough market for them. The third would as well, although Microsoft most likely chose to cap it at 16GB for many of the same reasons that Apple did (spoiler alert: it was a compromise between what most users need in terms of RAM and battery life). You’ll note that every Mac you can plug in, short of the [somewhat] budget-conscious Mac Mini, does offer options for configuring more than 16GB of RAM, if that is what a user needs.

I’m admittedly on the low end of the “pro” user market these days. I couldn’t readily make my living doing what I do without a Windows PC or a Mac. But I don’t ever run an IDE. Like a band saw, that is a tool I’m not qualified to use, and it’s in nobody’s best interest that I do. I’ve also been a firm believer, for about 4 years now, in not virtualizing diddly on my Mac. I cut my teeth on the Mac running VMware Fusion from the beginning. Frankly, licensing (Windows-based) stuff to run in VMs on a Mac is a hot mess that your org should be very careful about doing. But that’s not why I don’t do it. I don’t do it because it’s a hot mess of RAM and storage requirements, in an era when both are more limited than in the desktop-class laptops of the past. For my needs, I’m better served by buying a laptop that focuses on being a kick-ass laptop (minimal CPU, the RAM and SSD I really need, and a battery that lasts for a delightfully long time), and running VMs in Azure, AWS, or on a desktop (or a more desktop-like “laptop” that would probably burn my crotch if I really used it on my lap). I’m not convinced that the top-tier MBP that Apple created can’t meet the needs of many (most?) of those who truly need a laptop to do their work.

I feel like a lot of the issue here can be summed up by a tweet of mine from 2014…

[embedded tweet]

There is a number, greater than 0, of business Mac users who truly need a laptop with more than 16GB of RAM, and would pay what it costs for Apple to build in the technology+battery needed to make it happen. I believe that if Apple saw that that number was significant enough, they would build it. That’s what they do. They built an oversized iPhone, when we all said they wouldn’t. They offered a stylus for the iPad, even though that would mean they blew it. If a market that is willing to pay a premium exists, Apple will build a thing to address it. (This also likely describes why Apple is letting their displays go fallow, and perhaps will even let them die completely. We will see if, perhaps, new displays arrive the next time we see an iMac refresh, likely in 2017.) But I honestly would love to see more detailed scenario descriptions where people need more than 16GB in a laptop day-to-day, where having a secondary desktop or using cloud-based virtualization wouldn’t meet or exceed their needs instead – especially in cases where people aren’t willing to pay the premium Apple would need to charge for an MBP that could meet those needs. Thoughts on that? Blog on what you do, what you need, and why those wouldn’t work, and post a link to my Twitter.

Issue number 3: Finally, we come to the Mac Pro and the question of whether it shows any signs of life. It has been almost 1,100 days since the last update to the Mac Pro, a high-end desktop Mac that is, significantly, a) really expensive, and b) uniquely, assembled in Austin, TX.

Absent a headline from any recent year stating “New Mac Pro announced”, many have marked the line for death. Why shouldn’t they? Apple used to make servers. They don’t anymore. Apple used to make wireless routers. They don’t anymore. Apple used to make displays. Whoops, my bad. We’ll see in 2017, but that might be the case as well. I don’t know the stats on how many Mac Pros Apple sells annually, or what the ASP of those units is. It could potentially be a reasonably large chunk of cash, but even with the price of the units, it is most likely a pittance compared to what Apple makes on iPhones, or even on the rest of the mobile Mac+iMac lines. And for better or worse, Apple’s focus, as with most companies of late, has been on shareholder value/returns. Apple gives what it gets. As with a “MacBook Pro Plus” (or whatever the ultimate nerd-spec MBP would be branded), the cost of addressing this market in a timely manner doesn’t likely mesh with the more aggressive research spend needed to deliver it more rapidly than the cadence we’re seeing.

So will we see a new Mac Pro anytime soon? Perhaps.

But there seems to be a fair number of rumors saying Apple would rather build a tier of iMac that could address some (but not all) of the scenarios the Mac Pro serves, instead of building a new top-shelf desktop PC. Because we are at a reasonable plateau of display technology – of sorts – I could reasonably set aside my distaste for AIOs and say maybe that isn’t a horrible idea. Is it ideal? Not really. The Mac Pro is minimally extensible, and a (27″, Kaby Lake) iMac that addressed its space would need to rely completely upon external extensibility (not that the current Mac Pro doesn’t, outside of RAM or SSD). Any iMac would also be seriously challenged to match the caliber of GPU or CPU power possible with the Mac Pro. The replacement, or suggested replacement, for the Mac Pro is, IMHO, very likely to arrive in 2017.

In terms of both the MBP and Mac Pro, I think Apple will try their best to continue to address the high-end pro market as best they can. But time will tell. In the end, some pros may find Apple’s innovations discouraging. Some will possibly switch to Windows-based PCs, but short of building their own PC, I think many will find the PC OEMs driving towards similar modularity and cost reductions, and feel constrained – if less so – when buying a PC of any kind.

There’s a whole other topic to discuss another day, which is the rumbling “Microsoft has stolen the creative mantle from Apple” theory. More on that later…


27
Nov 16

Goodbye, Twitter

Almost exactly three years ago, I decided to kill my Facebook account. Not log off. Delete it. It’s been gone since then, and honestly, I never miss it.

When I signed on to Twitter for the first time in May of 2008, I had no idea what I would do with it. The running joke at the time was that Twitter was primarily used to let others know you were going to/were in/were back from, the bathroom. Colleagues at a startup in Austin even registered a domain name (whospoopin.com) as a joke, in the hopes of creating a “competitor” to Twitter.

During the last 8 and a half years, I’ve used Twitter to make new friends, find old ones, and make connections that sometimes even translate from binary into analog, meeting Twitter connections for the first time. As I once said on Twitter, when I was young, I didn’t get the point of pen pals, but now with Twitter, I did, and had pen pals around the world.

I’m very disappointed with the state of the world at the moment, and I find that Twitter lately only adds despair, rage, or both to my mood.

In my life, I try to be mindful of whether tools work for me or not, and discard them if they cost me more than they benefit me. Recently, I’ve realized that Twitter’s return on my investment has greatly diminished, and I’m putting far more into it than I get back from it. Sure, some of this has to do with this insane election. But it is not just that. Pondering what Twitter means to me has helped me highlight a desire to focus inward on my own personal priorities – my health, my weight, my job, reading and learning, and my other personal interests – ahead of the seemingly empty calories that Twitter provides me of late.

For the time being, I’m not deleting my Twitter account, deleting any tweets, or even taking the account dark. But I have deleted the apps from my computers and iPhone, and I don’t intend to check Twitter with any regularity any longer. I continue to be reachable via email, and cell phone, as well as Signal and Telegram.


28
Aug 16

It doesn’t have to be a crapfest

A  bit ago, this blog post crossed my Twitter feed. I read it, and while the schadenfreude made me smirk for a minute, it eventually made me feel bad.

The blog post purports to describe how a shitty shutdown dialog became a shitty shutdown dialog. But instead, it documents something I like to call “too many puppies” syndrome. If you are working on a high-visibility area of a product – like the Windows shell, and Explorer in particular – everybody has a belief that their opinion is the right direction. It’s like dogs and a fire hydrant. My point really isn’t to be derisive here, but to point out that the failure of that project does not seem to be due to any other teams. Instead, it seems to have been due to some combination of unclear goals and a fair amount of the team he was on being lost in the wilderness.

I mentioned on Twitter that, if you are familiar with the organizational structure of Windows, you can see the cut lines of those teams in the UI. A reply to that mentioned Conway’s law – which I was unfamiliar with, but which basically states that, as a general concept, a system designed by an organization will reflect the structure of that organization.

But not every project is doomed to live inside its own silo. In fact, some of my favorite projects that I worked on while I was at The Firm were ones that fought the silo, and the user won. Unfortunately, this was novel then, and still feels novel now.

During the development of Windows Server 2003, Bill Veghte, a relatively new VP on the product, led a series of reviews where he had program managers (PMs) across the product walk through their feature area/user scenario, to see how it worked, didn’t work, and how things could perhaps be improved. Owning the enterprise deployment experience for Windows at the time, I had the (mis?)fortune of walking Bill through the setup and configuration experience with a bunch of people from the Windows Server team.

When I joined the Windows “Whistler” team just before Beta 2, the OS that became Windows XP – described by a teammate as a “lipstick on a chicken” release – was already solidifying, and while we had big dreams of future releases like “Blackcomb” (which never happened), Whistler was limited largely by time to the goal of shipping the first NT-based OS to replace both the ME/9x family for consumers and Windows 2000 in business.

Windows Server, on the other hand, was to ship later. (In reality, much, much later, on a branched source tree, due to the need to <ahem/> revisit XP a few times after we shipped it.) This meant that the Windows Server team could think a bit bigger about shipping the best product for their customers. These scenario reviews, which I really enjoyed attending at the time, were intended to shake out the rattles in the product and figure out how to make it better.

During my scenario review, we walked through the entire setup experience – from booting the CD to configuring the server. If you recall, this meant walking through some really ugly bits of Windows. Text-mode setup. F5 and F6 function keys to install a custom HAL or mass-storage controller drivers during text-mode setup. Formatting a disk in text-mode setup. GUI-mode setup. Fun, fun stuff.

Also, some forget, but this was the first time that Windows Server was likely to ship with different branding from the client OS. Yet the Windows client branding was… everywhere. Setup “billboards” touting OS features that were irrelevant in a server, wizards, help files, even the fact that setup was loading drivers for PCMCIA cards and other peripherals that a server would never need or use in the real world, or verbs on the shutdown menu that made no sense on a server, like standby or hibernate.

A small team of individuals on the server team owned the resulting output from these walkthroughs, which went far beyond setup, and resulted in a bunch of changes to how Windows Server was configured, managed, and more. In terms of my role, I wound up being their liaison for design change requests (DCRs) on the Windows setup team.

There were a bunch of things that were no-brainers – fixing Windows Setup to be branded with Windows Server branding, for example. And there were a ton of changes that, while good ideas, were just too invasive to change, given the timeframe that Windows Server was expected to ship in, (and that it was still tethered to XP’s codebase at that time, IIRC). So lots of things were punted out to Blackcomb, etc.

One of my favorite topics of discussion, however, became the Start menu. While Windows XP shipped with a bunch of consumer items in the Start menu, almost everything it put there was… less than optimal on a server. IE, Outlook Express, and… Movie Maker? Heck, the last DCR I had to say no to for XP was a very major customer telling us they didn’t even want movie maker in Windows XP Pro! It had no place on servers – nor did Solitaire or the Windows XP tour.

So it became a small thing that David, my peer on the server team, and I tinkered with. I threw together a mockup and sent it to him. (It looked a lot like the finished product you see in this article.) No consumer gunk. But tools that a server administrator might use regularly. David ran this and a bunch of other ideas by some MVPs at an event on campus, and even received applause for their work.

As I recall, I introduced David to Raymond Chen, the guru of all things Windows shell, and Raymond and David wound up working together to resolve several requests that the Windows Server team had in the user interface realm. In the end, Windows Server 2003 (and Server SP1, which brought x64 support) wound up being really important releases to the company, and I think they reflected the beginning of a new maturity at Microsoft on building a server product that really felt… like a server.

The important thing to remember is that there wasn’t really any sort of vehicle to reflect cross-team collaboration within the company then. (I don’t know if there is today.) It generally wasn’t in your review goals (those all usually reflected features in your team’s immediate areas), and compensation surely didn’t reflect it. I sat down with David this week, having not talked for some time, and told him how most of my favorite memories of Microsoft were working on cross-team projects where I helped other teams deliver better experiences by refining where their product/feature crossed over into our area, and sometimes beyond.

I think that if you can look deeply into a product or service that you’re building and see Conway’s law in action, you need to take a step back. Because you’re building a product for yourself, not for your customers. Building products and services that serve your entire customer base means always collaborating, and stretching the boundaries of what defines “your team”. I believe the project cited in the original blog post I referenced above failed both because there were too many cooks and because, it would seem, anyone with the power to control the conversation actually forgot what they were cooking.

 

 


27
Jun 16

Compute Stick PCs – Flash in the pan?

A few years ago, following the success of many other HDMI-connected computing devices, a new type of PC arrived – the “compute stick”. Also referred to sometimes as an HDMI PC or a stick PC, the device immediately made me scratch my head a bit.

If Windows 10 still featured a Media Center edition, I guess I could sort of see the point. But Windows, outside of Surface Hub (which seemingly runs a proprietary edition of Windows), no longer features a 10′ UI in the box. Meaning, without third-party software and nerd-porn duct tape, it’s a computer with a TV as a display, and a very limited use case.

Unlike with Continuum on Windows 10 Mobile, I’ve never had a licensing boot camp attendee ask me about compute sticks (and almost none ever asked us about Windows To Go, the feature that boots Windows Enterprise edition off of USB on a random PC).

The early sticks featured 2GB of RAM or less, really limiting their use case even further. With 4GB, more modern versions will run Windows 10 well, but to what end?

I can see some cases where compute sticks might make sense for point of service, but a NUC is likely to be more affordable, powerful, and expandable, and not suffer from heat exhaustion like a compute stick is likely to.

I’ve also heard it suggested that a compute stick is a good fit for the business traveler. But I don’t get that. Using a compute stick requires you to have a keyboard and pointing device with you, and to find an AC power source behind a hotel TV or in a shared workspace. Now I don’t know about you, but while I used to travel with a keyboard to use with my iPad, I don’t anymore… and I never travel with a spare pointing device. And as to finding AC power behind a hotel TV? Shoot me now.

The stick PC has some use cases, sure. Home theater where the user is willing to assemble the UX they want. But that’s nerd porn, not a primary use case, and not a long-term use case (see Media Center edition).

You eventually reach a point where, if you want a PC while you’re on the go, you should haul a PC with you. Laptops, convertibles, and tablets are ridiculously small, and you don’t always have to tote peripherals with you to make them work.

In short, I can see a very limited segment of use cases where compute sticks make sense. (Frankly, it’s a longer list than Windows To Go.) But I think in most cases, upon closer inspection, a NUC (or larger PC), Windows 10 tablet or laptop, or <gasp/> a Windows 10 Mobile device running Continuum is likely to make more sense.

 


24
Jun 16

An iPad Pro is not a Mac

Last year, Christopher Mims wrote about how Apple should kill off the Mac. Just this week, Apple alumnus Michael Gartenberg wrote that the iPad Pro is the new Mac.

It’s human nature to try and match things up… to simplify, organize, and categorize data points. To say a thing is like another thing, or a thing can replace another thing. But I think doing so today only confuses normal users.

A few months ago, I wrote a post about how you shouldn’t cross-shop the iPad Pro and Surface Pro (or Surface, for that matter), because people kept pondering the two as alternatives to each other.

Someday, we will arrive at the point that an iOS device will be able to meet the requirements of many, perhaps even most, macOS (nee OS X) users. This day is not that day, and this year is not that year.

I travel a fair amount. Almost every other month, I have to fly for work. While my old 15″ Retina MacBook Pro had served me well for some time, I was growing frustrated with three issues (in order):

  1. Battery life
  2. Heat
  3. Screen size.

My Mac’s battery was to the point where, no matter what I did, unless I dialed back every possible thing I could, I was getting less than 3 hours of battery life. I write a lot… and I like to write remotely. Having to find AC power all the time gets really frustrating, and AC also isn’t always available.

I use my laptop as a… laptop. The i5 in my old MBP got hot. Not as bad as the i7 in my old ThinkPad, but toasty – limiting when and where I could <ahem/> comfortably use it.

Finally, with the great unbundling, coach class seating is now hostile to machines over 13″. I found that on Alaska’s planes, if the seat in front reclined on me, I wasn’t going to be working.

So I needed something smaller. Lighter. More efficient.

I’m not a developer. So I don’t need Xcode. I don’t work with Mac versions of most legacy multimedia software from Apple, Adobe, or others. I don’t even play games on my computers. But I work in Microsoft Office every single day. And there are things that I need there. There are mobile versions of the Office applications, and I have an E3 subscription that entitles me to use them.

So as I winnowed down my device options, I was seriously looking at the large iPad Pro. While I’m all thumbs when it comes to drawing (or hand-writing), the Smart Keyboard and iPad Pro make an acceptable (although compromising) combination.

In particular, as I pondered life with the iPad Pro, several caveats came up with the hardware, before I’d even considered the software capabilities.

  1. Not “lappable”
  2. Keyboard of great compromise
  3. Fixed position screen
  4. No secondary pointing device.

Lappability. I hate the term. But it is a thing. “Lappability”. The iPad Pro, like the Surface line (outside of the Surface Book, which is arguably somewhat lappable) is not lappable. It isn’t. If you have to care about where the device sits on your lap before it falls (or how long you can leave it on your lap before the kickstand feels like it is cutting into your flesh), it is not “lappable”.

Compromising keyboard. As I said earlier, I write a lot. I’ve really fallen in love with the keyboard on my old MBP. It is really pleasant to use. The iPad Pro’s keyboard, like Microsoft’s original Touch Covers for the Surface devices, is squishy and has strange key travel. For a writer, I just find the contraption too compromising to work well. I would imagine most developers would as well. Frankly, I’d love to see Apple try a Surface Book-like approach for the keyboard (sans the wacky GPU in the base).

Fixed screen. In terms of the screen, sure – the fixed angle is probably chosen pretty well. But the inflexibility drives me nuts. Sometimes you’re in a plane or conference center, and the sun is hitting the screen just right so you can’t work. Or your neck hurts, so you want to subtly reposition it. Good luck fixing that.

Touch only. Finally, the lack of a pointing device, and the requirement to smear your screen to navigate the device, while standard operating procedure with iOS, and acceptable with certain device use cases, makes me stabby on my daily use work device. I’m staring at Word, PowerPoint, the Web, and a handful of other things throughout the day. I don’t want to be cleaning my screen all day.

So if I’d been willing to compromise on those 4 (I wasn’t), the iPad Pro might’ve been capable of becoming my primary device. But then we hit the software caveats.

  1. Word on iOS is far from full-featured
  2. Working with files in iOS is still a bear
  3. Collaboration through SMB shares is unworkable
  4. Tools I use regularly for workflow are absent.

Word limits. Word on the iPad is very limited compared to Word on the desktop (even just comparing it with Word on the Mac, let alone Windows). I don’t even use VBA, so I don’t care that that is missing. As I mentioned, I have Office 365 for work, so I don’t need additional licensing. But the editing tools on iOS are very… constrained. Tables and outlining, for example, are things I use all the time in Word on the Mac and Windows. No go on iOS. I also find the document reviewing tools on iOS excruciatingly frustrating to use vs. their desktop equivalents.

File handling. Much has been made of the lack of a Finder equivalent in iOS. iOS doesn’t need a Finder per se. But it does need the ability to share certain “universal” files in one location and have any other app be able to open them. Trying to open a PPTX file with rich content in PowerPoint on iOS is ugly. You basically have to copy the file. Need to make edits and save the file back for a colleague to read? Good luck. You’re gonna hurt yourself by the time you finish.

Legacy collaboration. Collaboration through old Windows shares is not workable on iOS. If your org has moved completely to Dropbox or OneDrive (which would be impressive), then you can make this work. Otherwise, you’re using kludgy apps that try to make SMB fit within the parameters of iOS, and they create problems similar to the ones I just outlined. (Even Microsoft’s own Work Folders technology seems basically dead on the vine in favor of OneDrive for Business.) iOS was designed to be standalone and not need file shares. Which is all well and good if you’re a sole proprietor, Web-only, or your whole org is all-in on SaaS-based collaboration software. But most orgs aren’t.

Specialty software. I have several tools that I use regularly – notably BetterTouchTool, and Paw, for work. These don’t have equivalents. I could perhaps get used to not having them, or perhaps find alternatives, but I’d rather not.

Contrary to what you might think, I wouldn’t describe myself as a power user. I run terminal on OS X about as often as I ran regedit on Windows (and for the same duct tapey reasons). But in the end, I found that the iPad Pro and iOS would not, in terms of either hardware or software, meet my needs, without me needing a Mac in addition for certain things.

In the end, I wound up getting the new MacBook, consciously choosing the low end model with the Intel m3 processor. It feels like I see beachballs a little more than with my old MBP, but it isn’t that frequent. More importantly, I have a screen that works great on flights, it runs cool almost all the time (plus it has no fan!), and I can go an insane amount of time without needing my charger.

Apple will surely come out with more iPad Pro hardware/peripherals over time, which will enable new scenarios and flexibility. And iOS and macOS will continue to harmonize, while iOS moves upmarket, to enable more and more software scenarios that were previously exclusive to the Mac. It’s a delicate dance. Building a walled garden around macOS, while expanding the walled garden of iOS.

But the reality is also that there are certain scenarios people should not ever expect iOS to support, like SMB file shares in-box, or replacing built-in apps with third-party equivalents. I just believe that’s not the kind of things that you should expect Apple to do.

In several years, perhaps as few as 2, maybe as many as 5, iOS devices will likely be able to meet the needs of most people who use Macs or Windows PCs today. Some users will compromise their behavior or requirements early and go to iOS. Some will find that iOS just meets their needs, and switch. Some will continue to use Windows and macOS for the foreseeable future. Some scenarios, like developing fully-featured macOS (née OS X) apps (or developing for Windows clients or Linux servers on Macs), will continue to require a Mac, even as Swift development tools likely gain capabilities on iOS.

In the meantime, I think that saying the Mac should go away, or that the iPad is workable for most normals who are knowledge workers, is a real stretch. Probably in time. It’s the direction. But we’re not there yet… not for some time.


07
Jun 16

The Autostadt, brand spaces, and marketing

Following my recent trip to Germany, I’ve spent the last month thinking about the idea of brand spaces. By brand space, I mean the use of a space – be it a single store, a building, or a multi-building space, that a business uses to establish or grow a marketing relationship with their consumers.

Although I hate to fly, I love to travel. (As I like to tell people, “I like to be places”.) I took a few days this year between work events to visit Volkswagen’s hometown of Wolfsburg, Germany, and the Autostadt, located there, as well as VW’s museum. A child of the 1970’s, there is a special place in my heart for the Volkswagen brand. My parents owned a VW Fastback before I was born, and a Dasher and Westfalia camper when I was young, and that’s when I fell in love with cars… So there are considerable emotional links for me associated with the brand.

I had hoped to visit the factory, but it was closed while I was going to be there. But the trip was still worthwhile to me, just to visit the Autostadt.

If you haven’t heard of the Autostadt before, you may not completely understand what this place is. The Autostadt, which is completely owned by VW, was first opened in 2000.

If you’ve ever been to Epcot (as it was created, not as Walt imagined it), or a World’s Fair, you can get an idea what the Autostadt is like. It reminded me more than a little bit of Expo ’86, in Vancouver. Imagine an automotive Epcot, where each brand has a pavilion, instead of each country. It was apparent that brands each had a very different idea of what to do with their pavilion. (A few really struggle to tell a cohesive story.) But Škoda and Porsche’s pavilions, in particular, tell stories that really align with their brands. (The lack of any Bentley presence was odd to me, given that even Ducati, Lamborghini, and Bugatti – brands people often aren’t familiar with the ownership of – are represented at the Autostadt.)

When you enter the Autostadt, you go through a main pavilion to the “park” itself, where each of the pavilions are. It’s much like any Disney park. Single point of entry, and once you’re inside, it’s rather easy to get disoriented. Thankfully since it is situated next to the factory with its tall smokestacks, and the Autostadt itself features two large car towers that store 800 cars, it isn’t quite as mentally all-encompassing as Disney’s parks.

Supposedly, VW spent nearly US$500M initially building the Autostadt. What I can tell you is that they’ve created a fascinating tourist destination for car fans, car-tolerant families of car fans, and families picking up a VW for delivery. An on-premises Ritz-Carlton (with subsidized rooms for those picking up a car), numerous restaurants, several stores and a large design museum create a space that can occupy a day, at least, for fans of one or more of the VW brands. In essence, it’s a brand theme park.

As I walked the Autostadt, I began thinking about the fact that I paid what I did… to visit a marketing exercise. Like Apple and their retail stores, VW built the Autostadt to establish a physical presence for their brand(s) with consumers that want to engage with it. Sure, it’s synthetic. (So is Disney.) Sure, it’s contrived. Some of the brands honestly fail at making the most of their opportunity at the Autostadt (looking at you, Lamborghini), but in terms of creating consumer engagement, it’s an interesting concept.

This all loops back to the concept of brand spaces. A Disney theme park. An Apple Store. More and more consumer brands are struggling with how to bring a brand space into being. Gateway tried for years. Sony did too. Microsoft tried in San Francisco, then tried again with their current foray into retail. I think that a brand space is a space that is welcoming, and created in a manner that reflects the design aesthetics of that brand. But you can’t force it. You can’t just riff off of Apple’s design aesthetic and expect to build a space that consumers will swarm to. Your brand space needs to reflect your brand, and what consumers like about your brand.

Years ago, Best Buy had a brand space. You walked in, and it felt like Best Buy. You either liked it or hated it. But now Best Buy, like most big box retailers, has turned into a micro-mall, infected with store-within-a-store parasites, compromising their own overarching brand. I believe that the creation of a unique brand space is an important component of brands that want to – and will increasingly need to – stay directly engaged with their consumers.

Bear in mind, there are brands, like Intel, that make no sense to establish a brand space for. Intel, like BASF, is really a wholesale brand, no matter how much people wish it wasn’t. Focusing on consumer-level messaging if you aren’t selling to consumers is tilting at windmills. (Here again, Microsoft’s massive presence in enterprise and struggles in the consumer space post-XP make a similar argument.)

I’m not saying that every consumer brand needs to build a retail presence or a theme park. PLEASE, no. But I am saying that it will become increasingly important for brands to consider when and how engagement with their consumers, in a physical way, makes sense, given their brand, their consumers, and the purchasing cycle of those consumers.


09
Feb 16

Taken for a ride

Last week, as I entered the elevator of the building, another tenant turned to me and gleefully exclaimed, pointing across the garage at a new Jeep, “See that Jeep? I think I’m going to buy it.” 

I could tell immediately that this guy (a younger man, in his 20’s) was in trouble. He was smitten. He was a stranger, but sharing all of this, unprovoked. I had just come home from work, after a long editorial review meeting, followed by a trip to the grocery. So the logic gates in my mind were pretty shot. But the dialog basically went as follows:

Him: “Yeah, I just went over to test drive it, and they wanted over $600 a month for it. But I got them down to $350.”

Me: (still sort of shocked at being pulled into this conversation): “Wow. That is… a big difference. Interesting that they’d do that after the end of the month.”*

Him: “I thought so too. He said their sales month ends on the second day of the month.”**

Me: “That’s… different.”

Him: “I told them that I needed to go home and let my dogs out, but if they’d get it down to $300, we’d have a deal.”

We arrive at his floor before mine, and I can’t process the whole conversation before he exits the elevator. He steps off, I say, “Well, good luck in your decision!”

I go up to my floor and head to my apartment. A few minutes pass, and I crunch those data points in my head. I think to myself, “Holy shit. I need to stop that kid.”

I went downstairs, but the Jeep was already gone. I felt guilty for a little bit, but I noticed the next day that the Jeep wasn’t there, an older vehicle was instead. Obviously the deal hadn’t occurred.

Let me explain why I was concerned. When I used to sell cars, there was this bullshit form that they insisted that we use as we sold the car. It was a con. This sales grid is basically it.

It works by trying to nail you down on one or two data points, by making all of the other (important) data points variable. For example, if they see you focus on the monthly payment as the most important vector, they’ll swizzle the deal by making the payment term longer, or lifting your down payment. Other variables that come into play are your trade-in (but you’ll generally get stiffed on that), any options or extended warranties they’ll try to tack on, or the interest rate you’ll pay. Effectively it’s the old shell game con – and they try to appease you by meeting the one or two numbers you won’t waffle on, but do so by being vague about others.
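The arithmetic behind that swizzle is just the standard amortized loan payment formula, and it shows how much room a dealer has to move: stretch the term and pad the down payment, and the monthly number drops by hundreds of dollars while the total cost of the car goes up. Here is a rough sketch with made-up figures (not the actual numbers from the elevator conversation, which I never learned):

  # Standard amortized loan payment: M = P * r / (1 - (1 + r)**-n), where P is
  # the amount financed, r the monthly interest rate, n the number of payments.
  # All figures below are hypothetical, for illustration only.

  def monthly_payment(amount_financed: float, annual_rate: float, months: int) -> float:
      r = annual_rate / 12.0
      if r == 0:
          return amount_financed / months
      return amount_financed * r / (1.0 - (1.0 + r) ** -months)

  price = 32_000.00              # hypothetical sticker price for the Jeep
  scenarios = [
      (2_000, 0.049, 48),        # modest down payment, 4-year term
      (2_000, 0.049, 84),        # same deal, stretched to 7 years
      (8_000, 0.049, 84),        # long term plus a much bigger down payment
  ]
  for down, rate, months in scenarios:
      payment = monthly_payment(price - down, rate, months)
      total = payment * months + down
      print(f"{months} mo, ${down:,} down: ${payment:,.0f}/mo, ${total:,.0f} total")

Run it and the monthly payment falls from roughly $690 to the high $330s, while the total paid climbs by more than a thousand dollars. That is the shell game: the number you anchored on goes down, and the number that actually matters goes up.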

Of course, all of these terms are available when you’re signing the contract, but by that time, backing out of it can be physically or psychologically challenging – if your brain can even clearly process all of the key paperwork being thrown at you.

I don’t necessarily wish I could go back and talk the young man out of buying the Jeep, per se. But I would really like to know more of the metrics involved in the dealer’s side of the negotiations. I suspect that the dealer was primarily playing with the payment term or the down payment.

Many purchases, like a car or a home, can easily become driven primarily by emotion rather than logic. Some salespeople will prey upon this. Never be afraid to walk away from a potential purchase if your emotions are guiding you, and not your brain.

*This dialog happened on February 1. Car dealers are famous for offering deals that expire at the end of the month, as that’s often how sales bonuses from manufacturers line up.

**That the dealer told him this was extremely suspect. I don’t think I believe it. It’s possible, but doubtful. Instead, I think the date given to him was arbitrary, to try and close the deal.


03
Feb 16

Surface Pro and iPad Pro – incomparable

0.12 of a pound less in weight. 0.6 inches more of display, measured diagonally.

That’s all that separates the iPad Pro from the Surface Pro (lightest model of each). Add in the fact that both feature the modifier “Pro” in their name, and that they look kind of similar, and it’s hard to not invite comparisons, right? (Of course, what tablets in 2016 don’t look like tablets?)

Over the past few weeks, several reports have suggested that perhaps Apple’s Tablet Grande and Microsoft’s collection of tablet and tablet-like devices may each have eaten into holiday quarter sales of the other. Given what I’ve said above, I’ve surely even suggested that I might cross-shop one against the other. But man, that would be a mistake.

I’m not going to throw any more numbers at you to try and explain why the iPad Pro and Surface devices aren’t competitors, and shouldn’t be cross-shopped. Okay, only a few more; but it’ll be a minute. Before I do, let’s take a step back and consider the two product lines we’re dealing with.

The iPad Pro is physically Apple’s largest iOS device, by far. But that’s just it. It runs iOS, not OS X. It does not include a keyboard of any kind. It does not include a stylus of any kind. It can’t be used with an external pointing device, or almost any other traditional PC peripheral. (There are a handful of exceptions.)

The Surface Pro 4 is Microsoft’s most recent tablet. It is considered by many pundits to be a “detachable” tablet, which it is – if you buy the keyboard, which is not included. (As an aside, inventing a category called detachables when the bulk of devices in the category feature removable but completely optional keyboards seems slightly sketchy to me.) Unlike the iPad Pro, the Surface Pro 4 does include the stylus for the device. You can also connect almost any traditional PC peripheral to a Surface Pro 4 (or Surface 3, or Surface Book).

Again, at this point, you might say, “See, look how much they have in common. 1) A tablet. 2) A standardized keyboard peripheral. 3) A Stylus.”

Sure. That’s a few similarities, but certainly not enough to say they’re the same thing. A 120 volt light fixture for use in your home and a handheld flashlight also both offer a standard way to have a light source powered by electrical energy. But you wouldn’t jumble the two together as one category, as they aren’t interchangeable at all. You use them to perform completely different tasks.

The iPad Pro can’t run any legacy applications at all. None for Windows (of course), and none for OS X. Therein lies its Achilles heel; it’s great at running iOS apps that have been tuned for it. But if the application you want to run isn’t there, or lacks features found in the Windows or OS X desktop variant you’d normally use (glares at you, Microsoft Word), you’re up the creek. (Here’s where someone will helpfully point out VDI, which is a bogus solution for running legacy business-critical applications that you need with any regularity.)

The Surface Pro offers a contrast at this point. It can run universal Windows platform (UWP) applications, AKA Windows Store apps, AKA Modern apps, AKA Metro apps. (Visualize my hand getting slapped here by platform fans for belaboring the name shifts.) And while the Surface Pro may have an even more constrained selection of platform-optimized UWP apps to choose from, if the one you want isn’t available in the Windows Store, you’ve got over two decades worth of Win32 applications that you can turn to.

Anybody who tells you that either the iPad Pro or the Surface Pro are “no compromise” devices is either lying to you, or they just don’t know that they’re lying to you. They’re both great devices for what they try to be. But both come with compromises.

Several people have also said that the iPad Pro is a “companion device”. But it depends upon the use case as to whether that is true or not. If you’re a hard-core Windows power user, then yes, the iPad Pro must be a companion device. If you regularly need features only offered by Outlook, Excel, Access, or similar Win32 apps of old, then the iPad Pro is not the device for you. But if every app you need is either available in the App Store, you can live within the confines of the limited versions of Microsoft Office for Office 365 on the iPad Pro, or your productivity tools are all Web accessible, then the iPad Pro might not only be a good device for you, but it might actually be the only device you need. It all comes down to your own requirements. Some PC using readers at this point will helpfully chime in that the user I’ve identified above doesn’t exist. Not true – they’re just not that user.

If a friend or family member came to me and said, “I’m trying to decide which one to buy – an iPad Pro or Surface Pro.”, I’d step them through several questions:

  1. What do you want to do with it?
  2. How much will you type on it? Will you use it on your lap?
  3. How much will you draw on it? Is this the main thing you see yourself using it for?
  4. How important is running older applications to you?
  5. How important is battery life?
  6. Do you ever want to use it with a second monitor?
  7. Do you have old peripherals that you simply can’t live without? (And what are they?)
  8. Have you bought or ripped a lot of audio or video content in formats that Apple won’t let you easily use anymore? (And how important is that to you?)

These questions will each have a wide variety of answers – in particular question 1. (Question 2 is a trap, as the need to use the device as a true laptop will lead most away from either the iPad Pro or the Surface Pro.) But these questions can easily steer the conversation, and their decision, the right direction.

I mentioned that I would throw a few more numbers at you:

  • US$1,028.99 and
  • US$1,067.00

These are the base prices for a Surface Pro 4 (Core m3) and iPad Pro, respectively, each equipped with a stylus and keyboard. Just a few cups of Starbucks apart from each other. The Surface Pro 4 can go wildly north of this price, depending upon CPU options (the iPad Pro offers none) or storage options (the iPad Pro only offers one). The iPad Pro also offers cellular connectivity for an additional charge in the premium storage model (not available in the Surface Pro). My point is, at this base price, they’re close to each other, but that is a matter of coincidence. It invites comparisons, but deciding upon these devices based purely on price is a fool’s errand.

The more you want the Surface Pro 4 (or a Surface Book) to act like a workstation PC, the more you will pay. But there is the rub; it can be a workstation too – the iPad Pro can’t ever be. Conversely, the iPad Pro can be a great tablet, where it offers few compromises as a tablet – you could read on it, it has a phenomenal stylus experience for artists, and it’s a great, big, blank canvas for whatever you want to run on it (if you can run it). But it will never run legacy software.

The iPad Pro may be your ideal device if:

  1. You want a tablet that puts power optimization ahead of everything else
  2. Every application you need is available in the App Store
  3. They are available in an iPad Pro-optimized form
  4. The available version of the app has all of the features you need
  5. All of your media content is in Apple formats or available through applications blessed by Apple.

The Surface Pro may be your ideal device if:

  1. You want a tablet that is a traditional Windows PC first and foremost
  2. Enough of the applications you want to run on it as a tablet are available in the Windows Store
  3. They support features like Snap and resizing when the app is running on the desktop
  4. You need to run more full-featured, older, or more power hungry applications, or applications that cannot live within the sandboxed confines of an “app store” platform
  5. You have media content (or apps) that are in formats or categories that Apple will not bless, but will run on Windows.

From the introduction of both devices last year, many people have been comparing and contrasting these two “Pro” devices. I think that doing so is a disservice. In general, a consumer who cross-shops the two devices and buys the wrong one will wind up sorely disappointed. It’s much better to figure out what you really want to do with the device, and buy the right option that will meet your personal requirements.