22
Aug 17

A few thoughts on Windows 10 S…

A few months ago, before Microsoft announced their new Surface Laptop or Windows 10 S, I had several conversations with reporters and friends about what might be coming. In particular, some early reports had hinted that this might be a revision of Windows, something designed for robustness. Some thought it might be more Chromebook-like. Given my daughters' experiences with Chromebooks, those last two sentences contradict each other. But I digress. What arrived, Windows 10 S (AKA “Windows 10 Pro in S Mode”), wasn’t a revision or really much of a refinement. It was a nuanced iteration of Windows 10 Pro, with built-in Device Guard policies and some carefully crafted changes to the underlying OS infrastructure.

Putting the Surface Laptop aside for now (it’s not my laptop, and I’m not its customer), Windows 10 S seems to me to be an OS full of peculiar compromises, with a narrow set of benefits for end users, at least at this time.

I saw this tweet go by on Twitter a bit ago, and several more followed, discussing the shortcomings of Windows 10 S.

In most conversations I’ve had with reporters recently about Windows, I’ve reemphasized my point that what most customers want isn’t “an OS that does <foo>”. They want a toaster.

What do I mean by that? Think about a typical four-slice toaster:

You use it Sunday morning. It toasts.
You use it Monday morning. It toasts.
You use it Wednesday morning. It toasts.

This is what a huge percentage of the populace wants. A toaster. Normals want it. Schools want it. Frankly, most IT workers want it too, because they’re constantly being asked to do more, and given less money to do it with.

The era of tinkering with PCs being fun for normals, and even some technical people, has passed.

So with that in mind, what’s wrong with Windows 10 S? Nothing, I guess. In a way, it is at least a more toasterish model for Windows than we’ve seen before. It’s constrained, and attempts to put a perimeter around the Windows desktop OS to reduce the risk posed by the very features of the OS itself.

I encourage you to read Piotr’s thread, above, before reading further.

Windows 10 S is not:

  • A new edition of Windows (or version, for that matter). It’s effectively a specially configured installation of Windows 10 Pro
  • Redesigned for use with touch or tablets, any more than 10 itself is
  • Cloud-backup enabled or cloud recoverable (this one is a shame, IMHO)
  • Free of Win32 and the quirks and challenges that it brings

Those last two are important. Consumers with iOS devices today are generally used to toaster-like experiences when it comes to backing up and recovering their devices (yes, exceptions exist), ideally to iCloud, or to a Mac or PC in certain circumstances. The last one matters because most of the troublesome battery life issues that hit lightweight, low-energy Windows devices can be traced directly back to the cumbersome baggage of Win32 itself, and to Win32 applications engineered for a time when energy was cheap, PCs were plugged in all the time, and everything was about processor power.

So if Windows 10 S isn’t “all new”, what is it?

Technologically, Windows 10 S is designed for the future. Or at least the future Microsoft wants:

  • It offers almost all features of Pro, and can be easily “upgraded” to Pro
  • It natively supports Azure Active Directory join and authentication, as Pro does, but does not support joining an on-premises Active Directory domain at all
  • It supports Windows Store applications only (UWP, and Desktop Bridge apps if crafted correctly); no other Win32 applications run, beyond those in-box and approved by Microsoft
  • It is secure by default, at least to the degree that the previous objective, plus the built-in Device Guard policies, can deliver (a conceptual sketch follows below)

So it’s an OS that supports the directory, app store, and legacy app distribution models of the future.

A question I’ve been asked several times was, “why no AD join?” – Initially I was just going with the “it’s the directory of the future” theory. But there’s more to it. From the day that AD and Group Policy came into Windows, there was an ongoing struggle in terms of performance and cost. Ask anyone who had a Windows 2000 PC how long they had to wait when they logged on every day. A giant chunk of that was Active Directory. Over time,  Windows added increasing amounts of messaging to tell you what the OS was doing during logon.

If you go back and look at the 10 S reveal, logon performance was a touted feature. I’ve even seen people on Twitter say that’s why they like 10 S better. Why is it better? I’m sure there are other reasons as well, but I’m certain that completely obliterating AD integration yielded a huge performance win.

When I look at 10 S then, particularly the Device Guard-based security, the defenestration of Active Directory, and the use of Pro as an underlying OS rather than a new edition, 10 S feels… kind of like a science experiment that escaped the lab. Frankly, Device Guard always kind of looked that way to me too.

But there’s another angle here too, and it’s kind of a weird one.

I don’t know how much Microsoft is selling Windows 10 S to OEMs for, but price is clearly a factor here. Some have assumed that because it’s based on Pro, that 10 S costs the same, or even costs as much as Home. It is not clear whether that is actually the case.

When announced, Microsoft stated that it would ship on PCs starting at US$189. As I said, price is clearly a factor. Given that a one-time upgrade from 10 S to Pro costs US$49, it seems pretty apparent to me that with 10 S, Microsoft has shifted some of the cost of Pro that used to be borne by OEMs onto consumers. While this US$49 upgrade is basically moot for the remainder of this calendar year, eventually it must be considered, as consumers (and some businesses) will need to pay if they require Pro-only functionality.

So the net effect then is that Windows 10 S devices can be cheaper, at least up-front, than Windows 10 Pro devices (and maybe Home). Users who need Pro can “upgrade” to it.

Here’s where I think this gets really interesting. Before too long, we can expect to see ARM-based devices running Windows 10. I think that these devices could likely come with 10 S on them, resulting in lower purchase prices, as well as a reduced risk vector if users don’t actually need to run their own library of Win32 applications. In a way then, “Windows 10 S on ARM” offers most of the actual value that Windows RT ever delivered, but would offer far more, by supporting Desktop Bridge applications, and a complete upgrade to Pro with support for x86 Win32 applications.

Consumers could pay for the upgrade to Pro if they need to run full Win32, or need to upgrade the device to Enterprise for work. In this scenario, I imagine that Chrome will likely be the reason why a number of 10 S users pay for an upgrade.

Just as with the vaguely unannounced “Windows 10 Pro for Workstations”, there’s always a reason why these changes occur, and a strategic objective that Microsoft has planned. For me, 10 S, especially with a pilot launch on Microsoft’s own Surface Laptop hardware, is pretty clearly a sign of a few directions where the company wants to go.

 


16
Mar 17

Creatures of habit

As I head into the weekend, I’m prepping for my second work trip of the year. First up was Orlando, back in January.

While we often don’t have a ton of free time in some of the cities we hit for our licensing boot camps, my colleague and I usually have a bit free, and in particular, have time for specific dining options.

San Diego is a funny one. I usually have a set of places I like to go to in the cities we hit over and over, but here, it’s my colleague who has one of his very favorite places: Chocolat, downtown. As a fan of both dark chocolate and gelato, he has it on his hit list every year we visit.

It got me thinking about how I tend to hit the same places over and over… I find comfort in a thing, and then don’t want to miss it, or try something new, out of fear that the new thing might not be as good as the one I’ve made this neural investment in.

I’ve got such a regimented routine in some of these cities now; I feel like I need to try and come up with new and unique things to see and do while traveling, even for work, to make sure I’m really experiencing everything I can, rather than seeing the same things over and over again.


02
Feb 17

Little joys can make a difference

Yesterday on Twitter I said:

Find the little joys.

Treasure them.

I mean this sincerely. Our lives can be overwhelming. But I believe that the key to living a life worth living is to find these little joys.

If you follow me on Twitter, you may see these from me sometimes. A video of a bird singing. A photo of a flower. A sunset, or often a photo from a 737 as I fly to or from one of our licensing boot camps.

I may sound full of crap to you at the moment, but I believe this. I was walking out of a coffee shop the other day with my daughters, and my youngest just sort of walked out in front of me, and cut off a stranger who was trying to come in. I paused, let the stranger come in, let my eldest leave, and then left the coffee shop myself.

As we walked to the pet store, as we often do, I asked my youngest, “Can you do me a favor?” She answered, “What’s that?”

I asked her to “Walk mindfully.”

If we go through life each watching out for each other, and perform small kindnesses, while noting these small joys, our lives can become calmer, more meaningful, and we can pass that along to others, who may even return the favor to others.

A few weeks prior to that incident, I was driving out of Kirkland after a busy morning at the office. There’s a construction project that has been going on forever, and will likely be going for a while more. The flagger who controls the flow of trucks into and out of the site has a tendency to wave as you drive by – almost like a “Hey, I hope you’re having a great day!” wave. When I passed her, I thought, that’s silly… but she insisted on making eye contact as I passed – which made me sort of smirk – and she smiled.

I continued driving on this cold day, headed towards the grocery. As I pulled in, an older woman in the lot appeared confused, and a bit flustered. She was standing in the middle of the lane of traffic, clearly looking for her car. I parked and walked over to her, asking if she was okay. She explained her situation: she had gone to a medical appointment at a nearby clinic, and now could no longer remember where her car was parked. I felt awful for her, but she didn’t want to call family, out of embarrassment, and didn’t want a ride home. I gave her my card and told her to call me shortly if she needed a ride. I shopped, and upon coming out, saw her still wandering and looking. I was worried, but stealthily drove around for a while, keeping an eye on her. Eventually, I saw her walking with a nurse; she must have called home for a ride. Short story: her husband called me and told me that the police were looking for her car, and he called back later with an update that they had found it.

That night, as I listened to my youngest play in a band concert, I pondered the events of the day. I’ll always do what I can to help others. But I believe the wave from the young woman in Kirkland had sort of slapped me back to mindfulness, and made me more aware of what was going on around me. As I said, we can pass these small joys, these small kindnesses, forward.

I love when my eldest gets trapped in a bout of giggle fits, or when she excitedly tells me about some academic accomplishment at school.

I love when my youngest locks her curiosity into high velocity, and assaults me with questions. Watching her brain work and digest facts is amazing. She also gives amazing bear hugs.

I love to learn – to learn something new – but I also love to teach. There’s nothing like teaching someone, and seeing that moment when… it clicks.

Since I was a child, I’ve had this tendency to get a shiver down my back as I get enveloped by something someone is teaching me. It reminds me of my favorite teachers over the years; it was a feeling I got when I learned from them too.

I love the small beauties of nature that surround us every day. The mountains. The birds. A sunset. Clouds.

I love it when you’ve had a stressful day or week, and you meet with a good friend – and realize that they’ve got your back.

I love to close my eyes, and think back to the last time I was in Glacier, and how small it made me feel – while appreciating the sheer joy I felt, watching the sun set over Swiftcurrent Lake.


26
Jan 17

The cult of tribalism and the death of the United States

“Death of the United States?”, you ask, shaking your head at the lunacy of a blog post that dares to suggest such a thing.

As we sit here in 2017, days into a new administration, we are faced with a dangerously narcissistic man in the White House who has alleged voter fraud based on no provable facts, only his own opinion; a press secretary who parrots whatever he is told, whether it is provably false or not; a chief strategist who has openly discussed destroying the republic; and an advisor/press liaison who openly suggested that “alternative facts” are anything other than a lie.

A few weeks ago, I met with a friend for drinks. I shared with him a thesis that I had come up with earlier in the day, which went as follows:

There is an opportunity cost to immediate information. Connectedness, absent mindfulness, equals insanity.

What do I mean by that? We consume information rapidly, through Twitter, Facebook, other social media, “always on” news, and innumerable sites competing for our eyes with rapid-fire information that is rarely checked for accuracy. If we don’t stop to question things, reality disappears, and we wind up bathing in a cult of our own tribalism.

If you aren’t familiar with it, I encourage you to read Thinking, Fast and Slow to get a frame of reference here. Here’s a good summary.

In a nutshell, the two parts of our brain are constantly at odds – this entire presidential campaign, rather than being grounded in debate, logic, and considered thought (System 2), was grounded in emotion (System 1).

If you look carefully at the statements that DJT used throughout the campaign, and that he continues to use, there’s a common refrain. What is that refrain?

Fear

His entire campaign was about fear. His speeches preyed upon emotion, rather than logic. He was a fast-twitch candidate, if you will. His bold, often demonstrably false, claims fed the fears of his base. ISIS. Refugees. Immigrants. Overregulation. Jobs. Rampant crime/shootings/carnage. Voter fraud. (A card he continues to play, as it resonates, due to the popular/electoral mismatch.) But the same base that lovingly digested those lies would push back diligently throughout the campaign against press that questioned that “truth”, because doing so would make them question their own beliefs, and the comfortable reality they had created.

As my friend and I talked, he suggested something I hadn’t considered. Maslow’s hierarchy. Humans crave food first – and only at the top are they able to become self-actualized. In other words, “I’m going to watch out for my own interests until I can ensure they’re safe.” In this cult of tribalism I discussed above, people refuse to question their tribe… to question their beliefs. I mean, sure, you should fear ISIS. But good grief. You’re throwing away the very foundation this country was built on if you say “immigrants aren’t welcome here”.

That’s just it. We’ve got this selective reality in this country now, where the hard left will tell you one thing, the hard right will tell you something completely different, the news media all digests it and spits it back out at high velocity. How on earth is anyone supposed to end up with anything but a subjective opinion that mirrors their own existing reality???

We choose whether to listen to others, or to close off and say “my way is right.” And I’ll admit, it’s going to be pretty hard to get someone to see something when their livelihood depends on them not seeing it. People in the coal and petroleum industries will fight you tooth and nail about climate change, because their literal reality depends on your literal reality not being true.

How the hell are we supposed to move forward as a country if we can’t all stop, and think for a moment? A friend used the expression “low vibration minds” as a gentle way to refer to people who are unable, or unwilling, to think beyond themselves. That’s really what this all comes down to: a level of mindfulness. But what if someone doesn’t want to listen, if listening means you question, and/or destroy, the very fabric of who they are?

  • How do you get someone to listen?
  • How do you get someone to listen to the truth? (By this, I mean a calculated, proven, truth.)
  • How do you get someone to listen to the truth that undermines the truth as they understand it, and reality as they want it to be?

When we would fly as kids, I would often ask my mother what made the sky so blue. My brother would say, “It’s not blue. It’s pink.” This used to annoy the hell out of me, because it was provably false. As we find ourselves in this weird alternative reality, it’s important to recognize the exact antics and approach being used by Steve Bannon and others occupying the White House who seem, in my opinion, to not have the best interests of the country in mind with their actions.

Fear is a powerful thing. It fed the marginal approval for Brexit. It fed the marginal approval for DJT. In fact, it’s important to unwind a truism held before both of these votes: the polling said neither was going to pass. Why was the polling wrong? My opinion is… fear. Those willing to vote for these actions, based upon ungrounded, potentially irrational, fears, weren’t willing to have those views questioned. With such overt xenophobia, racial hatred, and anger driving both – and the ricochet of hate that resulted from both, it’s not hard to see why someone might want to be a closet Brexit or Trump backer. Cowardly, IMHO, but not hard to understand; again, support for both was grounded in fear.

Unfortunately, as we already see six days into this administration, those leading it – not necessarily the guy in the chair – choose to continue the antics that played well to his base as standard operating procedure.

However, I would like to offer a few words of advice on dealing with the propaganda-based approach being deployed by this White House administration:

  1. It’s very important for all media, regardless of their political bent, to question provably false statements coming from them.
  2. But understand that when you do, you will be confronted by his staff, and challenged on it, because you are not endorsing the message they want to resonate.
  3. If you continue to try to question, you will in turn be questioned. As with a football star accused of sexual assault, the defense will focus not on rebutting your statement, but on debasing you. Be strong, stand firm, and defend the truth.

I also think that it is critically important at this moment in time, that Americans – “left” or “right”, regardless of faith, gender, race, age, economic strata… that all Americans – including those who represent us in Congress – need to start listening to others, and understanding why they feel the way they feel, why they believe the way they believe, and why they fear the way they fear. We will not move forward as a country with this “my way or else” bullshit. We must work together, even where a precise common ground does not, or likely cannot, exist.

I started out this post with a bold claim. I genuinely believe we are at a dangerous point in our beautiful country’s life, when the men running this country are willing to blatantly lie for their own benefit, and to the detriment of the country, its citizens, and the world at large.


26
Jan 17

Is this for Dieter?

A quick audio clip I recorded last night, of an amusing event from my childhood. Apologies if the audio isn’t perfect.


05
Jan 17

Kaby Lake Haters…

There has been much written over the past year about Intel and the arrival of the end of Moore’s law – at least as we knew it.

Earlier today, a friend sent me a link to an Ars Technica piece discussing Kaby Lake, and what a letdown it was in terms of desktop CPU performance momentum.

I’m going to let you in on a little secret. The desktop CPU is dead. Don’t tell your friends who are big desktop gamers… they’ll never forgive you for crushing their dreams. But it is true. Gaming and VR will surely continue to have a place in the PC realm. But these aren’t mainstream scenarios – they aren’t the things that the broadest section of normals seek to have their personal computing devices do for them at home. They want a (personal) device that they can use for Web browsing, email, video, and perhaps productivity and some casual gaming. This device is most likely not plugged in all that often, and lives… around the house, not tethered to a dusty desk.

When I started working with Windows 25 years ago, it was an ongoing battle of wits, where Windows was constantly pushing the boundaries of what software asked from hardware. It practically felt like it was Microsoft’s goal… or… responsibility? to keep pushing the CPU requirements. This helped drive a virtuous cycle where Windows demanded a new computer, which demanded a new version of Windows, and so on. I particularly recall a Christmas with my ex-wife’s aunt and uncle when we were newlyweds in 1996, when their big gift to the whole family was a new Gateway PC.

The irony of the PC was that, for a long time, it wasn’t really a PC (Personal Computer). It was a Family Computer. That’s what my family had when we owned an Apple //e. That’s what my ex-wife’s cousins got that Christmas. PCs were so expensive that they were a family purchase every few years, for the family to share.

Most of my generation may remember a single family phone line, which splintered into call waiting, multiple lines, and finally personal cell phones and the (near) death of the communal home phone line. Then came the arrival of cheap PCs, tablets, and most importantly, very full-featured smartphones that, for many people, have replaced the desk-dependent PC. With the 15th anniversary of the release of Windows XP just behind us, I recall the Fast User Switching feature it delivered, and how it was, in a way, a nod to the future, where the devices around us would be truly personal – in terms of ownership, how they’re chosen/tweaked/replaced, and (sigh) managed or not.

I can’t speak to explicitly why Intel chose to insert Kaby Lake into their release cycle. Much like I can’t speak to why Apple chose to release a MacBook Pro in 2016 that was based off of Skylake (the prior year’s chip). I can’t speak to either one because I wasn’t included in any of the design meetings where the decisions behind both were made.

But I can pull out my handy-dandy Jump to Conclusions mat, and suggest the reasons why both of these things happened.

The market is changing underneath Intel, and underneath the Mac. I firmly believe that Apple invests in the Mac proportionally to the revenue the Mac returns – and if Apple chose to assign additional millions of dollars/year in Mac engineering R&D, it would not result in sales growth that reflects the investment. The scenario appears somewhat similar for the iPad lineup. However, Apple invests significantly in the iPhone, where R&D in equates to sales out. The device we’ll likely see in 2017 for the 10th anniversary of the iPhone will most likely reflect this. The Mac lineup, on the other hand, will be a bit of a cobbler’s child… getting hand-me-down technology.

Apple is building the device that addresses their mass market. But wait, why am I talking about Apple so much, in a blog post about Intel?

Intel is facing the real complexities of the end of Moore’s Law as we know it. There’s a finite limit to how far you can take Intel’s “tick-tock” cycle of shrinking and refining CPUs using today’s technology.

More importantly, chip performance or “horsepower” isn’t the main thing most consumers shop for anymore. This is akin to people today who still love massive V-8 engines while most manufacturers are working on squeezing performance out of turbocharged V-6s or even inline-fours in order to achieve better energy efficiency (and, more importantly for their federal compliance, better fleet-wide efficiency). Consumers want battery life, quiet, and minimal heat. These are things that do not equate to monster innovations in raw CPU performance.

The move towards electric automobiles is of course another great analogy to take forward here – with ARM taking the role of electrics, and Intel being conventional fuels. Intel is working diligently to do performance limbo, and take their x64 architecture as low as it can go, to deliver great battery life while delivering good performance and great video playback. (See the earlier Ars Technica reference and the mention of 4K video in Kaby Lake.) Here Intel is focused on building distinct innovation into CPU releases that directly benefits the broadest section of the market, not just people looking for raw performance gains YoY.

In the consumer “PC” market (accepting the squishy, ever-evolving definitions of “tablet” and “PC”), any chance for Intel’s broader market strength will come from trying to compete with rapidly increasing performance from ARM chips. (Witness the Snapdragon 835 chip, which Microsoft plans to run x86 legacy Windows [Win32] applications on. We’ll see in time how that works out.)

For people who care more about speed and raw performance from their processors, the future likely isn’t full of roses. It’s likely to look more and more disappointing, much like the future for “motor heads” does. They’ll have to shop at the highest end of the market, or, as a long shot, hope for another vendor to address the niche with likely expensive, specialized products.


08
Dec 16

Windows 10 on ARM. What does it mean?

Yesterday, when I heard the news from Microsoft’s WinHEC announcements stating, “Windows 10 is coming to ARM through a partnership with Qualcomm”, my brain went through a set of loops, trying to grasp what this really was, and what it really meant.

Sure, most of us have seen the leaks over the past few weeks about x86 on ARM, but I hadn’t seen enough to find much signal in the noise as to what this was.

But now that I’ve thought about it, most of it makes sense, and if we view the holistic Windows 10 brand as step 1, this is step 2 of blurring the line of what a Windows PC is.

Before we look forward, a bit of history is important. Windows RT was a complex equation to try and reduce – that is, why did it fail? The hardware was expensive, it wasn’t <ahem/> real Windows, it couldn’t run legacy applications at all, and the value proposition and branding were very confusing. Wait. Was I talking about Windows RT, or Windows on Itanium? Hah. Tricked you – it applies to both of them. But let’s let sleeping dogs lie.

So if the lack of support for Windows legacy applications is a problem, and ARM processors are getting faster, how to best address this? Windows 10, the last version of Windows. Now available in a complex amalgam that will be ARM64 native, but run Win32 x86 applications through emulation.

Let’s take a look at a couple of things here, in terms of Q&A. I have received no briefing from Microsoft on this technology – I’m going to make some suppositions here.

Question 1: What is meant by x86 Win32 applications? Everything? How about 64-bit Win32 applications?

This is actually pretty straightforward. It is, as the name would imply, x86 Win32 applications. That means the majority of the legacy applications written during the lifetime of Windows (those capable of running on 32-bit Windows 10 on x86) should work when running on 64-bit Windows 10 on ARM. In general, unless the software performs some hardware shenanigans, I assume that most applications will work. In many ways, I see this emulation behaving sort of like WOW64, which runs 32-bit Win32 applications on AMD64 systems, albeit with very different internals.

Question 2: Ah, so this is virtualization?

No, this is emulation. You’re tricking x86 Win32 applications into thinking they’re running on a (low-powered) x86 processor.
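To make the distinction with virtualization concrete, below is a toy fetch-decode-execute loop in C: the conceptual core of an emulator, vastly simplified. The instruction set here is invented purely for illustration; the real layer reportedly uses dynamic binary translation (compiling blocks of x86 instructions to ARM64 and caching them) rather than interpreting byte by byte, but the contract is the same: software stands in for the processor the application thinks it’s running on.

```c
/* Toy emulator: software interpreting a "guest" instruction stream.
 * The opcodes are invented for illustration; this is not x86, and it
 * is nothing like Microsoft's actual emulation layer internally. */
#include <stdint.h>
#include <stdio.h>

enum { OP_LOADI = 0x01, OP_ADD = 0x02, OP_PRINT = 0x03, OP_HALT = 0xFF };

int main(void) {
    /* "Guest" program: r0 = 2; r1 = 3; r0 += r1; print r0; halt */
    const uint8_t program[] = {
        OP_LOADI, 0, 2,   /* loadi r0, 2  */
        OP_LOADI, 1, 3,   /* loadi r1, 3  */
        OP_ADD,   0, 1,   /* add   r0, r1 */
        OP_PRINT, 0,      /* print r0     */
        OP_HALT
    };
    uint32_t regs[4] = {0};   /* emulated registers       */
    size_t pc = 0;            /* emulated program counter */

    for (;;) {                /* fetch, decode, execute   */
        uint8_t op = program[pc++];
        switch (op) {
        case OP_LOADI: { uint8_t r = program[pc++]; regs[r] = program[pc++]; break; }
        case OP_ADD:   { uint8_t d = program[pc++]; uint8_t s = program[pc++]; regs[d] += regs[s]; break; }
        case OP_PRINT: { uint8_t r = program[pc++]; printf("r%u = %u\n", (unsigned)r, (unsigned)regs[r]); break; }
        default:       return 0;   /* OP_HALT, or anything unknown */
        }
    }
}
```

A virtualizer, by contrast, runs guest instructions directly on a CPU that already speaks their language and only intercepts privileged operations; an emulator like the loop above has to stand in for the CPU itself, which is why emulating a foreign architecture always costs performance.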

Question 3: Why only 32-bit?

See the next few answers for a crucial piece of this answer, but in short: to save space. You could arguably add support for Win64 (x64, 64-bit) Windows desktop applications, but this would mean additional bloat for the operating system, and offer rapidly diminishing returns. You’re asking a low-powered ARM processor to really run 64-bit applications and make the most of them? No. Get an x64 processor and don’t waste your money.

Question 4: What is the intent here?

As I said on Twitter this morning, “This is not the future of personal computing. This is a nod to the past.” I have written far more words than justified on why Windows on ARM faced challenges. This is, in many ways, the much-needed feature to make it succeed. However, this feature is also a subtle admission of… the need for more. In order to drive Windows the platform forward on ARM, and help birth the forthcoming generations of UWP-optimal systems, there is a need to temper that future with the reality of the past – that businesses and consumers have an utterly wacky amount of time and money invested in legacy Windows desktop applications, and… something something, cold, dead hands. Thus, we will now see the beginning of x86 support on these ARM processors, and a unified brand of Windows that answers “How do I get this?” For consumers, it will mean less confusion. Buy this PC, and it will be a great tablet when you want a tablet, but it will also run all of that old stuff.

Question 5: Why not just use Project Centennial, and recompile these old desktop apps for UWP?

First, for this to succeed, it must be point-and-shoot. No repackaging. No certificate games. No weird PowerShell scripts. No recompilation. Take my ancient printer driver, and it just works. Take my old copy of MS Money that I shouldn’t be using. It just works. Etc. We’re talking old apps that should be out to pasture. On the consumer side, there is no source code to go back to, and no ISV in their right mind will spend time doing the work to support something like this. On the business side, there’s likely nobody around who understands the code or wants to break it. Centennial is a great idea if you are an ISV or enterprise and you want to take your existing Win32 app and begin transmogrifying it into a UWP application through the non-trivial steps needed. But it’s certainly not always the best answer, and it doesn’t do the same thing this will.

Question 6: Wait. So won’t I be able to get ransomware too, then?

I would have to assume the answer to that is… yes. However, it is important to note that Terry showed off Windows 10 Enterprise edition in yesterday’s demo. Why does that matter? Because there, you have the option to use Device Guard to lock down the device, on these PCs that will ship with OEM Windows. That is one mitigation, for orgs willing to pay for Enterprise. I also assume that there will be an option to turn off the Win32 layer through configuration and GPO.

Question 7: So this is like Virtual PC on PowerPC Macs?

Not exactly. That’s a fine example of emulation, but that was Windows stacked on top of the Mac OS. This looks to be, as it should be, a more side-by-side emulation. Run a UWP app, and all of your resources are running on the ARM side natively.  Run a legacy app, and all your resources are running on the x86 side. Again, the experience should be much like running 32-bit applications on 64-bit Windows, without directory tricks to do it. That’s certainly what I saw in Terry’s demo. Importantly, this means a couple of things. First, you service the whole thing together. This isn’t a VM, and doesn’t require additional steps to service it. Second, where Terry mentions “CHPE = Compiled Hybrid Portable Executable” here, unless I’m misunderstanding, he’s saying that Windows 10 on ARM is basically running fat binaries. It’s two, two, two OS’s in one.

Question 8: Wait. What does that mean?

Well, if I’m understanding their direction correctly, the build includes resources for A64 and x86 in one binary. Meaning that you only need to service one binary to service both… modes? of the OS. Significantly, this also means some on-disk bloat. You’re going to need to have more space for this to work, as you’ve basically got two installs of the OS glued together. Significantly, this is also why you don’t have x64 support too. Because if my theory above holds, adding Win64 would… do amazing things to your remaining disk space.

Question 9: Ah, so UWP is dead?

Heck no. If anything, as I said earlier, this helps UWP in the long run, by reestablishing what Windows is. UWP is still what developers must target if they care about selling anything new, designing for touch, or reaching the collection of devices that Microsoft is driving UWP forward on. I also can envision that this functionality only works when a device is Continuum’d. That is, when you’re docked and ready to work at your desk. This is all about legacy, and your desktop.

Question 10: Ah, so Intel processors are dead?

LOLNO. This is an ARM processor running x86 software. No x64 support. Performance may wind up being fair, but an ARM system will hardly be your destination if you want to do hardcore gaming, data work, development, run VMs… and then there’s the server side, where ARM still has a huge uphill battle ahead of it. This will fill a hole for consumers and low-mid tier knowledge workers. If you cared that the new MBP didn’t have more than 16GB of RAM, well… I digress.

Question 11: Ah, so Windows Mobile is dead?

No. At least not yet. Windows Mobile won’t include this layer, which will likely mean that it also won’t require the storage space. In the long run, a Windows-based ARM64 phone could indeed run Windows 10, and finally blur the line as to what is a Windows phone and what is a Windows PC – and also make Continuum incredibly useful.

 


30
Nov 16

Tired Mac prose

Over the last several weeks, a Skylake full of ink has been spilled over this fall’s Apple crop. Actually, the press seems fascinated with three distinct topics:

  1. Insufficient magic in the 2016 MacBook Pros
  2. Apple “sticking it to pros” by offering limited RAM in the MBP
  3. Apple “sticking it to pros” by not updating the Mac Pro desktop since 2013.

Issue number 1: Beginning the day after the announcement, I had non-technical friends asking me, “what’s the deal with poor, old, beleaguered Apple?”

Okay, I’m exaggerating. That’s not really what they asked. But they were underwhelmed. Tell you what? I was too. I’m not sure what I was expecting, but I was expecting a bit more. The Touch Bar is interesting, but hardly world-changing. The presence of Touch ID is also interesting, and frankly, more relevant, especially for business users of Macs. (Dare I say it, “Mac-using pros”.) But most relevant, IMHO, is the fact that it is thinner and lighter (both also useful to pros who remove it from their desks). The move to USB-C is perhaps annoying today, but in time, will not be a big deal, and potentially very useful in terms of Thunderbolt 3 extensibility.

So is it earth-shattering? No. But it’ll do just fine at filling the backlog of orders that came after Apple had let the MacBook Pro lie dormant for a good long time.

Issue number 2: Apple only provides up to 16GB of RAM (and they didn’t go full Kaby Lake). Last thing first, it’s just late 2016. Nobody goes full Kaby Lake. But to suggest that Apple missed the boat by skipping a secondary tock is wacky. Apple rarely takes a bullet for the industry. We’ll see Kaby Lake and beyond come to the MBP. But it makes no sense to rush it this year.

Now we come to the real meat of the outrage. There’s this fascination – dare I suggest it’s a feedback loop – with the idea that Apple completely doomed the MBP by not enabling more than 16GB of RAM in any of the new devices; that these devices are (paraphrasing) “unsuitable for pros”.

Please.

I offer you a challenge. Using Google, or any tool you’d like, find links to the following three things:

  1. The US$15 burger on the McDonald’s menu
  2. The Tesla convertible
  3. The page on Microsoft’s site where I can build or configure a Surface Pro 4 or Surface Book with more than 16GB of RAM.

Too tongue in cheek? Seriously though… The first two would exist, if there were a large enough market for them. The third would as well, although Microsoft most likely chose to cap it at 16GB for many of the same reasons that Apple did. (Spoiler alert: it was a compromise between the RAM most users need and battery life.) You’ll note that every Mac you can plug in, short of the [somewhat] budget-conscious Mac Mini, does offer options for configuring more than 16GB of RAM, if that is what a user needs.

I’m admittedly on the low-end of the “pro” user market anymore. I couldn’t readily make my living doing what I do without a Windows PC or a Mac. But I don’t ever run an IDE. Like a band saw, that is a thing I’m not qualified to do, and it’s in nobody’s best interest that I do it. I also am a firm believer for about 4 years now in not virtualizing diddly on my Mac. I cut my teeth on the Mac running VMware Fusion from the beginning. Frankly, licensing (Windows-based) stuff to run in VMs on a Mac is a hot mess that your org should be very careful about doing. But that’s not why I don’t do it. I don’t do it because it’s a hot mess of RAM and storage requirements, in an era when both are more limited than in desktop-class laptops of the past. For my needs, I’m better served by buying a laptop that focuses on being a kick-ass laptop (minimal CPU, the RAM and SSD I really need, and a battery that lasts for a delightfully long time, and running VMs in Azure, AWS, or on a desktop. (Or more desktop-like “laptop” that would probably burn my crotch if I really used it for that.) I’m not convinced that the top-tier MBP that Apple created still can’t meet the needs of many (most?) of those who truly need a laptop to do their work.

I feel like a lot of the issue here can be summed up by a tweet of mine from 2014…


There is a number, greater than 0, of business Mac users who truly need a laptop with more than 16GB of RAM, and would pay what it costs for Apple to build in the technology+battery needed to make it happen. I believe that if Apple saw that that number was significant enough, they would build it. That’s what they do. They built an oversized iPhone, when we all said they wouldn’t. They offered a stylus for the iPad, even though that would mean they blew it. If a market that is willing to pay a premium exists, Apple will build a thing to address it. (This also likely explains why Apple is letting their displays go fallow, and perhaps will even let them die completely. We will see whether new displays arrive the next time we see an iMac refresh, likely in 2017.) But I honestly would love to see more detailed descriptions of scenarios where people need more than 16GB in a laptop day to day, and where having a secondary desktop or using cloud-based virtualization wouldn’t meet or exceed their needs instead – especially in cases where people aren’t willing to pay the premium Apple would need to charge for an MBP that could meet those needs. Thoughts on that? Blog about what you do, what you need, and why those wouldn’t work, and post a link to my Twitter.

Issue number 3: Finally, we come to the Mac Pro and signs of life. It has been almost 1,100 days since the last update to the Mac Pro, a high-end desktop Mac that is, significantly, a) really expensive, and b) uniquely, assembled in Austin, TX.

Absent any headline from a recent year stating, “New Mac Pro announced”, many have marked the line for death. Why shouldn’t they? Apple used to make servers. They don’t anymore. Apple used to make wireless routers. They don’t anymore. Apple used to make displays. Whoops, my bad. We’ll see in 2017, but that might be the case as well. I don’t know the stats on how many Mac Pros Apple sells annually, or what the ASP of those units is. It could potentially be a reasonably large chunk of cash, but even given the price of the units, it is most likely a pittance compared to what Apple makes on iPhones, or even on the rest of the mobile Mac+iMac lines. And for better or worse, Apple’s focus, as with most companies of late, has been on shareholder value/returns. Apple gives what it gets. As with a “MacBook Pro Plus” (or whatever the ultimate nerd-spec MBP would be branded), the cost to address this market in a timely manner likely doesn’t mesh with the more aggressive research spend needed to deliver updates faster than the cadence we’re seeing.

So will we see a new Mac Pro anytime soon? Perhaps.

But there seems to be a fair number of rumors saying Apple would rather build a tier of iMac that could address some (but not all) of the scenarios the Mac Pro does, instead of building a new top-shelf desktop PC. Because we are at a reasonable plateau of display technology, of sorts, I could reasonably set aside my distaste for AIOs and say maybe that isn’t a horrible idea. Is it ideal? Not really. The Mac Pro is minimally extensible, and a (27″, Kaby Lake) iMac that addressed its space would need to rely completely upon external extensibility (not that the current Mac Pro doesn’t, outside of RAM or SSD). Any iMac would also be seriously challenged to match the caliber of GPU or CPU power possible with the Mac Pro. The replacement, or suggested replacement, for the Mac Pro is, IMHO, very likely to arrive in 2017.

In terms of both the MBP and Mac Pro, I think Apple will try their best to continue to address the high-end pro market as best they can. But time will tell. In the end, some pros may find Apple’s innovations discouraging. Some will possibly switch to Windows-based PCs, but short of building their own PC, I think many will find the PC OEMs driving towards similar reductions in modularity and cost, and will feel constrained – if less so – when buying a PC of any kind.

There’s a whole other topic to discuss another day, which is the rumbling “Microsoft has stolen the creative mantle from Apple” theory. More on that later…


27
Nov 16

Goodbye, Twitter

Almost exactly three years ago, I decided to kill my Facebook account. Not log off. Delete it. It’s been gone since then, and honestly, I never miss it.

When I signed on to Twitter for the first time in May of 2008, I had no idea what I would do with it. The running joke at the time was that Twitter was primarily used to let others know you were going to/were in/were back from, the bathroom. Colleagues at a startup in Austin even registered a domain name (whospoopin.com) as a joke, in the hopes of creating a “competitor” to Twitter.

During the last 8 and a half years, I’ve used Twitter to make new friends, find old ones, and make connections that sometimes even translate from binary into analog, meeting people from Twitter in person for the first time. As I once said on Twitter, when I was young I didn’t get the point of pen pals; with Twitter, I finally did, and had pen pals around the world.

I’m very disappointed with the state of the world at the moment, and I find that Twitter lately only adds despair, rage, or both to my mood.

In my life, I try to be mindful of whether tools work for me or not, and discard them if they cost me more than they benefit me. Recently, I’ve realized that Twitter’s return on my investment has greatly diminished, and I’m spending far more on it than it gives back. Sure, some of this has to do with this insane election. But it is not just that. Pondering what Twitter means to me has helped me highlight a desire to focus inward on my own personal priorities – my health, my weight, my job, reading and learning, and my other personal interests, ahead of the seemingly empty calories that Twitter provides me of late.

For the time being, I’m not deleting my Twitter account, deleting any tweets, or even taking the account dark. But I have deleted the apps from my computers and iPhone, and I don’t intend to check Twitter with any regularity any longer. I continue to be reachable via email, and cell phone, as well as Signal and Telegram.


28
Aug 16

It doesn’t have to be a crapfest

A bit ago, this blog post crossed my Twitter feed. I read it, and while the schadenfreude made me smirk for a minute, it eventually made me feel bad.

The blog post purports to describe how a shitty shutdown dialog became a shitty shutdown dialog. But instead, it documents something I like to call “too many puppies” syndrome. If you are working on high-visibility areas of a product – like the Windows shell, and Explorer in particular – everybody has a belief that their opinion is the right direction. It’s like dogs and a fire hydrant. My point really isn’t to be derisive here, but to point out that the failure of that project does not seem to be due to any other teams. Instead, it seems to have been due to some combination of unclear goals and a fair amount of the team he was on being lost in the wilderness.

I mentioned on Twitter that, if you are familiar with the organizational structure of Windows, you can see the cut lines of those teams in the UI. A reply to that mentioned Conway’s law, which I was unfamiliar with, but which basically states that a system designed by an organization will reflect the communication structure of that organization.

But not every project is doomed to live inside its own silo. In fact, some of my favorite projects that I worked on while I was at The Firm were ones that fought the silo, and the user won. Unfortunately, this was novel then, and still feels novel now.

During the development of Windows Server 2003, Bill Veghte, a relatively new VP on the product, led a series of reviews where he had program managers (PMs) across the product walk through their feature area/user scenario, to see how it worked, didn’t work, and how things could perhaps be improved. Owning the enterprise deployment experience for Windows at the time, I had the (mis?)fortune of walking Bill through the setup and configuration experience with a bunch of people from the Windows Server team.

When I joined the Windows “Whistler” team just before Beta 2, the OS that became Windows XP (described by a teammate as a “lipstick on a chicken” release) was already solidifying, and while we had big dreams of future releases like “Blackcomb” (which never happened), Whistler was limited largely by time to the goal of shipping the first NT-based OS to replace both ME and the 9x family for consumers, and Windows 2000 for businesses.

Windows Server, on the other hand, was to ship later. (In reality, much, much later, on a branched source tree, due to the need to <ahem/> revisit XP a few times after we shipped it.) This meant that the Windows Server team could think a bit bigger about shipping the best product for their customers. These scenario reviews, which I really enjoyed attending at the time, were intended to shake out the rattles in the product and figure out how to make it better.

During my scenario review, we walked through the entire setup experience – from booting the CD to configuring the server. If you recall, this meant walking through some really ugly bits of Windows. Text-mode setup. F5 and F6 function keys to install a custom HAL or mass-storage controller drivers during text-mode setup. Formatting a disk in text-mode setup. GUI-mode setup. Fun, fun stuff.

Also, some forget, but this was the first time that Windows Server was likely to ship with different branding from the client OS. Yet the Windows client branding was… everywhere. Setup “billboards” touting OS features that were irrelevant on a server. Wizards. Help files. Even the fact that setup was loading drivers for PCMCIA cards and other peripherals that a server would never need or use in the real world, or offering verbs on the shutdown menu that made no sense on a server, like standby or hibernate.

A small team of individuals on the server team owned the resulting output from these walkthroughs, which went far beyond setup, and resulted in a bunch of changes to how Windows Server was configured, managed, and more. In terms of my role, I wound up being their liaison for design change requests (DCRs) on the Windows setup team.

There were a bunch of things that were no-brainers – fixing Windows Setup to be branded with Windows Server branding, for example. And there were a ton of changes that, while good ideas, were just too invasive to make, given the timeframe that Windows Server was expected to ship in (and that it was still tethered to XP’s codebase at that time, IIRC). So lots of things were punted out to Blackcomb, etc.

One of my favorite topics of discussion, however, became the Start menu. While Windows XP shipped with a bunch of consumer items in the Start menu, almost everything it put there was… less than optimal on a server. IE, Outlook Express, and… Movie Maker? Heck, the last DCR I had to say no to for XP was a very major customer telling us they didn’t even want Movie Maker in Windows XP Pro! It had no place on servers – nor did Solitaire or the Windows XP tour.

So it became a small thing that David, my peer on the server team, and I tinkered with. I threw together a mockup and sent it to him. (It looked a lot like the finished product you see in this article.) No consumer gunk, but tools that a server administrator might use regularly. David ran this and a bunch of other ideas by some MVPs at an event on campus, and even received applause for the work.

As I recall, I introduced David to Raymond Chen, the guru of all things Windows shell, and Raymond and David wound up working together to resolve several requests that the Windows Server team had in the user interface realm. In the end, Windows Server 2003 (and Server SP1, which brought x64 support) wound up being really important releases to the company, and I think they reflected the beginning of a new maturity at Microsoft on building a server product that really felt… like a server.

The important thing to remember is that there wasn’t really any sort of vehicle to reflect cross-team collaboration within the company then. (I don’t know if there is today.) It generally wasn’t in your review goals (those all usually reflected features in your team’s immediate areas), and compensation surely didn’t reflect it. I sat down with David this week, having not talked for some time, and told him how most of my favorite memories of Microsoft were working on cross-team projects where I helped other teams deliver better experiences by refining where their product/feature crossed over into our area, and sometimes beyond.

I think that if you can look deeply at a product or service you’re building and see Conway’s law in action, you need to take a step back, because you’re building a product for yourself, not for your customers. Building products and services that serve your entire customer base means always collaborating, and stretching the boundaries of what defines “your team”. I believe the project cited in the blog post I referenced above failed both because there were too many cooks and because anyone with the power to control the conversation seemingly forgot what they were cooking.