Tuesday, June 23, 2020

What Apple Announced at WWDC 2020


It looks like I got some things right, and some things wrong, in my previous post. Let’s look at what Apple actually announced.

Macs with Apple Processors, across the line

Yes, they really are transitioning the whole Mac line to Apple’s own processors. The timeline they announced is “within 2 years” to transition the whole line. I still expect that they’ll end up well within two years, maybe even closer to a year - similar to how the Intel transition was announced as “a year and a half”, but ended up being shorter.

The first ARM Macs will be available this year. This was a surprise to a lot of pundits, but it makes total sense to me, given that several major third-party software vendors (Adobe, Microsoft, and Epic) are on-board with the switch, and have their software working already. I was expecting both Adobe and Microsoft to show working pre-release software, just because it really is that easy to move a modern Mac code base to ARM, and they’ve both recently gone through a fairly-painful 64-bit transition for Catalina.

Rosetta 2, to run existing Intel Mac Applications

The big surprise for me is that they did include x64 compatibility in Big Sur. I’m happy to be wrong about that; it’s obviously good news for users. I just figured that the chance to make a clean break would be very tempting to Apple.

Rosetta 2 uses a combination of translation at install time for applications, and translation at load time for plugins. I think ahead-of-time translation is a good tradeoff: you spend some extra time before the app can start, in exchange for better-quality translation. JIT translation of machine code is hard to balance between performance and latency.

The Rosetta 2 documentation is pretty sparse right now, but I did get the impression that x64 JIT compilers are supported in Rosetta apps, which is interesting. Presumably, when you make the syscall to mark a segment of code executable, they translate it then. Pretty slick, though I wonder how much it’ll cause performance hiccups in, for example, web browsers, which rely heavily on JIT to get adequate performance.
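As an aside, Apple does document a sysctl, sysctl.proc_translated, that lets a process ask whether it’s currently being translated by Rosetta. A minimal Swift sketch:

```swift
import Darwin

// Returns true when the current process is running under Rosetta 2 translation.
// "sysctl.proc_translated" is Apple's documented check; on systems without
// Rosetta the sysctl simply doesn't exist and the call fails.
func isRunningUnderRosetta() -> Bool {
    var translated: Int32 = 0
    var size = MemoryLayout.size(ofValue: translated)
    guard sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0 else {
        return false // sysctl not present: native Intel, or an older OS
    }
    return translated == 1
}
```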

Running iPad and iPhone applications without modification

Another thing that seems to have taken a bunch of people by surprise is that ARM Macs will be able to run iPad and iPhone apps without any modifications. This is a logical outgrowth of the Catalyst software that lets you rebuild iPad apps to run on the Intel version of MacOS. You just don’t need to recompile on the ARM Mac, because they’re already the same processor architecture.

Just like with existing Catalyst applications, developers can add Mac-specific features (e.g. menus) to create a better user experience on the Mac. This really does make UIKit (or SwiftUI) the “obvious” choice for new application development.
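For example, here’s a minimal sketch of what adding a Mac-friendly menu item looks like in a UIKit app, via UIMenuBuilder (the menu title, selector, and key equivalent are made up for illustration):

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    // UIKit calls this while building the main menu system; on the Mac the
    // result becomes the menu bar, on iPad it backs the keyboard shortcut overlay.
    override func buildMenu(with builder: UIMenuBuilder) {
        super.buildMenu(with: builder)
        guard builder.system == .main else { return }

        // Hypothetical command, purely for illustration.
        let export = UIKeyCommand(title: "Export…",
                                  action: #selector(exportDocument),
                                  input: "E",
                                  modifierFlags: [.command, .shift])
        let exportMenu = UIMenu(title: "", options: .displayInline, children: [export])
        builder.insertChild(exportMenu, atEndOfMenu: .file)
    }

    @objc func exportDocument() {
        // A real app would route this to the frontmost scene or document.
    }
}
```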

Porting assistance for Open-Source apps and frameworks

An interesting news item that came out of WWDC is that Apple is going to make patches available to a number of Open Source projects, to help get them up and running on ARM Macs. This includes Chromium, Node.js, OpenJDK, and Electron, which should mean that those projects won’t lag far behind.

So, what’s it all mean?

To me, this seems like just about entirely a win. The new Macs will be faster, use less power, have a larger software library (all those iOS apps), and offer more capabilities.

Some software will get “left behind” in the transition, but not very much, at least in the short term. Running software under Rosetta will likely not be great, performance-wise, but it’ll be adequate for a lot of uses.

There is one major downside, for a certain subset of users

No x64 processor means no Boot Camp support for booting into Windows, and no virtualization software to run your other x86 operating systems. I have a friend who uses Docker extensively as part of his developer workflow, and he’s just going to be out of luck, as far as using a Mac goes.

There is virtualization support on the ARM Macs, but it’s for virtualizing other ARM operating systems. You’ll be able to run an ARM Linux OS inside Parallels (for example), but if your workflow right now includes running code in x64 Windows or Linux, the ARM Macs won’t be a good fit.

What about Games?

It seems like every time a major new MacOS version comes out, they claim it’s going to be “great for games”, but the games mostly don’t actually come.

Having Unity support ARM Macs will definitely make it easier for anyone already using Unity to support the new Macs. But the current version of Unity already supports the Mac, and still a lot of games never make it there, so I don’t think that’s a win. If anything, it’s a loss, since anybody who wants to support the Mac now at least needs to test with both Intel and ARM Macs.

While a lot of big-name games never make it to the Mac, there’s actually quite robust support for the Mac among small indie game developers on Steam and Itch. Again, some of these folks will look at the cost of supporting both kinds of Macs, and decide it’s not worth it, so we’ll probably lose a few there, as well.

But then there are iPad games. A lot of iPad games are “casual games”, which is just fine by me, since I play more of that sort of thing than I do first-person shooters. And given that iPad games will, by and large, “just work” on the new Macs, we may see more iPad game developers aiming a bit more upscale. It’ll be interesting to see.

Will I buy one?

We’ll see what gets announced, but I expect that I will, whenever the “pro” laptops are available. 

Saturday, June 13, 2020

ARM Macs are coming, and faster than you think

ARM Macs and transition timeframes

(note: This is a lightly-edited version of a post originally published on June 13th, 2020)


We all knew this was coming. In fact, some of us have been expecting it for years. Various rumor outlets are saying that Apple will announce at WWDC that they're transitioning the Macintosh line from using Intel's processors to Apple's own processors, which are based on the ARM architecture.


A bunch of people have written extensively on this rumor, but I have a few thoughts that I haven't seen others concentrate on.


One thing you see a lot of disagreement about online is how long it'll take for Apple to convert its whole lineup of Macs to use its own processors, or if it even will. I've seen people say that they think they'll announce a single model of ARM Mac, then over the course of 2-3 years, move all of the product line over to ARM. I've even seen people predict that they'll keep the "Pro" line of computers on x86 for the foreseeable future, and only convert the portable line to ARM.


A case can be made for those positions, but here's what I think: If Apple announces a transition to ARM at WWDC, it'll happen surprisingly quickly. I wouldn't be at all surprised if the first ARM Macs ship before the end of 2020, and the whole line is switched over before the end of 2021. That seems like a pretty far out-there prediction, compared to the "consensus" view, so let's take a look at the previous transitions, and how this one is different.

We've been here before, but then again, this is very different

This will be the third major processor transition for the Macintosh (and the fifth major software transition overall). Originally, the Mac used the Motorola m68k processor family. After 10 years, the m68k family was failing to make regular improvements in performance, and Apple started to look at other options, finally settling on the PowerPC. They moved the Mac products from m68k to PPC over the course of about 18 months. Years later, they transitioned from PowerPC to Intel, over the course of about 12 months. And now, we're apparently on the cusp of another processor transition. How will this one turn out? And most importantly: WHY NOW?


Transition 1: The m68k to PowerPC transition

"Any sufficiently-advanced technology is indistinguishable from magic"

– Arthur C. Clarke


This transition was very difficult, both for Apple and for third parties. At the time that Apple announced the change, they were still using what we now call the Classic MacOS. Large parts of the operating system, and the applications that ran on it, were written in Assembly, with an intimate understanding of the hardware they ran on.


Consequently, Apple developed a processor emulator, which would allow existing m68k code to run on the PowerPC without any changes. You could even have an application load a plugin written for the other architecture. The new PPC version of MacOS maintained a shadow copy of all its internal state in a place where 68k applications could see (and modify) it - that was the level of compatibility required to get anything to work. A heroic effort, and it paid off - most software worked out of the box, and performance was "good enough" with emulated code, because the PPC chips were much faster than the m68k chips they were replacing.


The downside of that sort of transition is that it takes many years to complete. There was relatively little pressure on the third parties to update their applications, because they ran just fine on the new models. Even the MacOS itself wasn't completely translated to native code until several years later. 


Transition 2: The MacOS X transition

“If I need to make that many changes, I might as well drop the Mac, and go to Windows”

– Some Mac developer, a Halloween party in 1999


A few years after the PPC transition, Apple announced MacOS X, and software developers were faced with another transition. Originally, OS X was intended to be a clean break with Classic MacOS, with an all-new underlying operating system, and a brand new API, Cocoa (based on the OPENSTEP software which came in with the NeXT acquisition).


Major developers were (understandably) not enthusiastic about the prospect of rewriting the majority of their existing applications. Eventually, Apple caved to the pressure, and provided Carbon, a "modern" API that kept much of the same structure, but removed some of the more egregious aspects of Classic MacOS programming. Apple made it clear that they considered Carbon a transitional technology, and they encouraged developers to use Cocoa. The reaction from the larger developers was pretty much "meh." Quite a few smaller long-time MacOS developers enthusiastically embraced the new APIs though, appreciating the productivity boost they provided.


A footnote to this chapter of the saga is that the "Developer Preview" versions of Rhapsody, the first Mac OS X releases, actually had support for running the OS on Intel-based PC hardware. That didn't survive the re-alignment which gave us Carbon, and MacOS X 10.0 shipped with support for PowerPC Macs only. 


Things were pretty quiet on the Macintosh front for a few years. New versions of OS X came out on a regular schedule, and Apple kept coming out with faster and better PowerBooks, PowerMacs, iBooks, and iMacs. And then, suddenly, the PowerPC processor line had a few unexpected hiccups in the delivery pipeline.


Transition 3: The Intel transition

“Wait, you were serious about that?”

– Carbon-using developers, overheard at WWDC 2005


The PowerPC processors were looking less and less competitive with Intel processors as time went by, which was an embarrassment for Apple, who had famously built the PowerPC advertising around how much faster their chips were than Intel's. The "G5" processor, which was much-hyped to close the gap with Intel, ran years late. It did eventually ship, in a form that required liquid cooling to effectively compete with mainstream Intel desktop PCs. The Mac laptop range particularly suffered, because the low-power laptop chips from Motorola just...never actually appeared.


And so, Apple announced that they were transitioning to Intel processors at WWDC 2005. I was working in the Xcode labs that year, helping third-party developers to get their code up and running on Intel systems. I worked a lot of "extra shifts", but it was amazing to see developers go from utterly freaked out, to mostly reassured by the end of the week.


For any individual developer, the amount of “pain” involved in the transition was variable. If they’d “kept up with” Apple’s developer tools strategy in the years since the introduction of Mac OS X, no problem! For the smaller indie developers who had embraced Xcode, Cocoa, and all of Apple's other newer framework technology, it actually was a trivial process (with one exception). They came into the lab, clicked a button in Xcode, fixed a bunch of compiler warnings and errors, and walked away with a working application, often in just an hour or so. 


For the developers with large Carbon-based applications built using the Metrowerks compiler, it was a real slog. Because of the CodeWarrior-specific compiler extensions they'd used, different project structures, etc., etc., it was hard to even get their programs to build in Xcode.


The exception to the "it just works" result for the up-to-date projects is any kind of external I/O. Code that read or wrote to binary files, or communicated over a network, would often need extensive changes to flip the "endianness" of various memory structures. Endianness is something you generally don’t need to think about as a developer in a high-level language, especially if you're only developing for one platform, which also just happens to use the same endianness as the Internet does. Luckily, these changes tended to be localized.
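To make that concrete, here’s the kind of byte-order fix that was typically needed, shown as a small Swift sketch (the code of that era would have been C, but the idea is identical): a length field stored big-endian in a file has to be explicitly reinterpreted on a little-endian Intel Mac.

```swift
import Foundation

// A file format stores a 32-bit record length in big-endian ("network") order.
// Loading the raw bytes on a little-endian Intel Mac gives the wrong number
// unless it's byte-swapped; UInt32(bigEndian:) is a no-op on a big-endian CPU
// and a swap on a little-endian one, so the same code works on both.
func readRecordLength(from data: Data) -> UInt32? {
    guard data.count >= 4 else { return nil }
    let bytes = [UInt8](data.prefix(4))
    let storedBits = bytes.withUnsafeBytes { $0.load(as: UInt32.self) } // bytes exactly as on disk
    return UInt32(bigEndian: storedBits)                                // reinterpreted for the host CPU
}
```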


Transition 4: The 64-bit transition

“You can't say we didn't tell you this was coming..."

– Some Apple Developer Tools representative (probably Matthew), at WWDC 2018


The first Intel-based Macs used processors that could only run in 32-bit mode. This is what I consider one of Apple's biggest technology mistakes, ever. They should have gone directly to 64-bit Intel from the get-go, though that would have required waiting for the Core 2 Duo processors from Intel, or using AMD chips, or doing the iMac first, and the notebooks last.


Regardless, after the first year, all Macs were built with 64-bit capable processors, and MacOS started supporting 64-bit applications soon after. Technically, the previous versions of Mac OS X supported 64-bit applications on the "G5" processors, but that was only available in the Power Mac G5, and very few applications (other than ports from workstation hardware) bothered to support 64-bit mode.


Unfortunately for the folks hoping to see a glorious 64-bit future, there was again very little incentive for developers to adopt 64-bit code on MacOS. One of the advantages of Intel-based Macs over the PowerPC versions was that you could reuse code libraries that had been written for Windows PCs. But, of course, almost all Windows applications are written for 32-bit mode, so any code you shared between the Windows and Mac versions of your application needed to be 32-bit. You also can't mix and match 32-bit and 64-bit code in the same process on MacOS. So most MacOS applications remained 32-bit for years after there were no longer any 32-bit processor Macs being sold. Even when OS X 10.7 dropped support for 32-bit processors entirely, most applications stayed 32-bit.


Apple told developers at every WWDC from probably 2006 on that they should really convert to 64-bit. They'd talk about faster performance, lower memory overhead for the system software, and various other supposed advantages. And every year, there just didn’t seem to be any great need to do so, so mostly, all Mac software remained 32-bit. A few new APIs were added to MacOS which only worked in 64-bit applications, which just had the unfortunate effect of those features never seeing wide adoption.


Eventually, Apple's tactics on this issue evolved from promises to threats and shaming. Developers were told at WWDC that 32-bit applications would not be supported "without compromises" in High Sierra. Then, when High Sierra shipped, we found that Apple had added a warning message that 32-bit applications were "not optimized" for the new OS. That got the end users to start asking developers about when they were going to “optimize” for the new operating system. For the better part of a year, many developers scrambled to get their apps converted before MacOS Mojave shipped, because they made the reasonable assumption that the warning message was implying that Mojave wouldn’t support 32-bit applications. But then Mojave shipped, and 32-bit apps ran the same as they ever had, with the same warning that was displayed in High Sierra. And then, in MacOS Catalina, they finally stopped allowing 32-bit software to run at all.


Converting an existing 32-bit Cocoa application to 64-bit is not particularly difficult, but it is...tedious. You end up having to make lots of small changes all over your code. In one project that I helped convert, there were changes needed in hundreds of source code files. We got there, but nobody thought it was fun, and it seemed so pointless. Why inflict this pain on users and developers, for what seemed like no gain?


You couldn't say that we weren’t warned that this was coming, since Apple had been telling developers for literally a decade to convert to 64-bit. But third-party developers were still pretty confused about the timing. Why "suddenly" deprecate 32-bit apps for Catalina? Just to incrementally reduce the amount of maintenance work they needed to do on MacOS? Or to reduce the amount of system overhead by a handful of megabytes on an 8GB Mac? It didn’t make sense. And why did they strongly imply it was coming in Mojave, then suddenly give us a reprieve until Catalina?


Transition 5: The ARM Mac

“The good news is, you've already done the hard part”

– Apple, WWDC 2020


With all of this in mind, I think that the sudden hard push for 64-bit in High Sierra and beyond was a stealth effort to get the MacOS third-party developers ready for the coming ARM transition. When High Sierra shipped, almost all MacOS software was 32-bit. Now that Catalina is out, almost all the major applications have already transitioned to 64-bit. Perhaps the reason the “deadline” was moved from Mojave to Catalina was that not enough of the “top ten” applications had been converted yet?


Prior to finally getting all third-party developers to adopt 64-bit, the transition story for converting to ARM would have been complicated, because the applications were all 32-bit, and the Mac ARM chips would be 64-bit (the iOS platform having had its 64-bit conversion a few years back). Apple would have been telling developers: "First, you need to convert to 64-bit. Then, you can make any changes needed to get your code running on ARM".


Now, it's going to be very simple: "If your application currently builds for Catalina, with the current SDK, you can simply flip a switch in Xcode, and it'll be able to run on the new ARM Macs, as well". That's not going to be literally true for many applications, for various reasons (binary dependencies on some other third-party SDK, Intel-specific intrinsics in performance-critical code, etc). 
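Where those exceptions do bite, the usual pattern is to fence off the Intel-specific fast path behind an architecture check and give the ARM build a portable fallback. A hypothetical Swift sketch (sumOfSquaresSSE is a made-up stand-in for existing intrinsics-based code):

```swift
// Hypothetical example: an app keeps a hand-tuned Intel-only fast path behind
// an architecture check, and gives the ARM build a portable fallback.

#if arch(x86_64)
func sumOfSquaresSSE(_ values: [Float]) -> Float {
    // In a real codebase this would wrap SSE/AVX intrinsics (likely via a C shim);
    // a plain loop stands in here so the sketch compiles.
    return values.reduce(0) { $0 + $1 * $1 }
}
#endif

func sumOfSquares(_ values: [Float]) -> Float {
    #if arch(x86_64)
    return sumOfSquaresSSE(values)              // Intel build: keep the tuned path
    #else
    return values.reduce(0) { $0 + $1 * $1 }    // Apple silicon: portable fallback
    #endif
}
```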


But this time, there is no endianness issue, and no apps will need to change their toolchain - none of the previous issues will be relevant. I also think it's quite likely that there will be very few arbitrary API deprecations in MacOS 10.16, specifically to make this transition as painless as possible for as many developers as possible.


What’s it all mean, then?

"All of these products are available TODAY"

– Steve Jobs's famous tagline, possibly heard again this year?


So then - on to timing. For the Intel transition, there were about 6 months between announcement and availability of the first Intel Mac, and another 8 months before the transition was complete. This time, it's likely that Apple will shrink the time between announcement and availability, because there's comparatively little work that needs to be done to get most applications up and running on the new hardware.


It's possible that they'll announce ARM Macs, shipping that very day. If Adobe and Microsoft are already ready to go on day one, it might even be plausible. I think they'll want to give developers some time to get ready, though. So, I predict 3 months, meaning ARM Macs in September, 2020. And I think they'll move aggressively to put as many Macs on their own processors as they can, because it's all a win for them - lower costs, better battery life, etc, etc.


"But what about the Mac Pro?", you'll hear from some experts. "Nobody's ever produced a Xeon-equivalent performance ARM chip. It'll take years to make such a design, if it's even possible at all".


The obvious comeback here is: Nobody knows what Apple has running in a lab somewhere, except the people who are working on the project. Maybe they already have an 80 Watt ARM powerhouse chip running in a Mac Pro chassis, right now. But even if they don't, I think it's reasonable to look at this from the "Why Now?" perspective, again.


The previous processor transitions were mainly driven by a need to stay competitive in performance with Intel. That is not the case this time, since the desktop/laptop competition is almost exclusively on the same Intel processors that Apple is using. The success, or rather the lack thereof, of other ARM-architecture computers and operating systems (Windows for ARM, Chromebooks) doesn't make a compelling case for a switch to “keep up” with Microsoft or Google, either. So there's no hurry.


Given that there's no external pressure to switch, Apple must think that they have a compelling story for why they're switching. And that has to include the entire product line, since the Mac isn't their most-important product, and they surely aren't going to support two different architectures on it, just to keep existing Mac Pro users happy. They either have a prototype of this Mac Pro class processor ready to go, or they're very sure that they can produce it, and they have a believable roadmap to deliver that product. Otherwise, they’d just wait until they did.

Which Macs are switching first?

“I bet you didn’t see that coming, skeptics!”

– Me, maybe?


Everybody is expecting to see a new MacBook, possibly bringing back the 12-inch, fanless form factor, and taking maximum advantage of the power-saving and cooler operation of Apple’s own chips. Some folks are expecting a couple of different models of laptops.


What I would really love to see (but don’t much expect) is for Tim Cook to walk out on stage, announce the first ARM-based Mac, and have it not be a super-small, low-power consumer laptop product. I want it to be something high-end that decisively outperforms the current Mac Pro, and establishes that this is going to be a transition of the whole line. I think that'd be a bold statement, if they could swing it.


Saturday, April 21, 2018

Movie Mini-Review: Rampage

Rampage was...not good. (Warning - "spoilers" below, though honestly, there's not much to spoil)

I see you folks in the back, snickering and saying "well, what did you expect?". Here's the thing though - Rampage was probably the most-disappointing movie I've seen in the last year. It’s an onion of layered badness. Even The Rock couldn’t save this one. What a joyless, boring, poorly-made movie.

Whenever any movie is adapted from some other medium, whether it's a book, a play, or a video game, some amount of changes are inevitable. You often have to trim the plot or characters of a novel in order to "fit" it into a movie, for example. And the writers and director will want to make their own changes, to adapt the story to the medium, or just to put their own twist on a well-known property.

But I've never seen a film adapt source material whose full plot could be written on a 3x5 note card, and use NONE of it. In theory, Rampage is an adaptation of the classic video game of the same name, which was released in 1986. The player takes control of a giant monster - an ape, a werewolf, or a lizard - and climbs onto buildings, smashes them, and eats people. In those days, video games really didn't have "plots", as such. We didn't have the memory for that :-) The entirety of the plot exposition takes place as text that scrolls on the screen at the start of the game, and you're shown the origin story of whichever of the monsters you've chosen to control. In each case, it's a human being who's mutated into a giant monster by exposure to some dangerous chemical.

The monsters then go on a 128-city tour of North American cities, starting and ending in Illinois, leaving destruction in their wake. In every city, the layout of the buildings is different, and there are additional features in some cities, such as the trains in Chicago, and the bridge over the river in Detroit. As you're destroying the buildings, soldiers will shoot you from the windows, and some will inexplicably throw dynamite. Smashing the windows of the buildings will occasionally reveal surprises, as well - from a roasted chicken which you can eat for extra health, to electrical devices that can shock you and make you lose your grip. The whole thing is gloriously silly, in the way of Saturday afternoon Creature Features, with guys in rubber suits beating each other up in model cities made of balsa wood and paper.

Essentially none of that is in the movie. There is a giant ape, who's named George. But he's not a human who turned into an ape - he's a gorilla affected by a DNA-editing toxin, which causes him to grow very rapidly. His handler, played by The Rock, has to search for an antidote to cure him. This is obviously part of the process of making room for the human characters to actually be the stars, and it hugely alters the tone of the thing, as well. Instead of a fun movie about giant monsters smashing stuff, we get a much more-typical blockbuster hero movie, where the muscular hero dude and his female sidekick have to race against time to save the world (or at least Chicago) from destruction.

Rampage is a game about monsters punching buildings and eating people...but an hour into the movie, there were no buildings wrecked (well, one partially wrecked), and almost nobody got eaten. The film did try to inject some humor into things, but a lot of the funny bits fell pretty flat, because they didn't really fit the "grim and gritty reboot" that the rest of the movie was trying to be.

And then there's the gore, which I found really off-putting. PG-13 is apparently The Uncanny Valley for gore. In little kids movies, there's no gore. In R-rated movies, you can have either realistic gore or ridiculous over-the-top gore, take your pick. In PG-13, you can get enough blood to be disturbing, but not enough to be funny.

On my way in, I saw a couple and their young (maybe 8 or 9 year-old) daughter settling in to watch the movie. I know that all kids are different, and maybe her parents weren't really thinking about the "13" in the PG-13 rating, but this is a movie that starts out with a fairly intense chase scene in a space station filled with blood, severed heads, and detached limbs. Probably not what they thought they were getting, based on the trailer, and the fact that The Rock was the headline star. Unsurprisingly, the little girl was pretty upset after being subjected to that. I didn't see them when I left the theater, but I'm guessing they didn't see the whole thing.

Interestingly, some folks have praised Rampage as "The most faithful video game adaptation" or gone into great detail on the various nods to the source material. I guess it all depends on what you're looking for. For me, adapting something to film, and losing the "soul" of the thing along the way is just sad. Someone could have made a really fun Rampage movie, but this definitely wasn't it.


Monday, November 20, 2017

Mergers: The Good

(intro blog post here)

How to help your acquired employees succeed


Out of the 6 acquisitions I've been involved with, two really stand out as positive experiences, both for the acquired company and for the parent company. Here's what was different about those two mergers, as opposed to the ones that didn't go so well.

Integrate the new team quickly (Apple/NeXT)

In Apple's case, they acquired NeXT both in order to get new technology to base their next-generation OS on, and to get a fully-functional engineering organization. You can't really understand just how screwed up Apple was in 1996 unless you were there, but in the last quarter of the year, Apple lost over a billion dollars. They had, at that point, had 2 or 3 (depending on how you count) different "next generation" OS projects crash and burn, and the latest one, Copland, was on the verge of disaster – I've seen credible evidence that it wouldn't have shipped for another 5 years, if ever. Into all this swirling chaos, we bring the NeXT team, understandably freaked out to be called on to "save" this huge, dysfunctional company from itself.

But one thing that was hugely encouraging, and helped us to all settle in, was how quickly we were integrated into the Apple organization as a whole. Within a month after the acquisition, we were meeting with our counterparts in Cupertino, we had apple.com email addresses, our systems were on the Apple network, and we'd had an army of Apple HR folks over to the NeXT offices to get us transferred over to Apple's payroll and benefits.

It was still a very hard slog, and there was a LOT of anger from folks at Apple who'd had friends laid off right after the acquisition, but feeling like we were legitimately part of the team, and not just a bunch of outsiders, helped us to fight the battles we had to fight.

Put the full support of the larger organization behind the newcomers (LG/WebOS)

After the debacle that was HP's acquisition of Palm (see the "Ugly" segment, coming soon), the folks remaining on the WebOS team were pretty nervous when we were told that we were being sold off to LG. "Oh, great, another absentee owner who will tell us we're important, but then never do anything".

And then we had our first meetings with LG's upper management. And we were told that we would be building the user interface for all of LG's high-end smart TVs, that we were going to ship in less than a year, and that we were expected to deliver something BETTER than the existing NetCast software, which they had been shipping for a few years. "Oh, crap," I thought, "none of us know anything about smart TVs, or TVs in general." But then they told us: "The CEO has expressed his full support of this project, and you'll have as much support as you need".

I really didn't believe that we were going to get "as much support as you need", but sure enough, within a short time period after the acquisition, truckloads of current-generation TVs and prototype logic boards for the next generation started flooding into the office. And in the months after that, truckloads of engineers from Korea, who knew the hardware and the existing NetCast software intimately. Anything we asked for, we got – score one for top-down, authoritarian management style, I guess.

And we did it - a small group of developers, working their asses off, managed to build something in less than a year which was immensely better than the existing product, which had been shipping for several years. The next-generation smart TVs, with a new version of WebOS, were even better. This was definitely a high point for the "acquire a smaller company to bring innovation to the larger company" strategy. And it succeeded because the project had a powerful advocate within the larger company, and a VERY clear vision of what they wanted to accomplish.

Next week

What not to do to make your new employees feel welcome, and how to tell (as an employee) when things are likely to go sour quickly.

Monday, November 13, 2017

Mergers: The Good, The Bad, and The Ugly

You've been acquired how many times?

In my career, I've been fortunate enough to have worked for a number of small software/hardware companies, several of which were absorbed by much larger companies. I thought it'd be interesting to compare and contrast some of the ways the various mergers went well or badly, and what acquiring companies might be able to learn from my experience.

Here's the timeline so far:

  1. I started working for NeXT Software in 1994, they were acquired by Apple in 1996.
  2. I left Apple in 1999 to work for Silicon Spice. They were acquired by Broadcom in 2000.
  3. Broadcom laid me off, and I went back to Apple for a while.
  4. I left Apple in 2005 to work at Zing Systems, which was acquired by Dell in 2007.
  5. I left Dell to go work at Palm in 2009. In 2010, Palm was acquired by Hewlett-Packard.
  6. Hewlett-Packard eventually sold the entire WebOS group to LG.
  7. I left LG to go work for Citrix on GoToMeeting. After 2 1/2 years, the GoToMeeting business was spun off and merged with LogMeIn, Inc.
So I've been part of 6 different merger/acquisition processes at this point, and I feel like I've developed a sense for how you can tell when an acquisition is going to go well, as opposed to going poorly.

Why do big companies buy smaller companies?

When a big company acquires a smaller company, it can be for a variety of reasons. Sometimes it's to acquire a potential competitor, before they can get large enough to threaten the larger company. It can be an "acqui-hire", where they buy the smaller company strictly for its human resources, and have no real interest in the technology or products the smaller company has developed (this happens with social media companies frequently, because skilled developers are hard to find). Or, it can be a case of acquiring a new technology, and a team of experts in that technology, in order to either kick-start a new product, or to kick new life into an existing product. That last one was the primary motivation for all of the acquisitions I've been involved in.

What's the most-common mistake acquiring companies make?

Understandably, big companies often look to smaller companies as an engine to drive innovation. There's a perception that small companies can move faster and be more nimble than larger companies. So there's often a desire to let the new acquisition run itself, as a sort of independent entity inside the larger company. Being hands-off seems like the obviously-right thing to do if increased agility is what you wanted in the first place, but it's generally not as good an idea as it'd seem at first blush.

Yes, you absolutely don't want to break up the functional team you just acquired, and spread them out willy-nilly throughout your company. You don't want to drag them into the bureaucracy and infighting that has marred all of your internal attempts at innovation. But guess what? If you don't make an effort to get them at least nominally integrated with the rest of the company, you will, at best, end up with an isolated group, who continue to do their one thing, but don't meaningfully contribute to your larger organization's bottom line. And the smaller group will also get none of the benefits of scale of being part of the larger group. It's lose-lose.

Examples of the Good, the Bad, and the Ugly

Tune in next Monday (and the Monday after that) for real-life tales of integrations gone well, gone poorly, and gone horribly wrong.

Monday, November 06, 2017

That delicate line between security and convenience

A key problem, maybe the key problem, in software security is how to properly balance user convenience with security. Adding additional security to a system often requires extra work, more time, or other compromises from the end-user. And reasonable people can disagree about where the line is for the appropriate trade-off.

That iPhone camera permissions "flaw"
There was a brief flurry of articles in the news recently, talking about a "flaw" in iOS permissions which would allow applications to take your picture without you being aware. Typically, these were presented with click-bait headlines.


The blog post by the actual security researcher who raised this issue (Felix Krause) is substantially less sensational.


It's good that this issue is getting some attention, but it's important to understand where we're coming from, what the actual issue is, and possible ways to mitigate it. As a quick aside, I find it annoying that the articles say "Google engineer". Yes, Krause works for Google, but this work didn't come out of his "day job"; it's his own work in security research. Also, Android has exactly this same problem, but it doesn't merit a blog post or worldwide news coverage, because apparently nobody expects even minimal privacy from Android devices.

How camera permissions work on iOS today
The current version of iOS asks the user for permission to use the camera the first time that an application tries to access it. After that, if the application is running in the foreground, it can access the camera whenever it wants to, without any additional interaction. And typically, this is actually what the user wants.
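In code, that one-time prompt amounts to something like this minimal Swift sketch using AVFoundation; once the stored status is authorized, the app never has to ask again:

```swift
import AVFoundation

// Ask for camera access only if the user hasn't decided yet; after the first
// prompt, the stored decision applies and a foreground app can start capture
// without any further interaction.
func ensureCameraAccess(_ completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)
    case .notDetermined:
        // Shows the system prompt exactly once; the completion handler may be
        // called on a background queue.
        AVCaptureDevice.requestAccess(for: .video, completionHandler: completion)
    default: // .denied or .restricted
        completion(false)
    }
}
```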

It's convenient and fun to be able to use the built-in camera support in Facebook without having to answer "yes I do want to use the camera" each time that you choose to share a photo on social media. And replacements for the built-in camera app, like Instagram, Snapchat, and Halide, would be pretty much unusable if you had to answer a prompt Every. Single. Time. you wanted to take a photo.

How it used to work
Previous versions of iOS actually required applications to use the built-in camera interface to take pictures. You still only had to give permission once, but it was really obvious when the app was taking your picture, because the camera preview was right there in your face, taking over your screen. This design was widely criticized by app developers, because it made for a really jarring break in their carefully-crafted user experience to have the built-in camera appear, and they couldn't provide a preview that actually showed what was going to be captured (with the rise of photo filters, this is especially problematic).

At some point, Apple added the capability to capture photos and video while presenting the app's own interface. This makes for a more cohesive experience for the end-user, and makes it possible for apps to preview what they're actually going to produce, filters, silly hats, and all. This is clearly a win for the app developers, and I'd argue it is also a win for the end-user, as they get a better experience with the whole picture-taking process.

What's the actual privacy issue here?
I use Facebook to post photos and videos, sometimes. But I don't really want Facebook taking pictures of my face when I'm not using the camera features, and analyzing that data to better serve me content, including advertisements.

If I'm scrolling through my news feed, and Facebook is silently analyzing the images coming through the back camera, so that they can discover my location and serve me ads for whatever business I'm standing in front of, that's intrusive and creepy. If they're reading my facial expression to try to determine how I feel about the items in my news feed, that's even worse.

How Apple can better-inform users
I don't think anybody wants to go back to using the UIImagePicker interface, and I don't think anybody (except possibly security researchers) wants to have to affirmatively give permission every time an application wants to take a picture or video. One alternative that I like (and Krause mentions this in his initial blog post) is some kind of persistent system UI element that indicates that the camera is on. Apple already does something similar with a persistent banner on the top of the screen when applications in the background are capturing audio (for VoIP communications). A little dot in the status area would go a long way, here.

It'd also be really nice to have a toggle in Preferences (or better, in Control Center) to disable the camera system-wide, so if you know you're heading somewhere that you shouldn't be taking pictures, you can temporarily disable the camera.

What users can do to better protect themselves
Obviously, just don't grant camera permission to applications that don't actually need it. I think most social network software falls into this category. Twitter and Facebook don't actually need to access my camera, so I have it disabled for both of them. If you actually DO use Facebook and Twitter to take pictures, then I guess you'll just need to be more aware of the tradeoffs.

If you "have to" enable camera access for certain apps, but you don't fully trust them, there are honest-to-goodness lens caps you can buy which will cover your iPhone camera when you're not using it. Or a piece of tape works. There are even specially-made tape tabs for just this purpose.


Tuesday, October 17, 2017

"Responsible Encryption" - what does that mean?

This weekend I read this excellent article by Alex Gaynor responding to Deputy Attorney General Rod Rosenstein's remarks on encryption to two different audiences last week. Please do go and read it when you get a chance, as it delves into the sadly common tactic of pointing to a bunch of scary criminal incidents, then saying "unbreakable encryption enables criminals and terrorists", without presenting any evidence that those crimes were enabled by encryption technology, or that law enforcement officers were actually hampered in their investigations by encryption.

In fact, in the case of the FBI, Apple, and the San Bernardino shooter, AG Rosenstein repeats all of the same false narrative that we've been presented with before - that the shooter's phone possibly contained vital information, that Apple "could" decrypt the information, and that they fought the FBI's legal attempts to force them to do so. Read my previous blog post (linked above) for background on that line of argument, and how the FBI willfully twists the facts of the case, to try to get something much more far-reaching than what they claim to want.

One thing not addressed directly in Alex's article is the frustration that the FBI and other law enforcement officials have expressed over the inability to execute a legal search warrant, when they're faced with a locked phone, or a communications system that provides end-to-end encryption.

From Rosenstein's remarks to the Global Security Conference
We use the term “responsible encryption” to describe platforms that allow police to access data when a judge determines that compelling law enforcement concerns outweigh the privacy interests of a particular user.  In contrast, warrant-proof encryption places zero value on law enforcement.  Evidence remains unavailable to the police, no matter how great the harm to victims.
First, what a bunch of emotionally-charged words. And again we see the disconnect between what the FBI and other agencies say that they want (a way to unlock individual phones), and what they seem to keep asking for (a key to unlock any phone they can get their hands on).

But the man does have a point - there is some value to society in the FBI being able to execute a valid search warrant against someone's phone, or to "tap" the communications between known criminals. And I think he's also right that that sort of access is not going to be provided if the free market is allowed to set the rules. It'll never be in Apple's or any individual customer's interest to make it easier to access a locked phone. So, it'll come down to a matter of legislation, and I think it's worth the tech folks having this conversation before Congress sits down with a bill authored by the FBI and the NSA to try to force something on us.

The encryption-in-flight question is very complicated (and crypto protocols are hard to get right - see the recent KRACK security vulnerabilities), so I'll leave that for a future post. I do believe that there are reasonable ways for tech companies to design data-at-rest encryption that is accessible via a court order, but maintains reasonably-good security for customers. Here's a sketch of how one such idea might be implemented:

On-device Key Escrow


Key escrow 
The basic idea of key escrow is that there can be two keys for a particular piece of encrypted data - one key that the user keeps, and one that is kept "in escrow" so another authorized agent can access the data, if necessary. The ill-fated Clipper Chip was an example of such a system. The fatal flaw of Clipper (well, one of them) was that it envisioned every single protected device would have its secondary key held securely by the government, to be used in case of a search warrant being issued. If Clipper had ever seen broad adoption, the value of that centralized key store would have been enormous, both economically and militarily. We're talking a significant fraction of the US GDP, probably trillions of dollars. That would have made it the #1 target of thieves and spies across the world.

Eliminating central key storage
But the FBI really doesn't need the ability to decrypt every phone out there. They need the ability to decrypt specific phones, in response to a valid search warrant. So, how about storing the second key on the device itself? Every current on-device encryption solution that I know of provides for the option of multiple keys. And in fact, briefly getting back to the San Bernardino shooter's phone, if the owners of that phone (San Bernardino County) had had a competent IT department, they would have set up a second key that they could then have handed over to the FBI, neatly avoiding that whole mess with suing Apple.
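To make the multiple-key idea concrete, here's a purely hypothetical Swift sketch (using CryptoKit; the names and structure are mine, not any shipping design): the data is sealed with a random data key, and that data key is wrapped twice, once for the user and once for an escrow key.

```swift
import CryptoKit
import Foundation

// Hypothetical envelope encryption with an escrowed second key. Nothing here is
// a real Apple design; it just shows the "two keys for one blob" structure.
struct EscrowedBlob {
    let ciphertext: Data        // the data, sealed with a random data key
    let userWrappedKey: Data    // data key, wrapped with the user's key
    let escrowWrappedKey: Data  // data key, wrapped with the escrow key
}

func seal(_ plaintext: Data, userKey: SymmetricKey, escrowKey: SymmetricKey) throws -> EscrowedBlob {
    let dataKey = SymmetricKey(size: .bits256)
    let rawDataKey = dataKey.withUnsafeBytes { Data($0) }
    // .combined is always non-nil with the default 12-byte nonce.
    return EscrowedBlob(
        ciphertext: try AES.GCM.seal(plaintext, using: dataKey).combined!,
        userWrappedKey: try AES.GCM.seal(rawDataKey, using: userKey).combined!,
        escrowWrappedKey: try AES.GCM.seal(rawDataKey, using: escrowKey).combined!
    )
}

// Either key holder can unwrap the data key and then open the ciphertext.
func open(_ blob: EscrowedBlob, wrappedKey: Data, with wrappingKey: SymmetricKey) throws -> Data {
    let rawDataKey = try AES.GCM.open(AES.GCM.SealedBox(combined: wrappedKey), using: wrappingKey)
    return try AES.GCM.open(AES.GCM.SealedBox(combined: blob.ciphertext),
                            using: SymmetricKey(data: rawDataKey))
}
```

Either wrapped copy is enough to recover the data key, which is exactly the property an escrow scheme needs - and also exactly why where the escrow copy lives matters so much.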

You could imagine Apple generating a separate "law enforcement" key for every phone, and storing that somewhere, but that has all the same problems as the Clipper central key repository, just on a slightly smaller scale. So those keys need to be stored separately. How about storing them on the device itself?

Use secure storage
Not every phone has a "secure enclave" processor like the iPhone, but it's a feature that you'll increasingly see on newer phones, as Apple and other manufacturers try to compete on the basis of providing better privacy protection to their customers. The important feature of these processors is that they don't allow software running on the phone to extract the stored keys. This is what keeps the user's data secure from hackers. So, if the key is stored in there, but the phone software can't get it out, how will the FBI get the key?
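On Apple's platforms you can see this property directly in the API: CryptoKit will generate a key inside the Secure Enclave for you, but it never hands back the raw key material, only an opaque blob that's useless off the device. A small sketch:

```swift
import CryptoKit
import Foundation

// Keys created via CryptoKit's SecureEnclave type are generated inside the
// Secure Enclave; the private scalar never leaves the chip. This throws on
// hardware (or simulators) without a Secure Enclave.
func makeDeviceBoundSigningKey() throws -> SecureEnclave.P256.Signing.PrivateKey {
    try SecureEnclave.P256.Signing.PrivateKey()
}

// Signing works normally...
func sign(_ message: Data, with key: SecureEnclave.P256.Signing.PrivateKey) throws -> Data {
    try key.signature(for: message).derRepresentation
}
// ...but there is deliberately no call that returns the raw private key:
// key.dataRepresentation is an opaque blob only this device's enclave can use.
```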

Require physical access
My preferred solution would be for the secure enclave to have a physically-disconnected set of pins that can be used just for extracting the second key. In order to extract the key, you'd need to have physical access to the device, disassemble it, and solder some wires on it. This is, I think, sufficiently annoying that nobody would try to do it without getting a warrant first.

It also means that nobody can search your phone without taking it out of your possession for a good long while. This seems like a reasonable trade-off to me. If someone executes a search warrant on your house, you'll certainly know about it. There's such a thing as "sneak and peek" warrants, or delayed-notice warrants, where police sneak in and search your home while you're not there, but I'm not particularly interested in solving that problem for them.

Conclusion
Is this a perfect solution? Of course not. But I think something like this is a reasonable place to start when discussing law enforcement access to personal electronics. And I think we want to have this conversation sooner, rather than later. What do you think?