tag:blogger.com,1999:blog-140341772024-03-13T19:01:11.962-07:00Another Day In The Code MinesYet another infrequently-updated blog, this one about the daily excitement of working in the software industry.Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.comBlogger92125tag:blogger.com,1999:blog-14034177.post-60218571631895769552021-03-21T22:53:00.002-07:002021-03-21T22:53:18.978-07:00He went that-a-way!<p> This blog has moved to a new Wordpress site, <a href="http://markbessey.blog" target="_blank">here</a>. All of the old posts have been migrated there, and I'll be writing there, going forward. I just got tired of constantly fighting with the Blogger engine.</p><p>I hope to see all 6 of you subscribers there.</p><p>-Mark</p><p><br /></p>Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com1tag:blogger.com,1999:blog-14034177.post-39487934327752098212020-11-30T23:52:00.002-08:002020-11-30T23:56:51.608-08:00Fact-checking some Apple Silicon exuberanceThe first Apple Silicon Macs are out in customers' hands now, and people are <i>really excited</i> about them. But there are a couple of odd ideas bouncing around on the Internet that are annoying me. So, here's a quick fact check on a couple of the more breathless claims that are swirling around these new Macs.<h2 style="text-align: left;"><br /></h2><h2 style="text-align: left;">Claim: Apple added custom instructions to accelerate JavaScript!</h2><div style="text-align: left;"><span style="font-family: courier; font-size: xx-large;">False.</span></div><div><br /></div><div>There is <i>one instruction</i> on the M1 which is JavaScript-related, but it's part of the standard ARM v8.3 instruction set, not a custom Apple extension. As far as I know, there are no (documented) instruction set extensions for M1 outside of the ARM standard.</div><div><br /></div><div>The "JavaScript accelerator" instruction is <a href="https://developer.arm.com/documentation/dui0801/g/A64-Floating-point-Instructions/FJCVTZS" target="_blank">FJCVTZS</a>, or "Floating-point Javascript Convert to Signed fixed-point, rounding toward Zero". You might well wonder why this *one* operation is deserving of its own opcode. Well, there are a couple of things at work here: All JavaScript numbers are stored as double-precision floats (unless the runtime optimizes that out), and if you do something as simple as applying a bit mask to a number, you've got to convert it from the internal representation to an integer, first.</div><h2 style="text-align: left;"><br /></h2><h2 style="text-align: left;">Claim: Faster reference counting means the M1 can do more with less RAM!</h2><div><span style="font-family: courier; font-size: xx-large;">False.</span></div><div><br /></div><div>This one is just a baffling combination of ideas, mashed together. David Smith, an Apple engineer, posted a thread on Twitter about how the M1 is able to do a retain/release operation much faster than Intel processors. 
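</div><div><br /></div><div>For context, a retain or a release is, at its core, just an atomic increment or decrement of the object's reference count. Here's a minimal sketch of the mechanism in C - purely illustrative, and far simpler than Apple's real NSObject implementation:</div><div><br /></div><pre>#include &lt;stdatomic.h&gt;
#include &lt;stdlib.h&gt;

/* A reference-counted object, reduced to the bare mechanism.
   (Illustrative only - NSObject's actual refcounting is much fancier.) */
typedef struct {
    atomic_size_t refcount;
    /* ... object payload ... */
} object_t;

static object_t *object_create(void) {
    object_t *obj = calloc(1, sizeof *obj);
    if (obj) atomic_init(&obj->refcount, 1); /* creator holds one reference */
    return obj;
}

static void retain(object_t *obj) {
    /* one atomic add - this is the kind of operation being timed */
    atomic_fetch_add_explicit(&obj->refcount, 1, memory_order_relaxed);
}

static void release(object_t *obj) {
    /* one atomic subtract; whoever drops the last reference frees */
    if (atomic_fetch_sub_explicit(&obj->refcount, 1,
                                  memory_order_acq_rel) == 1) {
        free(obj);
    }
}</pre><div><br /></div><div>The cost is dominated by those atomic read-modify-write operations, and that's apparently where the M1 shines.</div>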
<div>They're even faster when translated with Rosetta and run on the M1 than they are running natively on Intel.</div><div><br /></div><blockquote class="twitter-tweet"><p dir="ltr" lang="en">fun fact: retaining and releasing an NSObject takes ~30 nanoseconds on current gen Intel, and ~6.5 nanoseconds on an M1</p>— David Smith (@Catfish_Man) <a href="https://twitter.com/Catfish_Man/status/1326238434235568128?ref_src=twsrc%5Etfw">November 10, 2020</a></blockquote><div>This is a pretty awesome micro-optimization, but it got blown all out of proportion, I think largely due to <a href="https://daringfireball.net/2020/11/the_m1_macs" target="_blank">this post on Daring Fireball</a>, which takes the idea that this one memory-management operation is faster, combines it with a poorly understood comparison between Android and iOS from <a href="https://blog.metaobject.com/2020/11/m1-memory-and-performance.html" target="_blank">this other blog post</a>, and somehow comes to the conclusion that 8GB of RAM in an M1 system is basically equivalent to 16GB on Intel. There's a caveat in there, which I guess isn't being noticed by most people quoting him: if you "really need" 16GB, then you really need it.</div><div><br /></div><div>Yes, reference counting is lighter on RAM than tracing garbage collection, at a potential cost in additional overhead to manipulate the reference counts. But that advantage is between MacOS and other operating systems, not between ARM and Intel processors. It's great that retain/release are faster on M1, but application code was not spending a significant percentage of its time doing that - it was already really fast on Intel. If you actually use 16GB of RAM, then 8GB is going to be an issue, regardless of Apple's optimizations.</div><h2 style="text-align: left;"><br /></h2><h2 style="text-align: left;">Claim: The M1 implements special silicon to make emulating Intel code faster!</h2><div><span style="font-family: courier; font-size: xx-large;">True.</span></div><div><br /></div><div>Amazingly, this one is basically true. One of the most surprising things about Rosetta 2 is that the performance of the emulated code is really, really good. There's a lot of very smart software engineering going into that, of course, but it turns out that there is actually a hardware trick up Apple's sleeve here.</div><div><br /></div><div>The ARM64 and x64 environments are very similar - much more so than PowerPC and x86 were, back in the day. But there's one major difference: the memory model is "weaker" on the ARM chips. There are more cases where loads and stores can be rearranged. This means that if you just translate the code in the "obvious" way, you'll get all sorts of race conditions in the translated code that wouldn't have happened on the Intel processor. And the "obvious" fix for that is to synchronize on every single write to memory, which will absolutely kill performance.</div><div><br /></div><div>Apple's clever solution here is to implement an optional store-ordering mode on the Performance cores of the M1, which the OS can automatically turn on for Rosetta-translated processes, and leave off for everything else. Some random person posted about this <a href="https://github.com/saagarjha/TSOEnabler" target="_blank">on GitHub</a>, along with a proof of concept for enabling it on your own code.</div>
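<div><br /></div><div>To make the problem concrete, here's the textbook "message passing" pattern - a hand-written illustration of the hazard, not anything taken from Rosetta itself. On Intel, the two stores in the producer always become visible in program order; under ARM's weaker model, they can be reordered:</div><div><br /></div><pre>#include &lt;pthread.h&gt;
#include &lt;stdio.h&gt;

/* volatile keeps the compiler from caching or reordering these accesses,
   but it inserts no CPU memory barriers - which is the point here. */
static volatile int payload = 0;
static volatile int ready = 0;

static void *producer(void *arg) {
    (void)arg;
    payload = 42; /* store #1 */
    ready = 1;    /* store #2: on ARM, this one may become visible first */
    return NULL;
}

static void *consumer(void *arg) {
    (void)arg;
    while (!ready) { }                 /* spin until the flag is set */
    printf("payload = %d\n", payload); /* on ARM, this can print 0   */
    return NULL;
}

int main(void) {
    pthread_t p, c;
    pthread_create(&c, NULL, consumer, NULL);
    pthread_create(&p, NULL, producer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return 0;
}</pre><div><br /></div><div>Translated x86 code is full of patterns like this, so without help from the hardware, the translator would have to put a barrier after every store.</div>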
<div><br /></div><div>Apple hasn't documented the TSO switch, because <i>of course</i> they don't want third-party developers to use it.</div><div><br /></div><div>But it's a very clever way to avoid unnecessary overhead in translated applications, without giving up the performance benefits of store re-ordering for native code.</div><div><br /></div><div><h2 style="text-align: left;">Claim: The unified memory on M1 means that you don't need as much memory, since the GPU and CPU share a common pool.</h2><div><span style="font-family: courier; font-size: xx-large;">False.</span></div></div><div><span style="font-family: inherit;"><br /></span></div><div><span style="font-family: inherit;">This one is kind of hard to rule on, because it depends a lot on how you're counting. If you compare to a typical CPU with a discrete GPU, then you will need less <i>total memory</i> in the unified design. Instead of having two copies of a lot of data (one for the CPU, and one for the GPU), they can just share one copy. But then again, if you've got a discrete GPU, it'll have its own RAM attached, and we normally don't count that when thinking about "how much RAM" a computer has.</span></div><div><br /></div><div>An 8GB M1 Mac is still going to have the same amount of usable RAM as an 8GB Intel Mac, so again, this is not a reason to think that an 8GB model will be an adequate replacement for your 16GB Intel Mac.</div><div><br /></div><h2 style="text-align: left;">Claim: Apple's M1 is an amazing accomplishment</h2><div><span style="font-family: courier; font-size: xx-large;">True!</span></div><div><br /></div><div>Leaving aside some of the wilder claims, <i>it is true</i> that these processors represent an exciting new chapter for Apple, and for the larger industry. I'm excited to see what they do next. I have some thoughts about that, for a future post...</div>Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-38890428847148731532020-10-11T16:37:00.002-07:002020-10-11T20:07:08.311-07:00What I learned by NOT playing D&D<p> </p><span id="docs-internal-guid-84a7c6dd-7fff-37a5-56ac-843c283696b8"><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial; font-size: 20pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">What I learned by NOT playing D&D</span></h1><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">I really loved playing Dungeons & Dragons as a kid, but I haven’t played much since becoming an adult. And I haven’t run a game in this century. Every now and then, I’d talk with Yvette about it, and she’d say, “We should write up an adventure, and run it for our friends!”
It always sounded good, but I had doubts - doubts that I could run a game effectively, doubts that I could write an adventure that would hold people’s interest, doubts that people would actually show up regularly…</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Then, something pretty amazing happened. One of Yvette’s friends asked her, “do you know anybody who’s into D&D? My daughter watched a game being played at the library, and she wants to play”. Which was hilarious, since Yvette met this friend through her brother, who was someone she played D&D with, back when they were all kids.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Yvette asked me if I’d be willing to run a game for the girl, M, and her friend, K. And for whatever reason, I said yes. So since late July, I’ve been running a regular weekly D&D game for my wife, two teenage girls, and a couple of sometimes players, including another teenager (a boy), and one of OUR friends (about my age), who’s someone we’ve played games with at gaming conventions.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">And it’s gone really well. I was very stressed before the first few sessions, but it’s gotten a lot easier, and I think I might actually be pretty good at this. I’m definitely much better at it than I ever was as a teenager.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">But I haven’t run a game, or even played much, in the last few decades. How is it possible that I’ve gotten better, without actually practicing the craft? It turns out I have been learning how to do this, while not actually doing it.</span></p><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial; font-size: 20pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">Downtime and “Leveling Up”</span></h1><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">D&D has this idea of “character levels”, which represent how good your character is at being whatever kind of adventurer they are. They go out, kill some monsters, intimidate some guards, steal some ancient artifacts, and they get better at what they do. Except they don’t normally get better at adventuring while they’re adventuring. 
They go out, have an adventure, and then go back to town, where they study and train, and then they go up a level. They call this “downtime”, and it’s part of the natural ebb and flow of getting better at something. I think this works in the real world sometimes, too.</span></p><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial; font-size: 20pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">How I leveled up during my 40 years in the desert</span></h1><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">While I haven’t actually tried to run a game in many years, I have been learning a lot of new skills in the meantime which just so happen to make me better at the things that were hard for me when I tried to do this before.</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14.666666984558105px; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 14.666666984558105px; white-space: pre-wrap;">I always dreaded coming up with characters, locations, and plot lines for each game. It seemed like an impossible task to create a whole world from scratch, populate it with people, and write a story that takes place there, especially when the players might easily take off and do something I hadn’t planned for in advance.</span></p><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial; font-size: 20pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">A little bit of research</span></h1><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">I’ve always been interested in how games “work”, and I’ve bought a lot of rulebooks for games that I’ve never played. And I’ve learned a lot. Looking at how a game like D&D evolved from the versions that I played, to the pinnacle of complexity that it eventually became, and through two subsequent ground-up redesigns, has really taught me a lot about how these games work under the surface.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">And then there was Fate. The Fate RPG is sort of the benchmark for a modern, “rules light” RPG design. It’s very different from D&D in its design, and comes in a cute little booklet that lays out the rules quickly, and then goes into great depth on how to run a collaborative storytelling experience. And it would not be an exaggeration to say that it opened my eyes to a completely different way of playing these sorts of games. 
But before I could hear what Fate had to teach me, I needed to be in the right headspace to be receptive. Me from 20 years ago would have been totally baffled by all of the things “missing” from the Fate rules.</span></p><br /><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial; font-size: 20pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">Letting everyone contribute to the story</span></h1><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 4pt; margin-top: 16pt;"><span style="color: #434343; font-family: Arial; font-size: 13.999999999999998pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">“Aspects are always true” - Fate rulebook</span></h3><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Fate has this idea of “aspects”, which are statements about a character, a location, or an object. Players define aspects for their characters, and anybody involved in a scene can create an aspect on either the location, or any of the characters or objects in the scene. If you’re in a fight in a warehouse, and one of the players throws a molotov cocktail, that might add “the building is ON FIRE” as an aspect of the scene. If a player says that their character has an aspect of “my family is my world”, then that naturally brings up the question of what sorts of complications might arise if someone threatens their family. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">This turns out to be a fantastically useful tool in D&D, too. When a player tells me something about themselves, I can take it as true, and start working parts of it into the story. We might run into that older brother of theirs, or perhaps the group they’re on the run from will send an assassin after them. Who knows?</span></p><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial; font-size: 20pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">Improv, and the power of Yes, and…</span></h1><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">One of the guiding principles of improv is “Yes, and…”. When you’re in a scene, you accept whatever your partner says as true, and you work from there. You don’t negate what they say, and you don’t try to steer them back to your original idea of the scene. 
If someone addresses you as “mother”, then you’re their mother, and you go from there.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">It’s pretty obvious, at the lowest level, that Roleplaying Games would involve...playing a role. But because of the interactive nature of playing a cooperative storytelling game with a group of other people, you can’t actually practice your lines ahead of time. You have to react, and you have to take what the other players say, and build on top of that.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">I have taken a couple of classes in improvisational acting from the local Santa Barbara Improv Workshop, and that’s been super-helpful in teaching me to naturally react to the other players. If I’m playing a religious fanatic Lizard-person, and one of the player characters asks me a question I haven’t prepared an answer for, I can just jump into the headspace of that character, and answer as the character I’m playing would answer.</span></p><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial; font-size: 20pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">Don’t be afraid to steal from your collaborators</span></h1><h3 dir="ltr" style="line-height: 1.38; margin-bottom: 4pt; margin-top: 16pt;"><span style="color: #434343; font-family: Arial; font-size: 13.999999999999998pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">"Good artists borrow, great artists steal" - Pablo Picasso</span></h3><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">The other interesting thing about collaborative storytelling is that it works, even when some of the people involved don’t know that they’re collaborating. 
When the players are speculating about the Big Bad Guy’s evil plan, or what they think the motivations of a minor character might be, I can listen in, and just “borrow” those ideas, and weave them into the story.</span></p><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial; font-size: 20pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">And they </span><span style="font-family: Arial; font-size: 20pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">loved it</span><span style="font-family: Arial; font-size: 20pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">!</span></h1><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">After all that, you may be wondering how the game went. Everybody had a great time, they loved the story, and they really want to play again, after we take a short time off. A bunch of seeds have been planted to connect the characters’ backstories to places and people in the world, I’ve introduced several hopefully-recurring characters, and at least one player is starting to see conspiracies everywhere, which is always fun.</span></p><h1 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 20pt;"><span style="font-family: Arial; font-size: 20pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">What have </span><span style="font-family: Arial; font-size: 20pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">you</span><span style="font-family: Arial; font-size: 20pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;"> leveled up in?</span></h1><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Maybe now is a good time to think about something you “used to do”, and evaluate whether you’re now in a position to apply some of the things you’ve learned in the meantime. See if some of your previous creative blocks are no longer relevant. 
You never know, maybe you “leveled up” when you weren’t watching!</span></p></span><br class="Apple-interchange-newline" />Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com2tag:blogger.com,1999:blog-14034177.post-48823478454217870272020-08-09T16:36:00.003-07:002020-08-09T16:38:38.938-07:00Good takes and bad takes on Apple Silicon<h1 style="text-align: left;">Good takes and bad takes on Apple Silicon</h1><div>There are a lot of people out there that seem to be clinging to some weird ideas about what the Apple Silicon transition is going to look like, and what the Apple Silicon team "can do", in terms of what they'll actually be able to deliver.</div><div><br /></div><h2 style="text-align: left;">Good Takes</h2><div>First, someone who seems to "get it" pretty well, Rene Ritchie. Here are two of his very clear and informative videos about why "Apple Silicon" is not just "ARM", and what that means for the Mac going forward:</div><div><br /></div><h3 style="text-align: left;"><b>Rene Ritchie on YouTube:</b></h3><div><a href="https://www.youtube.com/watch?v=wgEKDGNjqsk" target="_blank">Wrong About the Apple Silicon Mac</a></div><div><a href="https://www.youtube.com/watch?v=OmO_oewq_2c" target="_blank">Why Apple Silicon Will Make Better Macs</a><br /></div><div><span><br /></span></div><div><span>The key takeaway here is that many of the huge advantages that Apple Silicon will bring to the Mac come from <i>everything else</i> that Apple will integrate into the chips, not just from using their own CPU cores. Having said that, I think there's a lot of confusion about the likely performance of those cores. More on that below.</span></div><h2 style="text-align: left;"><span><b><br /></b></span></h2><h2 style="text-align: left;"><span><b>Bad takes</b></span></h2><div><span>I'm not going to link to blog posts, tweets, and YouTube videos of the bad takes, for two reasons. Primarily, because bagging on people who are coming at this from a position of ignorance seems kind of mean. But also, because there are so many bad takes for each of these positions, there's no point in singling out any particular instance of them. Having said that, here are some bits of commonly "known facts" about Apple Silicon that I think are really wrong-headed, and why I think so.</span></div><div><span><br /></span></div><div><span><h3 style="text-align: left;"><span><span><b><br /></b></span></span></h3><h3 style="text-align: left;"><span><span><b>Claim: ARM Macs will never be as fast as Intel Macs</b></span></span></h3><div><span><span>You see a bunch of variations on this, from "obviously, the ARM processors will be for the low-end, but they'll continue to use Intel for the Mac Pro and the MacBook Pro", to "Obviously, the low-end Macs will transition first, and we won't see 'pro' Macs until late next year, at the earliest".</span></span></div><div><span><span><br /></span></span></div><div><span><span><b>My take: You will see 'Pro' Apple Silicon Macs this year</b></span></span></div><div><span><span>Apple's flagship product is the MacBook Pro. It's the product that everybody wants, and also the one that a lot of people buy, especially "Pro Users", whatever that name might mean. Apple will definitely not reserve the new Apple Silicon processors solely for the low end of the Mac range, because the perception of "Apple Silicon means low-end" is not something they want to have stick to the new product line. 
</span></span></div><div><span><span><br /></span></span></div><div><span><span>In addition, based on what we know, and can extrapolate from, the first Apple Silicon processors are likely going to be substantially faster than the Intel processors in the existing laptop range. In single-core performance, the A12Z is already on par with the Intel processors in the MBP line. It's really hard to say what the performance improvement will be from the A12Z to the first Apple Silicon Mac chip, but my best guess is somewhere between 50% and 100% improvement over the A12Z. At that speed, those Apple Silicon chips will just wipe the floor with the current Intel processor MacBook Pros in single-core speed. Beyond that, it's mostly a question of how many "performance" cores go into that processor.</span></span></div><h3 style="text-align: left;"><b><br /></b></h3><h3 style="text-align: left;"><b>Claim: ARM Macs will not support discrete GPUs</b></h3><div class="separator" style="clear: both;"><a href="https://1.bp.blogspot.com/-ywyCSqiBd8s/XzB7LYeAo6I/AAAAAAAAAZs/snDSokzcy40B-Cr4dxQ54f1BcFvCr_57wCLcBGAsYHQ/s2048/Screen%2BShot%2B2020-08-09%2Bat%2B3.36.18%2BPM.png" style="display: block; padding: 1em 0px;"><img border="0" data-original-height="1147" data-original-width="2048" src="https://1.bp.blogspot.com/-ywyCSqiBd8s/XzB7LYeAo6I/AAAAAAAAAZs/snDSokzcy40B-Cr4dxQ54f1BcFvCr_57wCLcBGAsYHQ/s640/Screen%2BShot%2B2020-08-09%2Bat%2B3.36.18%2BPM.png" width="640" /></a></div><div>This is apparently based on a single slide from a single WWDC 2020 presentation: <a href="https://developer.apple.com/videos/play/wwdc2020/10631/" target="_blank">Bring your Metal app to Apple Silicon Macs</a>. Based on seeing "Intel, Nvidia, and AMD GPUs" under the Intel-based Mac heading on one side of the slide, and "Apple GPU" on the other side, under Apple Silicon, some people have apparently concluded that discrete GPU support is not going to be available on Apple Silicon.</div><div><br /></div><div><b>My Take: We really don't know, but it seems unlikely that discrete GPUs will never be supported</b></div><div>The point of the presentation at WWDC was very much not "we won't support discrete GPUs on Apple Silicon". The point of the presentation was "you definitely don't want to assume that 'integrated equals slow', when dealing with Apple Silicon". It's likely that Apple will still have discrete GPU options on some of their Pro devices.</div><div><br /></div><div>However, I would not be at all surprised if the first few Macs released didn't have discrete GPUs, because the integrated GPU will have better performance than the top laptop GPUs currently available. We do know that Apple Silicon Macs will have Thunderbolt and PCIe, so they will have the hardware capability to support discrete GPU configurations, including external GPUs. It's just a question of drivers, at that point. Apple will either write the needed drivers, or pay the GPU vendor to write them, if they're needed to achieve a particular performance level.</div><h3 style="text-align: left;"><br /></h3><h3 style="text-align: left;">Claim: Much existing software will not come to Apple Silicon Macs soon, or indeed at all</h3><div>This is often tied to the argument that "x86 processors are just better for 'heavy processing' than ARM, which are optimized for power efficiency". Given that assumption, they then say you won't see Photoshop, or Logic, or Premiere, or whatever other piece of software on ARM Macs, because they won't be fast enough. 
A different argument is that the effort of porting will be too high, and so third-party developers will not bother porting to the Apple Silicon architecture.</div><div><br /></div><div><b>My Take: Building for Apple Silicon is pretty darn easy, and Rosetta2 is better than you think</b></div><div>I talked about this <a href="http://codemines.blogspot.com/2020/06/some-thoughts-on-macintosh-transition.html" target="_blank">in a previous post</a>, but this transition is going to be much less painful for most developers than the PPC->Intel transition was, or in fact than the transition from x86-32bit to x86-64, which a bunch of developers just went through for Catalina. If an app runs on Catalina, it'll run on Apple Silicon, eventually.</div><div><br /></div><div>I need to be careful about what I say in this next section, because I do have the Apple DTK, and it came with a fairly restrictive NDA, saying not to talk about performance or benchmark results. But I have run a bunch of third-party software under Rosetta2, and other than a compatibility issue related to page size that's described in <a href="https://developer.apple.com/services-account/download?path=/Documentation/Universal_App_Quick_Start_Program_Resources/DTK_Release_notes.pdf" target="_blank">the DTK release notes</a> (you may not be able to read that, without a developer account), almost everything I've tried runs perfectly well. It's actually hard to tell the difference between running under Rosetta, and running something native. It does use more CPU power to run a Rosetta process than a native process, and they're slow to start the very first time, but other than that, it's completely seamless.</div><div><br /></div><div>Someone posted Geekbench scores from a DTK, with Geekbench running under Rosetta, and the performance is 20-30% slower than native (as compared to the iPad Pro running the iOS version natively). Assuming that holds generally, and that the Apple Silicon Mac processors will be substantially faster than the existing Intel Mac processors, I wouldn't be too surprised if for many users, running their existing software under Rosetta would still be a performance improvement over their current Mac. Once native versions of these apps become available, there will be no contest.</div><h3 style="text-align: left;"><br /></h3><h3 style="text-align: left;">Claim: The Surface Pro shows that ARM isn't ready for the desktop</h3><div>The Surface Pro is an obvious comparison to make, because it's an "ARM laptop", running an ARM version of Windows. They're great, for what they are. But they haven't been a great success. The lack of software support, and disappointing performance when emulating x86 code, has been used to justify skepticism of the viability of Apple Silicon Macs.</div><div><br /></div><div><b>My Take: The Surface Pro is a great example of the key differences between Apple and Microsoft.</b></div><div>From a third-party developer's perspective ARM Windows is this weird off-shoot of the main Windows product line. Even if you want to support it, it's a much smaller market than the x86 mainstream Windows family, and so the payoff for the porting work is uncertain. When Apple switches to Apple Silicon, they will completely switch over. At the end of the two year transition, every new Mac will be running on Apple Silicon. 
If you want to be in the Mac software market, you will need to support Apple Silicon.</div><div><br /></div><div>It turns out that there is hardware support for Total Store Ordering, or TSO, built in to Apple Silicon processors. This was somehow discovered by a third party, and they've subsequently released <a href="https://github.com/saagarjha/TSOEnabler">a proof-of-concept</a> for enabling TSO on a per-thread basis on the DTK. The relevance here is that TSO is one of the major differences between the memory model of Intel processors and ARM processors. By providing this switch, Apple have eliminated a huge source of synchronization slowdowns (and potentially bugs) when translating x86 code to ARM code in Rosetta. This is a hardware feature implemented just to support a particular piece of software, and a great illustration of the advantages Apple gets from controlling the entire stack from hardware to applications.</div><div><br /></div><h3 style="text-align: left;"><br /></h3><h3 style="text-align: left;">Claim: Geekbench is not a realistic benchmark, and doesn't reflect real-world performance</h3><div>This is a fun one. Since Geekbench shows the A12Z as being on par with current Intel laptop chips, it must be that the benchmark is wrong, or intentionally skewed in Apple's favor.</div><div><br /></div><div><b>My Take: Benchmarking is hard, but Geekbench is at least trying to be fair</b></div><div>You can see descriptions of the Geekbench micro-benchmarks <a href="https://www.geekbench.com/doc/geekbench5-cpu-workloads.pdf" target="_blank">here</a> and <a href="https://www.geekbench.com/doc/geekbench5-compute-workloads.pdf" target="_blank">here</a>. There's nothing in here that would obviously bias these tests towards or away from any particular processor. They're artificial benchmarks, but are built up of things that real applications actually do - image compression, database operations, etc, etc.</div><div><br /></div><h2 style="text-align: left;">Conclusion</h2><div>The first round of Apple Silicon Macs are going to be setting the conversation for the next year about the wisdom of Apple's rather risky decision to abandon Intel. Apple obviously knows this, and I would not be at all surprised if the first Apple Silicon MacBook Pro (or whatever they call the pro laptop, if they rename it) will be the fastest Mac laptop yet. And the desktop Macs will also be an impressive upgrade over their current counterparts.</div><div><br /></div></span></div>Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-29831048272568652322020-06-23T13:26:00.003-07:002020-07-06T18:29:52.634-07:00What Apple Announced at WWDC 2020<h2>
<span style="background-color: transparent; color: black; font-family: arial; font-size: 26pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">What Apple Announced at WWDC 2020</span></h2>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">It looks like I got some things right, and some things wrong, in <a href="http://codemines.blogspot.com/2020/06/some-thoughts-on-macintosh-transition.html" target="_blank">my previous post</a>. Let’s look at what Apple actually announced.</span></div><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><br /></span></div>
<h3>
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Mac with Apple Processors, across the line</span></h3>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Yes, they really are transitioning the whole Mac line to Apple’s own processors. The timeline they announced is “within 2 years” to transition the whole line. I still expect that they’ll end up well within two years, maybe even closer to a year. Similar to how the Intel transition was announced as “a year and a half”, but ended up being shorter.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">The first ARM Macs will be available this year. This was a surprise to a lot of pundits, but it makes total sense to me, given that several major third-party software vendors (Adobe, Microsoft, and Epic) are on-board with the switch, and have their software working already. I was expecting both Adobe and Microsoft to show working pre-release software, just because it really is that easy to move a modern Mac code base to ARM, and they’ve both recently gone through a fairly-painful 64-bit transition for Catalina.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<h3>
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Rosetta 2, to run existing Intel Mac Applications</span></h3>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">The big surprise for me is that they did include x64 compatibility in Big Sur. I’m happy to be wrong about that, it’s obviously good news for users. I just figured that the chance to make a clean break would be very tempting to Apple.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Rosetta 2 uses a combination of translation at install time for applications, and translation at load time for plugins. I think that ahead of time translation is a good tradeoff between taking some extra time to get started, in exchange for getting better translation. JIT translation of machine code is hard to balance between performance and latency.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">The <a href="https://developer.apple.com/documentation/apple_silicon/about_the_rosetta_translation_environment?changes=latest____8_8" target="_blank">Rosetta 2 documentation</a> is pretty sparse right now, but I did get the impression that x64-JIT compilers are supported in Rosetta apps, which is interesting. Presumably, when you make the syscall to make a segment of code executable, they translate it then. Pretty slick, though I wonder how much it’ll cause performance hiccups in, for example, web browsers, which rely heavily on JIT to get adequate performance.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<h3>
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Running iPad and iPhone applications without modification</span></h3>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Another thing that seems to have taken a bunch of people by surprise is that ARM Macs will be able to run iPad and iPhone apps without any modifications. This is a logical outgrowth of the Catalyst software that lets you rebuild iPad apps to run on the Intel version of MacOS. You just don’t need to recompile on the ARM Mac, because they’re already the same processor architecture.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Just like existing Catalyst applications, it’s possible for developers to add Mac-specific features (e.g. menus), to create a better user experience on the Mac. This really does make UIKit (or SwiftUI) the “obvious” choice for new application development.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<h3>
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Porting assistance for Open-Source apps and frameworks</span></h3>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">An interesting news item that came out of WWDC is that Apple is going to make patches available to a number of Open Source projects, to help get them up and running on ARM Macs. This includes Chromium, Node.js, OpenJDK, and Electron, which should mean that those projects won’t lag far behind.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<h3>
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">So, what’s it all mean?</span></h3>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">To me, this seems like just about entirely a win. The new Macs will be faster, use less power, will have a larger software library (all those iOS apps), and more capabilities.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Some software will get “left behind” in the transition, but not very much, at least in the short term. Running software under Rosetta will likely not be great, performance-wise, but it’ll be adequate for a lot of uses.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<h3>
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">There is one major downside, for a certain subset of users</span></h3>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">No x64 processor means no Bootcamp support for booting into WIndows, and no virtualization software to run your other x86 operating systems. I have a friend who uses Docker extensively as part of his developer workflow, and he’s just going to be out of luck, as far as using a Mac goes.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">There is virtualization support on the ARM Macs, but it’s for virtualizing other ARM operating systems. You’ll be able to run an ARM Linux OS inside Parallels (for example), but if your workflow right now includes running code in x64 Windows or Linux, the ARM Macs won’t be a good fit.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<h3>
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">What about Games?</span></h3>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">It seems like every time a major new MacOS version comes out, they claim it’s going to be “great for games”, but the games mostly don’t actually come.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">Having Unity supporting ARM Macs will definitely make it easier for anyone already using Unity to support the new Macs. But the current version of Unity already supports Mac, and still a lot of games never make it there, so I don’t think that’s a win. If anything, it’s a loss, since anybody who wants to support the Mac at least needs to test with both Intel and ARM Macs.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">While a lot of big-name games never make it to the Mac, there’s actually quite robust support for the Mac among small indie games developers on Steam and Itch. Again, some of these folks will look at the cost to support both kinds of Macs, and decide it’s not worth it, so we’ll probably lose a few there, as well.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">But then there are iPad games. A lot of iPad games are “casual games”, which is just fine by me, since I play more of that sort of thing than I do first-person shooters. And given that iPad games will, by and large “just work” on the new Macs, we may see more iPad games developers moving a bit more up</span><span style="font-family: arial; font-size: 11pt; white-space: pre-wrap;">scale. It’ll be interesting to see.</span></div>
<b style="-webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px; caret-color: rgb(0, 0, 0); color: black; font-style: normal; font-variant-caps: normal; font-weight: normal; letter-spacing: normal; orphans: auto; text-align: start; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; widows: auto; word-spacing: 0px;"><br /></b>
<h3>
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">WIll I buy one?</span></h3>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-family: arial; font-size: 11pt; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">We’ll see what gets announced, but I expect that I will, whenever the “pro” laptops are available. </span></div>
<br class="Apple-interchange-newline" />Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-80692098968903298432020-06-13T15:17:00.006-07:002020-06-14T21:47:02.330-07:00ARM Macs are coming, and faster than you think<div><span style="font-family: Arial; white-space: pre-wrap;"><font size="5">ARM Macs and transition timeframes</font></span></div><div><span id="docs-internal-guid-67abce06-7fff-5934-4259-554ba2f4fb6a"><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">(note: This is a lightly-edited version of a post originally published on June 13th, 2020)</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">We all knew this was coming. In fact, some of us have been expecting it for years. Various rumor outlets are saying that Apple will announce at WWDC that they're <a href="https://www.bloomberg.com/news/articles/2020-06-09/apple-plans-to-announce-move-to-its-own-mac-chips-at-wwdc">transitioning the Macintosh line from using Intel's processors to Apple's own processors</a>, which are based on the ARM architecture.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">A <a href="https://daringfireball.net/2020/06/on_apple_announcing_the_mac_arm_transition_at_wwdc">bunch</a> of <a href="https://9to5mac.com/2020/06/10/apple-arm-mac-transition/">people</a> have written extensively on this rumor, but I have a few thoughts that I haven't seen others concentrate on.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">One thing you see a lot of disagreement about online is how long it'll take for Apple to convert its whole lineup of Macs to use its own processors, or if it even will. I've seen people say that they think they'll announce a single model of ARM Mac, then over the course of 2-3 years, move all of the product line over to ARM. 
I've even seen people predict that they'll keep the "Pro" line of computers on x86 for the foreseeable future, and only convert the portable line to ARM.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">A case can be made for those positions, but here's what I think: </span><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">If Apple announces a transition to ARM at WWDC, it'll happen surprisingly quickly.</span><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;"> I wouldn't be at all surprised if the first ARM Macs ship before the end of 2020, and the whole line is switched over before the end of 2021. That seems like a pretty far out-there prediction, compared to the "consensus" view, so let's take a look at the previous transitions, and how this one is different.</span></p><h2 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">We've been here before, but then again, this is very different</span></h2><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">This will be the third major processor transition for the Macintosh (and the 5th major software transition overall). Originally, the Mac used the Motorola m68k processor family. After 10 years, the m68k family was failing to make regular improvements in performance, and Apple started to look at other options, finally settling on the PowerPC. They moved the Mac products from m68k to PPC over the course of about 18 months. Years later, they transitioned from PowerPC to Intel, over the course of about 12 months. And now, we're apparently on the cusp of another processor transition. How will this one turn out? 
And most importantly: WHY NOW?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Transition 1: The m68k to PowerPC transition</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">"Any sufficiently-advanced technology is indistinguishable from magic"</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">– Arthur C. Clarke</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">This transition was very difficult, both for Apple and for third parties. At the time that Apple announced the change, they were still using what we now call the Classic MacOS. Large parts of the operating system, and the applications that ran on it, were written in Assembly, with an intimate understanding of the hardware they ran on.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Consequently, Apple developed a processor emulator, which would allow existing m68k code to run on the PowerPC without any changes. You could even have an application load a plugin written for the other architecture. The new PPC version of MacOS maintained a shadow copy of all its internal state in a place where 68k applications could see (and modify) it - that was the level of compatibility required to get anything to work. A heroic effort, and it paid off - most software worked out of the box, and performance was "good enough" with emulated code, because the PPC chips were much faster than the m68k chips they were replacing.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">The downside of that sort of transition is that it takes </span><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">many years</span><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;"> to complete. 
There was relatively little pressure on the third parties to update their applications, because they ran just fine on the new models. Even the MacOS itself wasn't completely translated to native code until several years later. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Transition 2: The MacOS X transition</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">“If I need to make that many changes, I might as well drop the Mac, and go to Windows”</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">– Some Mac developer, a Halloween party in 1999</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">A few years after the PPC transition, Apple announced MacOS X, and software developers were faced with another transition. Originally, OS X was intended to be a clean break with Classic MacOS, with an all-new underlying operating system, and a brand new API, Cocoa (based on the OPENSTEP software which came in with the NeXT acquisition).</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Major developers were (understandably) not enthusiastic about the prospect of rewriting the majority of their existing applications. Eventually, Apple caved to the pressure, and provided Carbon, a "modern" API that kept much of the same structure, but removed some of the more egregious aspects of Classic MacOS programming. Apple made it clear that they considered Carbon a transitional technology, and they encouraged developers to use Cocoa. The reaction from the larger developers was pretty much "meh." Quite a few smaller long-time MacOS developers enthusiastically embraced the new APIs though, appreciating the productivity boost they provided.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">A footnote to this chapter of the saga is that the "Developer Preview" versions of Rhapsody, the first Mac OS X releases, actually had support for running the OS on Intel-based PC hardware. That didn't survive the re-alignment which gave us Carbon, and MacOS X 10.0 shipped with support for PowerPC Macs only. 
</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Things were pretty quiet on the Macintosh front for a few years. New versions of OS X came out on a regular schedule, and Apple kept coming out with faster and better PowerBooks, PowerMacs, iBooks, and iMacs. And then, suddenly, the PowerPC processor line had a few unexpected hiccups in the delivery pipeline.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Transition 3: The Intel transition</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">“Wait, you were </span><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; text-decoration-skip: none; text-decoration: underline; vertical-align: baseline; white-space: pre-wrap;">serious</span><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;"> about that?”</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">– Carbon-using developers, overheard at WWDC 2005</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">The PowerPC processors were looking less and less competitive with Intel processors as time went by, which was an embarrassment for Apple, who had famously built the PowerPC advertising around how much faster their chips were than Intel's. The "G5" processor, which was much-hyped to close the gap with Intel, ran years late. It did eventually ship, in a form that required liquid cooling to effectively compete with mainstream Intel desktop PCs. The Mac laptop range particularly suffered, because the low-power laptop chips from Motorola just...never actually appeared.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">And so, Apple announced that they were transitioning to Intel processors at WWDC 2005. I was working in the Xcode labs that year, helping third-party developers to get their code up and running on Intel systems. 
I worked a lot of "extra shifts", but it was amazing to see developers go from utterly freaked out, to mostly reassured by the end of the week.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">For any individual developer, the amount of “pain” involved in the transition was variable. If they’d “kept up with” Apple’s developer tools strategy in the years since the introduction of Mac OS X, no problem! For the smaller indie developers who had embraced Xcode, Cocoa, and all of Apple's other newer framework technology, it actually was a trivial process (with one exception). They came into the lab, clicked a button in Xcode, fixed a bunch of compiler warnings and errors, and walked away with a working application, often in just an hour or so. </span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">For the developers with large Carbon-based applications built using the Metrowerks compiler, it was a real slog. Because of CodeWarrior-specific compiler extensions they'd used, different project structures, etc, etc, it was hard to even get their programs to build in Xcode.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">The exception to the "it just works" result for the up-to-date projects is any kind of external I/O. Code that read or wrote to binary files, or communicated over a network, would often need extensive changes to flip the "endianness" of various memory structures. Endianness is something you generally don’t need to think about as a developer in a high-level language, especially if you're only developing for one platform, which also just happens to use the same endianness as the Internet does. 
<br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Transition 4: The 64-bit transition</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">“You can't say we didn't tell you this was coming..."</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">– Some Apple Developer Tools representative (probably Matthew), at WWDC 2018</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">The first Intel-based Macs used processors that could only run in 32-bit mode. This is what I consider one of Apple's biggest technology mistakes, ever. They should have gone directly to 64-bit Intel from the get-go, though that would have required waiting for the Core 2 Duo processors from Intel, or using AMD chips, or doing the iMac first, and the notebooks last.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Regardless, after the first year, all Macs were built with 64-bit capable processors, and MacOS started supporting 64-bit applications soon after. Technically, the previous versions of Mac OS X supported 64-bit applications on the "G5" processors, but that was only available in the Power Mac G5, and very few applications (other than ports from workstation hardware) bothered to support 64-bit mode.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Unfortunately for the folks hoping to see a glorious 64-bit future, there was again very little incentive for developers to adopt 64-bit code on MacOS. One of the advantages of Intel-based Macs over the PowerPC versions was that you could reuse code libraries that had been written for Windows PCs. But, of course - almost all Windows applications were written for 32-bit mode, so any code you shared between the Windows and Mac versions of your application needed to be 32-bit. You also can't mix-and-match 32-bit and 64-bit code in the same process on MacOS. So most MacOS applications remained 32-bit for years after there were no longer any 32-bit processor Macs being sold. 
Even when OS X 10.7 dropped support for 32-bit processors entirely, most applications stayed 32-bit.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Apple told developers at every WWDC from probably 2006 on that they should really convert to 64-bit. They'd talk about faster performance, lower memory overhead for the system software, and various other supposed advantages. And every year, there just didn’t seem to be any great need to do so, so mostly, all Mac software remained 32-bit. A few new APIs were added to MacOS which only worked in 64-bit applications, which just had the unfortunate effect of those features never seeing wide adoption.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Eventually, Apple's tactics on this issue evolved from promises to threats and shaming. Developers were told at WWDC that 32-bit applications would not be supported "without compromises" in High Sierra. Then, when High Sierra shipped, we found that Apple had added a warning message that 32-bit applications were "not optimized" for the new OS. That got the end users to start asking developers about when they were going to “optimize” for the new operating system. For the better part of a year, many developers scrambled to get their apps converted before MacOS Mojave shipped, because they made the reasonable assumption that the warning message was implying that Mojave wouldn’t support 32-bit applications. But then Mojave shipped, and 32-bit apps ran the same as they ever had, with the same warning that was displayed in High Sierra. And then, in MacOS Catalina, they finally stopped allowing 32-bit software to run at all.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Converting an existing 32-bit Cocoa application to 64-bit is not particularly difficult, but it is...tedious. You end up having to make lots of small changes all over your code. In one project that I helped convert, there were changes needed in hundreds of source code files. We got there, but nobody thought it was fun, and it seemed so pointless. Why inflict this pain on users and developers, for what seemed like no gain?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">You couldn't say that we weren’t warned that this was coming, since Apple had been telling developers for literally a decade to convert to 64-bit. But third-party developers were still pretty confused about the timing. Why "suddenly" deprecate 32-bit apps for Catalina? Just to incrementally reduce the amount of maintenance work they needed to do on MacOS? Or to reduce the amount of system overhead by a handful of megabytes on an 8GB Mac? It didn’t make sense. And why did they strongly imply it was coming in Mojave, then suddenly give us a reprieve to Catalina?</span></p>
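<br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">To give a sense of what all those tedious little changes looked like, here's a hypothetical (and very simplified) example of the classic int-versus-long hazard - not code from any particular project:</span></p><pre>#include &lt;stdio.h&gt;

int main(void) {
    /* When int and long were both 32 bits, mixing them was harmless.
       In a 64-bit process, long is 64 bits, and this cast silently
       truncates (here, to 705032704). */
    long itemCount = 5000000000L;
    int count = (int)itemCount;

    /* Format strings needed auditing, too: %d expects an int, so
       long values have to be printed with %ld instead. */
    printf("%ld vs. %d\n", itemCount, count);
    return 0;
}</pre>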
<br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 700; vertical-align: baseline; white-space: pre-wrap;">Transition 5: The ARM Mac</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">“The good news is, you've already done the hard part”</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">– Apple, WWDC 2020</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">With all of this in mind, I think that the sudden hard push for 64-bit in High Sierra and beyond was a stealth effort to get the MacOS third-party developers ready for the coming ARM transition. When High Sierra shipped, almost all MacOS software was 32-bit. Now that Catalina is out, almost all the major applications have already transitioned to 64-bit. Perhaps the reason the “deadline” was moved from Mojave to Catalina was that not enough of the “top ten” applications had been converted, yet?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Prior to finally getting all third-party developers to adopt 64-bit, the transition story for converting to ARM would have been complicated, because the applications were all 32-bit, and the Mac ARM chips would be 64-bit (the iOS platform having had its 64-bit conversion a few years back). Apple would have been telling developers: "First, you need to convert to 64-bit. Then, you can make any changes needed to get your code running on ARM".</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Now, it's going to be very simple: "If your application currently builds for Catalina, with the current SDK, you can simply flip a switch in Xcode, and it'll be able to run on the new ARM Macs, as well". That's not going to be literally true for many applications, for various reasons (binary dependencies on some other third-party SDK, Intel-specific intrinsics in performance-critical code, etc).</span></p>
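<br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">For that last category, the usual fix is conditional compilation: keep the Intel intrinsics for x86_64 builds, and add an ARM-native (or plain C) path alongside them. A sketch of the pattern, with an invented sum4 function standing in for real vectorized code:</span></p><pre>#if defined(__x86_64__)
#include &lt;immintrin.h&gt;   /* SSE intrinsics (Intel) */
#elif defined(__arm64__)
#include &lt;arm_neon.h&gt;    /* NEON intrinsics (Apple Silicon) */
#endif

/* Invented example: add four floats with whichever vector unit exists. */
static float sum4(const float v[4]) {
#if defined(__x86_64__)
    __m128 x = _mm_loadu_ps(v);   /* load 4 floats (SSE) */
    x = _mm_hadd_ps(x, x);        /* pairwise horizontal adds (SSE3) */
    x = _mm_hadd_ps(x, x);
    return _mm_cvtss_f32(x);
#elif defined(__arm64__)
    float32x4_t x = vld1q_f32(v); /* load 4 floats (NEON) */
    return vaddvq_f32(x);         /* add across all four lanes */
#else
    return v[0] + v[1] + v[2] + v[3];
#endif
}</pre>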
<br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">But this time, there is no endianness issue, and no apps will need to change their toolchain; none of the previous issues will be relevant. I also think it's quite likely that there will be very few arbitrary API deprecations in MacOS 10.16, specifically to make this transition as painless as possible for as many developers as possible.</span></p><br /><h2 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">What’s it all mean, then?</span></h2><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;"><i>"All of these products are available TODAY"</i></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;"><i>– Steve Jobs's famous tagline, possibly heard again this year?</i></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;"><br /></span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">So then - on to timing. For the Intel transition, there were about 6 months between announcement and availability of the first Intel Mac, and another 8 months before the transition was complete. This time, it's likely that Apple will shrink the time between announcement and availability, because there's comparatively little work that needs to be done to get most applications up and running on the new hardware.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">It's </span><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">possible</span><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;"> that they'll announce ARM Macs, shipping that very day. 
If Adobe and Microsoft are already ready to go on day one, it might even be </span><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">plausible</span><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">. I think they'll want to give developers </span><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">some</span><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;"> time to get ready, though. So, I predict 3 months, meaning ARM Macs in September, 2020. And I think they'll move aggressively to put as many Macs on their own processors as they can, because it's all a win for them - lower costs, better battery life, etc, etc.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">"But what about the Mac Pro?", you'll hear from some experts. "Nobody's ever produced a Xeon-equivalent performance ARM chip. It'll take years to make such a design, if it's even possible at all".</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">The obvious comeback here is: </span><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Nobody</span><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;"> knows what Apple has running in a lab somewhere, except the people who are working on the project. Maybe they already have an 80 Watt ARM powerhouse chip running in a Mac Pro chassis, right now. But even if they don't, I think it's reasonable to look at this from the "Why Now?" perspective, again.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">The previous processor transitions were mainly driven by a need to stay competitive in performance with Intel. That is not the case this time, since the desktop/laptop competition is almost exclusively on the same Intel processors that Apple is using. The success, or rather the lack thereof, of other ARM-architecture computers & operating systems (Windows for ARM, ChromeBooks) doesn't make a compelling case for a switch to “keep up” with Microsoft or Google, either. 
So there's no hurry.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Given that there's no external pressure to switch, Apple must think that they have a compelling story for why they're switching. And that has to include the entire product line, since the Mac isn't their most-important product, and they surely aren't going to support two different architectures on it, just to keep existing Mac Pro users happy. They either have a prototype of this Mac Pro-class processor ready to go, or they're very sure that they can produce it, and they have a believable roadmap to deliver that product. Otherwise, they’d just wait until they did.</span></p><h2 dir="ltr" style="line-height: 1.38; margin-bottom: 6pt; margin-top: 18pt;"><span style="font-family: Arial; font-size: 16pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; font-weight: 400; vertical-align: baseline; white-space: pre-wrap;">Which Macs are switching first?</span></h2><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">“</span><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">I bet you didn’t see that coming, skeptics!</span><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">”</span></p><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">– Me, maybe?</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">Everybody is expecting to see a new MacBook, possibly bringing back the 12-inch, fanless form factor, and taking maximum advantage of the power-saving and cooler operation of Apple’s own chips. 
Some folks are expecting a couple of different models of laptops.</span></p><br /><p dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;"><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">What I would really love to see (but don’t much expect) is for Tim Cook to walk out on stage, announce the first ARM-based Mac, and have it </span><span style="font-family: Arial; font-size: 11pt; font-style: italic; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;">not</span><span style="font-family: Arial; font-size: 11pt; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; vertical-align: baseline; white-space: pre-wrap;"> be a super-small, low-power consumer laptop product. I want it to be something high end that decisively outperforms the current Mac Pro, and establishes that this is going to be a transition of the whole line. I think that'd be a bold statement, if they could swing it.</span></p><br /></span></div>Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0Santa Barbara, CA, USA34.4208305 -119.69819016.1105966638211555 -154.8544401 62.731064336178846 -84.5419401tag:blogger.com,1999:blog-14034177.post-21848683107699886602018-04-21T23:35:00.000-07:002018-04-21T23:35:38.269-07:00Movie Mini-Review: Rampage<span style="font-family: inherit;">Rampage was...not good. (Warning - "spoilers" below, though honestly, there's not much to spoil)</span><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">I see you folks in the back, snickering and saying "well, what did you expect?". Here's the thing though - Rampage was probably the most-disappointing movie I've seen in the last year. </span>It’s an onion of layered badness. Even The Rock couldn’t save this one. What a joyless, boring, poorly-made movie.<br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">Whenever any movie is adapted from some other </span>medium, whether it's a book, a play, or a video game, some amount of changes are inevitable. You often have to trim the plot or characters of a novel in order to "fit" it into a movie, for example. And the writers and director will want to make their own changes, to adapt the story to the medium, or just to put their own twist on a well-known property.<br />
<br />
But I've never seen a film adapt source material whose full plot could be written on a 3x5 note card, and use NONE of it. In theory, Rampage is an adaptation of the classic video game of the same name, which was released in 1986. The player takes control of a giant monster - an ape, a werewolf, or a lizard - and climbs onto buildings, smashes them, and eats people. In those days, video games really didn't have "plots", as such. We didn't have the memory for that :-) The entirety of the plot exposition takes place as text that scrolls on the screen at the start of the game, and you're shown the origin story of whichever of the monsters you've chosen to control. In each case, it's a human being who's mutated into a giant monster by exposure to some dangerous chemical.<br />
<br />
The monsters then go on a 128-city tour of North America, starting and ending in Illinois, leaving destruction in their wake. In every city, the layout of the buildings is different, and there are additional features in some cities, such as the trains in Chicago, and the bridge over the river in Detroit. As you're destroying the buildings, soldiers will shoot you from the windows, and some will inexplicably throw dynamite. Smashing the windows of the buildings will occasionally reveal surprises, as well - from a roasted chicken which you can eat for extra health, to electrical devices that can shock you and make you lose your grip. The whole thing is gloriously silly, in the way of Saturday afternoon Creature Features, with guys in rubber suits beating each other up in model cities made of balsa wood and paper.<br />
<span style="font-family: inherit;"><br /></span>
Essentially<i> none of that is in the movie</i>. There <i>is</i> a giant ape, who's named George. But he's not a human who turned into an ape - he's a gorilla affected by a DNA-editing toxin, which causes him to grow very rapidly. His handler, played by The Rock, has to search for an antidote to cure him. This is obviously part of the process of making room for the human characters to actually be the stars, and it hugely alters the tone of the thing, as well. Instead of a fun movie about giant monsters smashing stuff, we get a much more-typical blockbuster hero movie, where the muscular hero dude and his female sidekick have to race against time to save the world (or at least Chicago) from destruction.<br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">Rampage is a game about monsters punching buildings and eating people...but an hour into the movie, there were no buildings wrecked (well, one partially wrecked), and almost nobody got eaten. The film did try to inject some humor into things, but a lot of the funny</span><span style="font-family: inherit;"> bits fell pretty flat, because they didn't really fit the "grim and gritty reboot" that the rest of the movie was trying to be.</span><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">And then there's the gore, which I found r</span><span style="font-family: inherit;">eally off putting. </span><span style="font-family: inherit;">PG-13 is apparently The Uncanny Valley for gore. In little kids movies, there's no gore. In R-rated movies, you can have either realistic gore or ridiculous over-the top gore, take your pick. PG-13, you can get enough blood to be disturbing, but not enough to be funny.</span><br />
<br />
<span style="font-family: inherit;">On my way in, I saw a couple and their young (maybe 8 or 9 year-old) daughter settling in to </span>watch<span style="font-family: inherit;"> the movie. I know that all kids are different, and maybe her parents weren't really thinking about the "13" in the PG-13 rating, but this is a movie that starts out with a fairly intense chase scene in a space station filled with blood, severed heads, and detached limbs. Probably not what they thought they were getting, based on <a href="https://www.youtube.com/watch?v=coOKvrsmQiI" target="_blank">the trailer</a>, and the fact that The Rock was the headline star. Unsurprisingly, the little girl was pretty upset after being subjected to that. I didn't see them when I left the theater, but I'm guessing they </span>didn't<span style="font-family: inherit;"> see the whole thing.</span><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">Interestingly, some folks have praised Rampage as "</span><a href="http://www.vulture.com/2018/04/rampage-the-most-faithful-video-game-adaptation-ever-made.html" style="font-family: inherit;" target="_blank">The most faithful video game adaptation</a><span style="font-family: inherit;">" or gone into great detail on the </span><a href="https://www.youtube.com/watch?v=hl9f1LJeryc" style="font-family: inherit;" target="_blank">various nods to the source material</a><span style="font-family: inherit;">. I guess it all </span>depends<span style="font-family: inherit;"> on what you're looking for. For me, adapting something to film, and losing the "soul" of the thing along the way is just sad. </span>Someone<span style="font-family: inherit;"> could have made a really fun </span>Rampage<span style="font-family: inherit;"> movie, but this definitely wasn't it.</span><br />
<span style="font-family: inherit;"><br /></span>
<br />Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-79095560829360903302017-11-20T09:00:00.000-08:002017-11-20T11:57:25.295-08:00Mergers: The Good(intro blog post <a href="https://codemines.blogspot.com/2017/11/mergers-good-bad-and-ugly.html" target="_blank">here</a>)<br />
<h2>
<b>How to help your acquired employees succeed</b></h2>
<br />
Out of the 6 acquisitions I've been involved with, two really stand out as positive experiences, both for the acquired and the parent company. Here's what was different about those two mergers, as opposed to the ones that didn't go so well.<br />
<br />
<h2>
<b>Integrate the new team quickly (Apple/NeXT)</b></h2>
In Apple's case, they acquired NeXT both in order to get new technology to base their next-generation OS on, and to get a fully-functional engineering organization. You can't really understand just how screwed up Apple was in 1996 unless you were there, but in the last quarter of the year, Apple lost over a billion dollars. They had, at that point, had 2 or 3 (depending on how you count) different "next generation" OS projects crash and burn, and the latest one, Copland, was on the verge of disaster – I've seen credible evidence that it wouldn't have shipped for another 5 years, if ever. Into all this swirling chaos, we bring the NeXT team, understandably freaked out to be called on to "save" this huge, dysfunctional company from itself.<br />
<br />
But one thing that was hugely encouraging, and helped us to all settle in, was how quickly we were integrated into the Apple organization as a whole. Within a month after the acquisition, we were meeting with our counterparts in Cupertino, we had apple.com email addresses, our systems were on the Apple network, and we'd had an army of Apple HR folks over to the NeXT offices to get us transferred over to Apple's payroll and benefits.<br />
<br />
It was still a very hard slog, and there was a LOT of anger from folks at Apple who had friends laid off right after the acquisition, but feeling like we were legitimately part of the team, and not just a bunch of outsiders, helped us to fight the battles we had to fight.<br />
<br />
<h2>
<b>Put the full support of the larger organization behind the newcomers (LG/WebOS)</b></h2>
After the debacle that was HP's acquisition of Palm (see the "Ugly" segment, coming soon), the folks remaining on the WebOS team were pretty nervous when we were told that we were being sold off to LG. "Oh, great, another absentee owner who will tell us we're important, but then never do anything".<br />
<br />
And then we had our first meetings with LG's upper management. And we were told that we would be building the user interface for all of LG's high-end smart TVs, that we were going to ship in less than a year, and that we were expected to deliver something BETTER than the existing NetCast software, which they had been shipping for a few years. "Oh, crap," I thought, "none of us know anything about smart TVs, or TVs in general." But then they told us: "The CEO has expressed his full support of this project, and you'll have as much support as you need".<br />
<br />
I really didn't believe that we were going to get "as much support as you need", but sure enough, shortly after the acquisition, truckloads of current-generation TVs and prototype logic boards for the next generation started flooding into the office. And in the months after that came truckloads of engineers from Korea, who knew the hardware and the existing NetCast software intimately. Anything we asked for, we got – score one for top-down, authoritarian management style, I guess.<br />
<br />
And we did it - a small group of developers, working their asses off, managed to build something in less than a year which was immensely better than the existing product, which had been shipping for several years. The next-generation smart TVs, with a new version of WebOS, were even better. This was definitely a high point for the "acquire a smaller company to bring innovation to the larger company" strategy. And it succeeded because the project had a powerful advocate within the larger company, and a VERY clear vision of what they wanted to accomplish.<br />
<br />
<h2>
<b>Next week</b></h2>
What <i>not to do</i> to make your new employees feel welcome, and how to tell (as an employee) when things are likely to go sour quickly.Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-17186198308151227292017-11-13T09:00:00.000-08:002017-11-20T11:56:24.560-08:00Mergers: The Good, The Bad, and The Ugly<h2>
<span style="font-family: inherit;"><b>You've been acquired <i>how many times</i>?</b></span></h2>
<span style="font-family: inherit;">In my career, I've been fortunate enough to have worked for a number of small software/hardware companies, several of which were absorbed by much larger companies. I though tit'd be interesting to compare and contrast some of the ways the various mergers went good and bad, and what acquiring companies might be able to learn from my experience.</span><br />
<span style="font-family: inherit;"><br />Here's the timeline so far:</span><br />
<ol>
<li><span style="font-family: inherit;">I started working for NeXT Software in 1994, they were acquired by Apple in 1996.</span></li>
<li><span style="font-family: inherit;">I left Apple in 1999 to work for Silicon Spice. They were acquired by Broadcom in 2000.</span></li>
<li><span style="font-family: inherit;">Broadcom laid me off, and I went back to Apple for a while.</span></li>
<li><span style="font-family: inherit;">I left Apple in 2005 to work at Zing Systems, which was acquired by Dell in 2007.</span></li>
<li><span style="font-family: inherit;">I left Dell to go work at Palm in 2009. In 2010, Palm was acquired by Hewlett-Packard.</span></li>
<li><span style="font-family: inherit;">Hewlett-Packard eventually sold the entire WebOS group to LG.</span></li>
<li><span style="font-family: inherit;">I left LG to go work for Citrix on GoToMeeting. After 2 1/2 years, the GoToMeeting business was spun off and merged with LogMeIn, Inc.</span></li>
</ol>
<div>
<span style="font-family: inherit;">So I've been part of 6 different merger/acquisition processes at this point, and I'm getting a sense of how you can tell when an acquisition is going to go well, as opposed to going poorly.</span></div>
<div>
<span style="font-family: inherit;"><br /></span></div>
<div>
<h2>
<b>Why do big companies buy smaller companies?</b></h2>
</div>
<div>
When a big company acquires a smaller company, it can be for a variety of reasons. Sometimes it's to acquire a potential competitor, before they can get large enough to threaten the larger company. It can be an "acqui-hire", where they buy the smaller company strictly for its human resources, and have no real interest in the technology or products the smaller company has developed (this happens with social media companies frequently, because skilled developers are hard to find). Or, it can be a case of acquiring a new technology, and a team of experts in that technology, in order to either kick-start a new product, or to kick new life into an existing product. That last reason was the primary reason for all of the acquisitions I've been involved in.</div>
<div>
<br /></div>
<div>
<h2>
<b>What's the most-common mistake acquiring companies make?</b></h2>
</div>
<div>
Understandably, big companies often look to smaller companies as an engine to drive innovation. There's a perception that small companies can move faster and be more nimble than larger companies. So there's often a desire to let the new acquisition run itself, as a sort of independent entity inside the larger company. Being hands-off seems like the obviously-right thing to do if you wanted increased agility to start with, but this is generally not as good an idea as it seems at first blush.</div>
<div>
<br /></div>
<div>
Yes, you absolutely don't want to break up the functional team you just acquired, and spread them out willy-nilly throughout your company. You don't want to drag them into the bureaucracy and infighting that has marred all of your internal attempts at innovation. But guess what? If you don't make an effort to get them at least nominally integrated with the rest of the company, you will, at best, end up with an isolated group, who continue to do their one thing, but don't meaningfully contribute to your larger organization's bottom line. And the smaller group will also get none of the benefits of scale of being part of the larger group. It's lose-lose.</div>
<div>
<br /></div>
<div>
<h2>
<b>Examples of the Good, the Bad, and the Ugly</b></h2>
</div>
<div>
Tune in next Monday (and the Monday after that) for real-life tales of integrations gone well, gone poorly, and gone horribly wrong.</div>
<div>
<br /></div>
Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-61609986722373416652017-11-06T09:00:00.000-08:002017-11-06T09:38:52.095-08:00That delicate line between security and convenienceA key problem, maybe <i>the key problem</i>, in software security is how to properly balance user convenience with security. Adding additional security to a system often means extra work, more time, or other compromises for the end-user. And reasonable people can disagree about where the line is for the appropriate trade-off.<br />
<div>
<br />
<b>That iPhone camera permissions "flaw"</b></div>
<div>
There was a brief flurry of articles in the news recently, talking about a "flaw" in iOS permissions which would allow applications to take your picture without you being aware. Typically, these were presented with click-bait headlines like:</div>
<div>
<br /></div>
<div>
<a href="https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=newssearch&cd=3&cad=rja&uact=8&ved=0ahUKEwji7uaxu6XXAhWELmMKHdZECo8Qu4gBCDUoAjAC&url=http%3A%2F%2Fwww.trustedreviews.com%2Fnews%2Fiphone-apps-secret-camera-access-3317475&usg=AOvVaw11cHeJs6Br7I4zGU8WoQFq">Google developer shows how iPhone apps could secretly record you</a></div>
<div>
<br /></div>
<div>
or </div>
<div>
<br /></div>
<div>
<a href="https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=newssearch&cd=1&cad=rja&uact=8&ved=0ahUKEwil3-Heu6XXAhVX-GMKHfsZDOkQu4gBCCkoAjAA&url=https%3A%2F%2Fgizmodo.com%2Fdeveloper-shows-how-iphone-apps-can-theoretically-spy-o-1819874714&usg=AOvVaw08QhoH8UQ0z2yik7ZvmgsY">Developer Shows How iPhone Apps Can Theoretically Spy on You ...</a></div>
<div>
<br /></div>
<div>
The blog post of the actual security researcher who raised this issue (Felix Krause) is substantially less-sensational:</div>
<div>
<br /></div>
<div>
<a href="https://krausefx.com/blog/ios-privacy-watchuser-access-both-iphone-cameras-any-time-your-app-is-running" target="_blank">Access both iPhone cameras any time your app is running</a></div>
<div>
<br /></div>
<div>
It's good that this issue is getting some attention, but it's important to understand where we're coming from, what the actual issue is, and possible ways to mitigate it. As a quick aside, I find it annoying that the articles say "Google engineer". Yes, Krause works for Google, but this work is not coming out of his "day job", but rather his own work in security research. Also, Android has exactly this same problem, but it doesn't merit a blog post or worldwide news coverage, because apparently nobody expects even minimal privacy from Android devices.</div>
<div>
<br /></div>
<div>
<b>How camera permissions work on iOS today</b></div>
<div>
The current version of iOS asks the user for permission to use the camera the first time that an application tries to access it. After that, if the application is running in the foreground, it can access the camera whenever it wants to, without any additional interaction. And typically, this is actually what the user wants.</div>
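<div>
<br /></div>
<div>
For the more technically-inclined, all of the gatekeeping happens up front. Here's a minimal sketch of the current model in Swift - the AVFoundation calls are the real API, but <i>startCaptureSession</i> is just a hypothetical stand-in for your own capture code:</div>
<div>
<br /></div>
<pre>import AVFoundation

func startCaptureSession() {
    // ... your AVCaptureSession setup would go here ...
}

// Ask once; afterwards, a foregrounded app can capture freely, with no further prompt.
switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized:
    startCaptureSession() // already approved - no prompt is ever shown again
case .notDetermined:
    AVCaptureDevice.requestAccess(for: .video) { granted in
        if granted { startCaptureSession() }
    }
default:
    break // denied or restricted
}
</pre>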
<div>
<br /></div>
<div>
It's convenient and fun to be able to use the built-in camera support in Facebook without having to answer "yes I do want to use the camera" each time that you choose to share a photo on social media. And replacements for the built-in camera app, like Instagram, Snapchat, and Halide, would be pretty much unusable if you had to answer a prompt Every. Single. Time. you wanted to take a photo.</div>
<div>
<br /></div>
<div>
<b>How it used to work</b></div>
<div>
Previous versions of iOS actually <a href="https://developer.apple.com/documentation/uikit/uiimagepickercontroller" target="_blank">required applications to use the built-in camera interface</a> to take pictures. You still only had to give permission once, but <i>it was really obvious</i> when the app was taking your picture, because the camera preview was right there in your face, taking over your screen. This design was widely criticized by app developers, because it made for a really jarring break in their carefully-crafted user experience to have the built-in camera appear, and they couldn't provide a preview that actually showed what was going to be captured (with the rise of photo filters, this is especially problematic).</div>
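<div>
<br /></div>
<div>
The old flow looked something like this (a sketch using the still-available UIImagePickerController API):</div>
<div>
<br /></div>
<pre>import UIKit

// The old, obvious model: the system camera UI takes over the whole screen,
// so there's no question about when a picture is being taken.
class PhotoViewController: UIViewController,
        UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    func takePhoto() {
        let picker = UIImagePickerController()
        picker.sourceType = .camera // full-screen system camera interface
        picker.delegate = self
        present(picker, animated: true)
    }
}
</pre>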
<div>
<br /></div>
<div>
At some point, Apple added the capability to capture photos and video, while presenting the app's own interface. This makes for a more-cohesive experience for the end-user, and makes it possible for apps to preview what they're actually going to produce, filters, silly hats, and all. This is clearly a win for the app developers, and I'd argue it is also a win for the end-user, as they get a better experience with the whole picture taking process.</div>
<div>
<br /></div>
<div>
<b>What's the actual privacy issue here?</b></div>
<div>
I use Facebook to post photos and videos, sometimes. But I don't really want Facebook taking pictures of my face when I'm not using the camera features, and analyzing that data to better serve me content, including advertisements.</div>
<div>
<br /></div>
<div>
If I'm scrolling through my news feed, and Facebook is silently analyzing the images coming through the back camera, so that they can discover my location and serve me ads for whatever business I'm standing in front of, that's intrusive and creepy. If they're reading my facial expression to try to determine how I feel about the items in my news feed, that's even worse.</div>
<div>
<br /></div>
<div>
<b>How Apple can better-inform users</b></div>
<div>
I don't think anybody wants to go back to using the UIImagePicker interface, and I don't think anybody (except possibly security researchers) wants to have to affirmatively give permission every time an application wants to take a picture or video. One alternative that I like (and Krause mentions this in his initial blog) is some kind of persistent system UI element that indicates that the camera is on. Apple already does something similar with a persistent banner on the top of the screen when applications in the background are capturing audio (for VoIP communications). A little dot on the status area would go a long way, here.</div>
<div>
<br /></div>
<div>
It'd also be really nice to have a toggle in Preferences (or better, in Control Center) to disable the camera system-wide, so if you know you're heading somewhere that you shouldn't be taking pictures, you can temporarily disable the camera.</div>
<div>
<br /></div>
<div>
<b>What users can do to better protect themselves</b></div>
<div>
Obviously, just don't grant camera permission to applications that don't actually need it. I think most social network software falls into this category. Twitter and Facebook don't actually need to access my camera, so I have it disabled for both of them. If you actually DO use Facebook and Twitter to take pictures, then I guess you'll just need to be more aware of the tradeoffs.</div>
<div>
<br /></div>
<div>
If you "have to" enable camera access to certain apps, but you don't fully-trust them, there are honest-to-goodnes lens caps you can buy which will cover your iPhone camera when you're not using it. Or a piece of tape works. There's even specially-made tape tabs for just this purpose.</div>
<div>
<br /></div>
<div>
<br /></div>
Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-21107653296769224322017-10-17T09:00:00.000-07:002017-10-17T09:00:27.066-07:00"Responsible Encryption" - what does that mean?This weekend I read <a href="https://alexgaynor.net/2017/oct/13/rosenstein-encryption-response/" target="_blank">this excellent article</a> by Alex Gaynor responding to Deputy Attorney General Rod Rosenstein's remarks on encryption to two different audiences last week. Please do go and read it when you get a chance, as it delves into the sadly common tactic of pointing to a bunch of scary criminal incidents, then saying "unbreakable encryption enables criminals and terrorists", without presenting any evidence that those crimes were enabled by encryption technology, or that law enforcement officers were actually hampered in their investigations by encryption.<br />
<br />
In fact, in the case of the FBI, Apple, and the San Bernardino shooter, AG Rosenstein repeats all of the same false narrative that we've <a href="http://codemines.blogspot.com/2016/02/apple-vs-fbi.html" target="_blank">been presented with before</a> - that the shooter's phone possibly contained vital information, that Apple "could" decrypt the information, and that they fought the FBI's legal attempts to force them to do so. Read my previous blog post (linked above) for background on that line of argument, and how the FBI willfully twists the facts of the case, to try to get something <i>much</i> more far-reaching than what they claim to want.<br />
<br />
One thing not addressed directly in Alex's article is the frustration that the FBI and other law enforcement officials have expressed over the inability to execute a legal search warrant, when they're faced with a locked phone, or a communications system that provides end-to-end encryption.<br />
<br />
From Rosenstein's remarks to the Global Security Conference<br />
<blockquote class="tr_bq">
We use the term “responsible encryption” to describe platforms that allow police to access data when a judge determines that compelling law enforcement concerns outweigh the privacy interests of a particular user. In contrast, warrant-proof encryption places zero value on law enforcement. Evidence remains unavailable to the police, no matter how great the harm to victims.</blockquote>
First, what a bunch of emotionally-charged words. And again we see the disconnect between what the FBI and other agencies <i>say</i> that they want (a way to unlock individual phones), and what they seem to keep asking for (a key to unlock any phone they can get their hands on).<br />
<br />
But the man does have a point - there is some value to society in the FBI being able to execute a valid search warrant against someone's phone, or to "tap" the communications between known criminals. And I think he's also right that that sort of access is not going to be provided if the free market is allowed to set the rules. It'll never be in Apple's or any individual customer's interest to make it easier to access a locked phone. So, it'll come down to a matter of legislation, and I think it's worth the tech folks having this conversation before Congress sits down with a bill authored by the FBI and the NSA to try to force something on us.<br />
<br />
The encryption-in-flight question is very complicated (and crypto protocols are hard to get right - see the recent KRACK security vulnerabilities), so I'll leave that for a future post. I do believe that there are reasonable ways for tech companies to design data-at-rest encryption that is accessible via a court order, but maintains reasonably-good security for customers. Here's a sketch of how one such idea might be implemented:<br />
<br />
<h3>
On-device Key Escrow</h3>
<b><br /></b>
<b>Key escrow </b><br />
The basic idea of key escrow is that there can be two keys for a particular piece of encrypted data - one key that the user keeps, and one that is kept "in escrow" so another authorized agent can access the data, if necessary. The ill-fated <a href="https://en.wikipedia.org/wiki/Clipper_chip" target="_blank">Clipper Chip</a> was an example of such a system. The fatal flaw of Clipper (well, <i>one of them</i>) is that it envisioned every single protected device would have its secondary key held securely by the government to be used in case of a search warrant being issued. If Clipper had ever seen broad adoption, the value of that centralized key store would have been <i>enormous</i>, both economically and militarily. We're talking a significant fraction of the US GDP, probably trillions of dollars. That would have made it the #1 target of thieves and spies across the world.<br />
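<br />
To make the two-key idea concrete, here's a minimal sketch using Apple's CryptoKit. The key names and overall design here are my own assumptions, not any shipping system: one data key encrypts the actual content, and two independently-wrapped copies of that key mean either the user or the escrow agent can recover it.<br />
<br />
<pre>import CryptoKit
import Foundation

let dataKey = SymmetricKey(size: .bits256)   // encrypts the actual data
let userKEK = SymmetricKey(size: .bits256)   // stands in for a key derived from the user's passcode
let escrowKEK = SymmetricKey(size: .bits256) // stands in for the key held in escrow

do {
    let sealed = try ChaChaPoly.seal(Data("my secrets".utf8), using: dataKey)

    // Two independently-wrapped copies of the same data key:
    let userWrapped = try AES.KeyWrap.wrap(dataKey, using: userKEK)
    let escrowWrapped = try AES.KeyWrap.wrap(dataKey, using: escrowKEK)

    // Either wrapping key recovers the data key, and thus the data:
    let recovered = try AES.KeyWrap.unwrap(escrowWrapped, using: escrowKEK)
    let plaintext = try ChaChaPoly.open(sealed, using: recovered)
    print(String(data: plaintext, encoding: .utf8)!)
    _ = userWrapped // the user's copy would normally live on-device, too
} catch {
    print("crypto error: \(error)")
}
</pre>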
<br />
<b>Eliminating central key storage</b><br />
But the FBI really doesn't need the ability to decrypt <i>every</i> phone out there. They need the ability to decrypt <i>specific</i> phones, in response to a valid search warrant. So, how about storing the second key on the device itself? Every current on-device encryption solution that I know of provides for the option of multiple keys. And in fact, briefly getting back to the San Bernardino shooter's phone, if the owners of that phone (San Bernardino County) had had a competent IT department, they would have set up a second key that they could then have handed over to the FBI, neatly avoiding that whole mess with suing Apple.<br />
<br />
You could imagine Apple generating a separate "law enforcement" key for every phone, and storing that somewhere, but that has all the same problems as the Clipper central key repository, just on a slightly smaller scale. So those keys need to be stored separately. How about storing them on the device itself?<br />
<br />
<b>Use secure storage</b><br />
Not every phone has a "secure enclave" processor like the iPhone, but it's a feature that you'll increasingly see on newer phones, as Apple and other manufacturers try to compete on the basis of providing better privacy protection to their customers. The important feature of these processors is that they don't allow software running on the phone to extract the stored keys. This is what keeps the user's data secure from hackers. So, if the key is stored in there, but the phone software can't get it out, how will the FBI get the key?<br />
<br />
<b>Require physical access</b><br />
My preferred solution would be for the secure enclave to have a physically-disconnected set of pins that can be used just for extracting the second key. In order to extract the key, you'd need to have physical access to the device, disassemble it, and solder some wires on it. This is, I think, sufficiently annoying that nobody would try to do it without getting a warrant first.<br />
<br />
It also means that nobody can search your phone without taking it out of your possession for a good long while. This seems like a reasonable trade-off to me. If someone executes a search warrant on your house, you'll certainly know about it. There's such a thing as "sneak and peek" warrants, or delayed-notice warrants, where police sneak in and search your home while you're not there, but I'm not particularly interested in solving that problem for them.<br />
<br />
<b>Conclusion</b><br />
Is this a perfect solution? Of course not. But I think something like this is a reasonable place to <i>start</i> when discussing law enforcement access to personal electronics. And I think we want to have this conversation sooner, rather than later. What do you think?Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-59639581859893097462017-10-02T09:00:00.000-07:002017-10-02T09:00:13.344-07:00The "Just Smart Enough" House<h2>
Less Architectural Digest, more "This is our home"</h2>
We've been doing some remodeling on our house, and the overarching theme of the renovations has been "make this house convenient for real humans to live in". When we bought the house, it was "perfect" in one sense - the house is broken up into two sections, with a central courtyard between, and we were looking for a place where my Father-in-law could come live with us, and still have some space to himself and some privacy.<br />
<div>
<br /></div>
<div>
In many other respects, it was a wildly-impractical house. There's a sad story there, of a couple who fall into and out of love during a remodel, of a mother who overruled the architect in a few critical ways, of a home that was left unfinished when the couple living there split up, and of a house split (illegally) into two units to try to keep it, by supplementing income via renting out the back. </div>
<div>
<br /></div>
<div>
The end result was a house that certainly looks "fancy", in that it's got a Great Room with a wall entirely filled up by windows and sliding doors, a big fireplace faced in Travertine, and a ridiculous number of doors to the outside, for that "indoor/outdoor living" feeling. Seriously, there are <i>11 doors to the outside</i>, not including the garage door. Other than being slightly unfinished, it could totally have been a house featured in Architectural Digest.</div>
<div>
<br /></div>
<div>
But when you're living there, you start to notice some of the compromises. I don't think I've ever lived in a house that didn't have a coat closet before. Or a broom closet. Or a linen closet. Hence the remodel, the first part of which was just turning the illegal 2nd unit into a more-reasonable bedroom suite for Bob, and <i>adding some damn storage</i>.</div>
<div>
<br /></div>
<div>
We added a bunch more storage into the Great Room, and that meant adding new electrical circuits for new under-cabinet and in-cabinet lighting. And because I'm a total nerd, that meant researching Smart Switches to control all of the new lighting (and ideally move some of the more-inconvenient switches to a better location).</div>
<div>
<br /></div>
<h2>
Who do you trust?</h2>
<div>
I pretty quickly settled on getting my smart switches from an electrical equipment manufacturer, rather than some startup "home automation" company. I <i>really, really don't want my house to burn down</i>, and while I have no reason to think that the quality of the zillions of Wi-Fi enabled switches on Amazon.com is anything but excellent, I felt more-comfortable going with a company that has a hundred years or so of experience with not burning people's houses down.</div>
<div>
<br /></div>
<h2>
Lutron vs Leviton</h2>
<div>
<div>
<h4>
(that really sounds like a super-hero movie, doesn't it?)</h4>
</div>
<div>
Lutron and Leviton are two of the largest electrical fixture manufacturers, and choosing between one or the other when buying a regular switch or receptacle is mostly just a matter of which brand your local hardware store carries, and whether or not you want to save $0.25 by buying the store brand.</div>
<div>
<br /></div>
<div>
In the "Home Automation" arena, they each have <a href="http://www.lutron.com/en-US/Residential-Commercial-Solutions/Pages/Residential-Solutions/WholeHomeSolutions.aspx" target="_blank">a variety of solutions</a>, ranging from giant panel-based systems that you're expected to put in a closet somewhere and have installed by a "trained integrator", to simpler systems which are aimed at the DIY market.</div>
<div>
<br /></div>
<h2>
You can go all-in, or you can just put a toe in</h2>
<div>
It didn't take long for me to decide that the fancier integrated systems were not really what I wanted. First off, they're fairly expensive, though the expense looks a little less extreme once you start comparing the per-switch cost of the smart switches vs the centralized version. But ultimately, I didn't really want to deal with a "system integrator" setting the thing up (though apparently it's very easy to get certified by Lutron if you're a licensed electrician, which I'm not). Also, nobody had anything good to say about the phone apps that were available for these systems. And finally, the high-end systems are all about providing a touch pad interface, to give your home that proper Jetsons look. I have no interest in having touch screens mounted on the wall in every room, so that was more of a downside for me, than an attraction. The stand-alone switches from either vendor look more-or-less like standard Decora-style dimmers.</div>
</div>
<div>
<br /></div>
<div>
In the consumer-focused lines, there are some interesting differences between the two companies. Leviton's consumer products are mostly compatible with the Z-Wave standard, which means they work with third-party smart home hubs. The reviews online for the Smart Things and Wink hubs weren't particularly encouraging to me, so that was a bit of a bummer.</div>
<div>
<br /></div>
<div>
The Lutron stuff uses a proprietary wireless protocol, and they sell their own hub. The <a href="http://www.casetawireless.com/Pages/Caseta.aspx" target="_blank">Caseta</a> hub (Lutron's hub) seemed to actually get pretty good reviews. It isn't as capable as the Smart Things hub, but - and this was pretty critical for me - it does connect to HomeKit, Apple's home automation system (it also works with Amazon's Alexa and the Google Home device). So, we went with the Lutron Caseta stuff, because it's easy to use, looks reasonable in our house, and is available at both Home Depot and Lowes, as well as the local electrical supply store.<br />
<br />
<h2>
<br />
<h2>
Hardware from the hardware store, software from a software company</h2>
</div>
<div>
The connection to HomeKit means that even though the Caseta hub isn't as full-featured as some of the other smart home hubs, I don't really need to care. We're pretty much an all-Apple shop here at Casa de Bessey, so knowing that I could control all of the things attached to the Caseta hub from my phone, using Apple's Home app, is a pretty big draw for me. </div>
<div>
<br /></div>
<div>
I know it's the 21st century, and everybody needs to have an App, but that doesn't mean every application is equally well-made. If there's a feature that I really "need", and it's not available in the standard software that comes with the Caseta, I could (at least in theory) set up an Apple TV or an iPad as an always-on HomeKit hub, and write my own damn software to run on it.</div>
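<div>
<br /></div>
<div>
As a taste of what "my own damn software" might look like, here's a rough sketch using the HomeKit framework. The accessory name is made up, and error handling is minimal - a proof of concept, not a finished app:</div>
<div>
<br /></div>
<pre>import HomeKit

// Find a light by name in the primary home, and switch it on.
class LampToggler: NSObject, HMHomeManagerDelegate {
    let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self // homes load asynchronously
    }

    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        guard let home = manager.primaryHome else { return }
        for accessory in home.accessories {
            for service in accessory.services where service.name == "Great Room Lamp" {
                for c in service.characteristics
                        where c.characteristicType == HMCharacteristicTypePowerState {
                    c.writeValue(true) { error in
                        if let error = error { print("write failed: \(error)") }
                    }
                }
            }
        }
    }
}
</pre>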
<div>
<br /></div>
<div>
HomeKit will likely continue to gain new features over the years, so I may never need to do anything custom. But if I do, it's nice to know that I can work with familiar tools and environment, rather than struggling with some obscure system provided by the switch manufacturer.</div>
<div>
<br /></div>
<h2>
The Caseta Wireless experience</h2>
<div>
We're a couple of months into using the Caseta hardware, and here's how it's been going so far.</div>
<div>
<br /></div>
<h3>
The Good</h3>
<div>
<div>
<b>Dimmers everywhere</b><br />
One thing I hadn't really thought about before doing this work is that the dimmer-switch version of the Caseta switches is almost the same price as the plain switch version. We were in the process of gradually replacing our CFL bulbs with LED bulbs anyway, so we've gone with dimmer switches basically everywhere. The added flexibility of being able to set the brightness of any particular light is a nice upgrade.</div>
<div>
<br /></div>
<b>The basics are all there</b><br />
All of the fancy features in the world wouldn't be helpful if the basic features weren't there. The switches feel nice, they look nice, and they're easy to install. The software makes it easy to set up "scenes" where you can hit a single button, and set the brightness level of any sub-set of lights in the house.<br />
<b><br /></b>
<b>HomeKit/Siri integration</b></div>
<div>
It just works. There really is something magical about being able to say "Siri, turn out all the lights", and have the entire house go dark. Or indeed saying "Siri, turn out the light in Jeremy's Room" to my watch, and having that work on the first try.</div>
<div>
<b><br /></b></div>
<div>
<b>Easy to setup and use</b></div>
<div>
You basically plug in the hub, press a button to pair it with the app on your phone, and then start adding devices. The switches are direct replacements for your existing switches, so installing them is basically:</div>
<div>
<ol>
<li>Turn off the power</li>
<li>Remove the old switch</li>
<li>Wire the new switch/dimmer in</li>
<li>Turn the power back on</li>
</ol>
</div>
<div>
The only slightly-complex cases are when you're replacing a three-way switch. The Caseta solution for 3-way (or more) situations is to install the switch at one end, then just install battery-powered remotes at any other location you need to control that light from. When you take out the 3-way, you do need to connect the "traveller" wires together, but they provide instructions online to show you how to do that.</div>
<div>
<br /></div>
<div>
You do have to add each individual switch to the app one at a time, which could get tedious in a large installation. It sure made things easy for the electricians, though - they just had to wire things up, without keeping track of which switch went in which room, since I would set all that up later after they left. From talking to them, I got the impression that the usual install of the higher-end stuff <i>does</i> involve writing down a bunch of "this circuit is on switch #12345" notes, then going back and fixing things later when setting up the controller.</div>
<div>
<br /></div>
<div>
<b>Reliable</b></div>
<div>
Unless the WiFi in the house is down, I haven't had any problems connecting to the hub, either from the Lutron app (when adding new hardware) or from Apple's Home App. Because the individual switches all have controls on them, even in the case of catastrophic failure, you can still walk around and turn off everything "by hand". That's another point in favor of the non-centralized system, I guess.</div>
<div>
<br /></div>
<div>
<b>Supports "enough" devices for my house</b></div>
<div>
One of the big differences between the Caseta stuff and Lutron's next higher tier (Radio RA2), is the number of "devices" they support. Every switch, every dimmer, and every remote control is a "device" for these counts. Caseta only supports 50 devices. I haven't come anywhere close to the limit yet, but we haven't replaced every last switch in the house yet, either. I think we'll be over 40 once all of the switches I care about have been replaced. Our house is close to 2,000 square feet, so if your house is smaller than that, I doubt the limit will ever matter much. And here's where the connection to HomeKit also helps - if we ever do hit the device limit, I can buy another Caseta hub for $75, and have another 50 devices.</div>
<div>
<br /></div>
<h3>
The Bad</h3>
<div>
<b>Range and range extenders</b></div>
<div>
The Caseta documentation says that every controlled device needs to be within 30 feet of the hub. In practice, the maximum reach is just a bit longer than that in our house, but not very much farther. You can extend the range of the system, by using a plug-in dimmer as a repeater. You can have exactly one repeater, which is another limitation compared to the higher-end systems, which support multiple repeaters. But again - if I ever did run into this in practice, I'd probably just get another hub, and have one for each end of the house, since the hubs really aren't all that expensive.</div>
<div>
<br /></div>
<div>
<b>Pricing structure</b></div>
<div>
Honestly, the way that Lutron prices this stuff makes almost no sense at all. You can buy various "kits" with a hub, a dimmer and a remote, or a hub and a few dimmers and remotes, or a hub and some plug-in dimmers. The individual kit components cost more separately, which is no surprise, but some of the prices are weirdly inverted - it costs more to buy just a dimmer than it does to buy the dimmer, a remote, and all of the trim hardware. I assume anybody who makes extensive use of this product line eventually ends up with a box full of unused remotes, but that's just slightly wasteful, not an actual problem.</div>
<div>
<br /></div>
<div>
<b>Trigger configuration is very basic</b></div>
<div>
The "smart" hub isn't very smart. You can bind particular remotes to particular switches, set up scenes, and do some very basic automation. A recent software update improved some of this so that you can now do some more scheduling.</div>
<div>
<br /></div>
<div>
But take, for example, the "arriving home" automation. I can set up a scene to activate when I arrive home. That's nice, but I can't actually set up a scene to activate when I'm the first one home, or the last to leave. HomeKit supports this, so that might be the thing that gets me to finally set up an Apple TV as a HomeKit hub. Or maybe I'll wait for the HomePod...</div>
<div>
<br /></div>
<h3>
The Unknown</h3>
<div>
<b>Security</b></div>
<div>
I haven't done a basic security audit on the Caseta hub, yet. That'll make a fun weekend project. The online component of the hub is protected by a user name and password, at least. And if I do get totally paranoid, I can always disconnect the hub from the internet, and route everything through an iOS HomeKit hub, which is likely to be more-secure.</div>
<div>
<br /></div>
<div>
<b>Longevity</b></div>
<div>
What happens if Lutron decides to end-of-life the Caseta line? Will I still be able to get replacement parts, or a new hub if the old one breaks? For that matter, what if Apple stops supporting HomeKit, or removes the Lutron app from the App Store?<br />
<br />
This is the problem with going with the proprietary solution. I am somewhat dependent on both Lutron and Apple staying in this business, and getting along with each other. The hub is basically unusable without the app, so that's definitely a concern. I suspect if Lutron found themselves in a situation where they could no longer provide the iOS app, they'd be motivated to provide another solution, or at the very least, a migration strategy to one of their other control hubs.<br />
<br />
At the absolute worst-case scenario, the Caseta switches and the remote controls can be set up and paired to operate completely independently of the hub. I'd lose all of the "smart" features, but at least I'd still have working light switches.<br />
<br />
<h2>
Conclusions</h2>
Overall, this was a really great way to get my feet wet with "smartening up" my home. The increased control over the lights in the house is convenient, and actually helps make the house more livable. The potential downsides are limited by the design of the Caseta system, which gracefully falls back to "no worse than just having old light switches", something which is not necessarily true of other connected home devices, like thermostats, which can have terrible failure modes.<br />
<br />
If you're interested in adding some smarts to your home, I can definitely recommend the Caseta products. They're easy to set up and use, and have been very reliable for us so far.<br />
<br /></div>
Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com1tag:blogger.com,1999:blog-14034177.post-91468911643245004222017-09-25T09:00:00.000-07:002017-09-25T09:00:03.742-07:00Follow up: LockState security failuresI wrote a blog post last month on <a href="http://codemines.blogspot.com/2017/08/what-your-internet-of-things-startup.html" target="_blank">what your IoT startup can learn from the LockState debacle</a>. In the intervening weeks, not much new information has come to light about the specifics of the update failure, and it seems from their public statements that LockState thinks it's better if they don't do any kind of public postmortem on their process failures, which is too bad for the rest of us, and for the Internet of Things industry in general - if you can't learn from others' mistakes, you (and your customers) might have to learn from your own.<br />
<br />
However, I did see a couple of interesting articles in the news related to LockState. <a href="https://ctovision.com/4-lessons-businesses-can-learn-smart-lock-malfunction/" target="_blank">The first one</a> is from a site called CTOVision.com, and it takes a bit more of a business-focused look at things, as you might have expected from the site name. Rather than looking at the technical failures that allowed the incident to happen, they take LockState to task for their response after the fact. There's good stuff there, about how it's important to help your customers understand possible failure modes, how you should put the software update process under their control, and how to properly respond to an incident via social media.<br />
<br />
And on The Parallax, a security news site, I found <a href="https://www.the-parallax.com/2017/08/23/default-codes-remotelock-6i-iot/" target="_blank">this article</a>, which tells us about another major failure on the part of LockState - they apparently have a default 4-digit unlock code set for all of their locks from the factory, and also an 8-digit "programming code", which gives you total control over the lock - you can change the entry codes, reset the lock, disable it, and disconnect it from WiFi, among other things.<br />
<br />
Okay, I really shouldn't be surprised by this at this point, I guess - these LockState guys are obviously a bit "flying by the seat of their pants" in terms of security practice, but seriously?<i> Every single lock comes pre-programmed with the same unlock code and the same master programming code?</i><br />
<i><br /></i>
Maybe I'm expecting too much, but if a $2.99 cheap combination lock from the hardware store comes with a slip of paper in the package with its combination printed on it, maybe the <i>$600 internet-connected smart lock can do the same</i>? Or hell, use a laser to mark the master combination on the inside of the case, so it's not easily lost, and anyone with the key and physical access can use the code to reset the lock, in the (rare) case that that's necessary.<br />
<br />
Or, for that matter - if you <i>must</i> have a default security code for your device (because your manufacturing line isn't set up for per-unit customization, maybe?), then make it part of the setup process to change the damn code, and don't let your users get into a state where they think your product is set up, but they haven't changed the code.<br />
<br />
It's easy to fall into the trap of saying that the user should be more-aware of these things, and they should know that they need to change the default code. But your customers are not typically security experts, and you (or at least some of your employees) <i>should be</i> security experts. You need to be looking out for them, because they aren't going to be doing a threat analysis while installing their new IoT bauble.<br />
<br />Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-1157250395053441892017-09-18T09:00:00.000-07:002017-09-18T10:59:14.905-07:00A short rant on XML - the Good, the Bad, and the Ugly[<b>editor's note:</b> This blog post has been in "Drafts" <b>for 11 years</b>. In the spirit of <i>just getting stuff out there</i>, I'm publishing it basically as-is. Look for a follow-up blog post next week with some additional observations on structured data transfer <i>from the 21st century</i>]<br />
<br />
So, let's see if I can keep myself to less than ten pages of text this time...
<br />
<br />
XML is the e<strong>X</strong>tensible <strong>M</strong>arkup <strong>L</strong>anguage. It's closely related to both HTML, the markup language used to make the World Wide Web, and SGML, a document format that you've probably never dealt with unless you're either a government contractor, or you used the Internet back in the days before the Web. <em>For the pedants out there, I <u>do</u> know that HTML is actually an SGML "application" and that XML is a proper subset of SGML. Let's not get caught up in the petty details at this point.</em>
<br />
<br />
XML is used for a variety of different tasks these days, but the most common by far is as a kind of "neutral" format for exchanging structured data between different applications. To keep this short and simple, I'm going to look at XML strictly from the perspective of a data storage and interchange format.<br />
<br />
<br />
<h2>
The good</h2>
<h3>
Unicode support</h3>
XML Documents can be encoded using the Unicode character encoding, which means that nearly any written character in any language can be easily represented in an XML document.<br />
<br />
<h3>
Uniform hierarchical structure</h3>
XML defines a simple tree structure for all the elements in a file - there's one root element, it has zero or more children, which each have zero or more children, ad infinitum. All elements must have an open and close tag, and elements can't overlap. This simple structure makes it relatively easy to parse XML documents.
<br />
<br />
<h3>
Human-readable (more or less)</h3>
XML is a text format, so it's possible to read and edit an XML document "by hand" in a text editor. This is often useful when you're learning the format of an XML document in order to write a program to read or translate it. Actually writing or modifying XML documents in a text editor can be incredibly tedious, though a syntax-coloring editor makes it easier.
<br />
<br />
<h3>
Widely supported</h3>
Modern languages like C# and Java have XML support "built in" in their standard libraries. Most other languages have well-supported free libraries for working with XML. Chances are, whatever messed up environment you have to work in, there's an XML reader/writer library available.<br />
<br />
<br />
<h2>
The bad</h2>
<h3>
Legacy encoding support</h3>
XML Documents can <em>also</em> be encoded in whatever wacky character set your nasty legacy system uses. You can put a simple <em>encoding="Ancient-Elbonian-EBCDIC"</em> attribute in the XML declaration element, and you can write well-formed XML documents in your favorite character encoding. You probably shouldn't expect that anyone else will actually be able to read it, though.
<br />
<br />
<h3>
Strictly hierarchical format</h3>
Not every data set you might want to interchange between two systems is structured hierarchically. In particular, representing a relational database or an in-memory graph of objects is problematic in XML. A number of approaches are used to get around this issue, but they're all outside the scope of standardized XML (obviously), and different systems tend to solve this problem in different ways, neatly turning the "standardized interchange format for data" into yet another proprietary format, which is only readable by the software that created it.
<br />
<br />
<h3>
XML is verbose</h3>
A typical XML document can be 30% markup, sometimes more. This makes it larger than desired in many cases. There have been several attempts to define a "binary XML" format (most recently <a href="http://www.w3.org/TR/exi/">by the W3C group</a>), but they really haven't caught on yet. For most applications where size or transmission speed is an issue, you probably ought to look into compressing the XML document using a standard compression algorithm (gzip, or zlib, or whatever), then decompressing it on the other end. You'll save quite a bit more that way than by trying to make the XML itself less wordy.<br />
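<br />
For instance, here's a quick sketch in Swift of the "just compress it" approach, using Foundation's built-in zlib support (the sample document is obviously contrived):<br />
<br />
<pre>import Foundation

// Repetitive markup compresses extremely well.
let xml = Data(String(repeating: "<item type=\"example\">value</item>\n",
                      count: 1000).utf8)
do {
    let compressed = try (xml as NSData).compressed(using: .zlib) as Data
    print("\(xml.count) bytes -> \(compressed.count) bytes")
} catch {
    print("compression failed: \(error)")
}
</pre>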
<br />
<h3>
Some XML processing libraries are extremely memory-intensive</h3>
There are two basic approaches to reading an XML document. You can read the whole thing into memory, re-constructing the structure of the file as a tree of nodes, and then the application can use standard pointer manipulation to scan through that tree, looking for whatever information it needs, or further transforming the tree into the program's native data structures. One XML processing library I've used loaded the whole file into memory all at once, then created a second copy of all the data in the tags - so it could end up using up to the size of the file, plus twice the combined size of all the tags.
<br />
<br />
Alternatively, the reader can take a more stream-oriented approach, scanning through the file from beginning to end, and calling into the application code whenever an element starts or ends. This can be implemented with a callback to your code for every tag start/end, which gives you a simple interface, and doesn't require holding large amounts of data in memory during the parsing.
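<br />
<br />
Foundation's XMLParser takes exactly this callback-based approach. Here's a minimal sketch, counting elements without ever building a tree:<br />
<br />
<pre>import Foundation

// Stream through a document, counting elements, without holding a tree in memory.
final class TagCounter: NSObject, XMLParserDelegate {
    var counts: [String: Int] = [:]

    func parser(_ parser: XMLParser, didStartElement elementName: String,
                namespaceURI: String?, qualifiedName qName: String?,
                attributes attributeDict: [String: String]) {
        counts[elementName, default: 0] += 1
    }
}

let xml = Data("<pants><pocket><wallet/></pocket></pants>".utf8)
let counter = TagCounter()
let parser = XMLParser(data: xml)
parser.delegate = counter
parser.parse()
print(counter.counts) // ["pants": 1, "pocket": 1, "wallet": 1]
</pre>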
<br />
<br />
<h3>
No random access</h3>
This is just fallout from the strict hierarchy, but it's extremely labor-intensive to do any kind of data extraction from a large XML document. If you only want a subset of nodes from a couple levels down in the hierarchy, you've still got to step your way down there, and keep scanning through the rest of the file to figure out when you've gone up a level.<br />
<br />
<br />
<h2>
The ugly</h2>
By far, the biggest problems with XML don't have anything to do with the technology itself, but with the often perverse ways in which it's misapplied to the wrong problems. Here are a couple of examples from my own experience.<br />
<br />
<h3>
Archiving an object graph, and the UUID curse</h3>
<div>
XML is a fairly reasonable format for transferring "documents", as humans understand them. That is, a primarily linear bunch of text, with some attributes that apply to certain sections of the text.</div>
<div>
<br /></div>
<div>
These days, a lot of data interchange between computer programs is in the form of relational data (databases), or complex graphs of objects, where you'll frequently need to make references back to previous parts of the document, or forward to parts that haven't come across yet.</div>
<div>
<br />
The obvious way to solve this problem is by having a unique ID that you can reference to find one entity from another. Unfortunately, the "obvious" way to ensure that a key is unique is to generate a globally-unique key, and so you end up with a bunch of 64-bit or 128-bit GUIDs stuck in your XML, which makes it really difficult to follow the links, and basically impossible to "diff' the files, visually.<br />
<br />
One way to avoid UUID proliferation is to use "natural" unique IDs, if your data has some attribute that needs to be unique anyway.<br />
<br />
<h3>
What's the worst possible way to represent a tree?</h3>
I doubt anybody's ever <i>actually</i> asked this question, but I have seen some XML structures that make a pretty good case that that's how they were created. XML, by its hierarchical nature, is actually a really good fit for hierarchical data. Here is one way to store a tree in XML:<br />
<br />
<pre><pants color="blue" material="denim">
<pocket location="back-right">
<wallet color="brown" material="leather">
<bill currency="USD" value="10"></bill>
<bill currency="EURO" value="5"></bill>
</wallet>
</pocket>
</pants>
</pre>
<br />
And here's another:<br />
<pre></pre>
<pre><object>
<id>
20D06E38-60C1-433C-8D37-2FDBA090E197
</id>
<class>
pants
</class>
<color>
blue
</color>
<material>
denim
</material>
</object>
<object>
<id>
1C728378-904D-43D8-8441-FF93497B10AC
</id>
<parent>
20D06E38-60C1-433C-8D37-2FDBA090E197
</parent>
<class>
pocket
</class>
<location>
right-back
</location>
</object>
<object>
<id>
AFBD4915-212F-4B47-B6B8-A2663025E350
</id>
<parent>
1C728378-904D-43D8-8441-FF93497B10AC
</parent>
<class>
wallet
</class>
<color>
brown
</color>
<material>
leather
</material>
</object>
<object>
<id>
E197AA8D-842D-4434-AAC9-A57DF4543E43
</id>
<parent>
AFBD4915-212F-4B47-B6B8-A2663025E350
</parent>
<class>
bill
</class>
<currency>
USD
</currency>
<denomination>
10
</denomination>
</object>
<object>
<id>
AF723BDD-80A1-4DAB-AD16-5B37133941D0
</id>
<parent>
AFBD4915-212F-4B47-B6B8-A2663025E350
</parent>
<class>
bill
</class>
<currency>
EURO
</currency>
<denomination>
10
</denomination>
</object>
</pre>
<br />
So, which one of those is easier to read? And did you notice that I added another 5 Euro to my wallet, while translating the structure? Key point here: try to have the structure of your XML follow the structure of your data.<br />
<br />
<br /></div>
Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-64655133854442148992017-09-04T18:55:00.002-07:002017-09-05T08:50:46.911-07:00Post-trip review: Telestial “International Travel SIM”<span style="font-family: inherit; white-space: pre-wrap;">For our recent trip to Europe, Yvette and I tried the seasoned-traveler technique of swapping out the SIM cards in our phones, rather than paying AT&T’s fairly extortionate international roaming fees. It was an interesting experience, and we learned a few things along the way, which I’ll share here.</span><br />
<span style="font-family: inherit;"><br /></span>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="font-family: inherit;"><span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">We used </span></span>Telestial<span style="font-family: inherit;"><span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">, which is apparently Jersey Telecom. Not </span><span style="background-color: transparent; color: black; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><i>New Jersey</i></span><span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">: JT is headquartered on the island of Jersey, off the coast of Britain. JT/</span></span>Telestial's<span style="font-family: inherit;"><span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> claim to fame is really their wide roaming capability. Their standard “I’m traveling to Europe” SIM is good pretty much across all of Europe. It definitely worked just fine in Germany, Denmark, Sweden, Austria, Estonia, Russia, and Finland. They claim a few dozen more, and I don’t doubt it works equally-well in those countries.</span></span></div>
<span style="font-family: inherit;"><br /></span>
<br />
<h4>
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Why not just get a local SIM in the country you’re visiting? Isn’t that cheaper?</span></span></h4>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="font-family: inherit;"><span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">People who frequently travel overseas will often just pop into a phone retailer, or buy a SIM from a kiosk in the airport. Based on the comparison shopping I did while we were traveling, this is </span><span style="background-color: transparent; color: black; font-style: italic; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;">definitely</span><span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> cheaper than the </span></span>Telestial<span style="font-family: inherit;"><span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"> solution. However, it’s not at all clear in many cases how well an Austrian SIM is going to work in Finland (for example), and just how much you’ll be paying for international roaming.</span></span></div>
<span style="font-family: inherit;"><br /></span>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">So, I think if you’re traveling to just one country (especially one with really cheap mobile phone service costs), buying a local SIM is definitely the way to go. I didn’t really want to keep updating people with new contact numbers every other day as we switched countries. I might look into one of the “virtual phone number” solutions, like Google Voice, for the next multi-country trip. Being able to give people one number, and still roam internationally, seems like it’d be useful, but I don’t know what the restrictions are.</span></span></div>
<span style="font-family: inherit;"><br /></span>
<br />
<h3>
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">What does setup look like?</span></span></h3>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">First of all, you need a compatible phone. Not all American mobile phones will work in Europe. You can check the technical specs for the particular phone model you have, to see which radio frequencies it supports. Alternatively, you can buy any iPhone more-recent than the iPhone 4s, all of which are “world phones”, as far as I know. Verizon and Sprint still sell some phones that are CDMA-only, which means they can’t work anywhere but the USA, but most CDMA smartphones also have a GSM SIM slot, so it’s worth taking a look to see, even if you’re on Verizon.</span></span></div>
<span style="font-family: inherit;"><br /></span>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Secondly, your phone needs to not be “activation locked” to a particular carrier. Most phones sold on contract in the US are set up this way, so you can’t just default on the contract and get a phone you can use on another network. Ideally, your phone would get unlocked automatically at the end of the contract, but US law doesn’t require this, so you’ll need to request an unlock from your carrier. AT&T has made this process a lot easier since the last time I tried to do it, which is great, because I forgot to check that Yvette’s phone was unlocked before we left. I did manage to make the unlock request from the other phone while we were in a taxi on the freeway in Austria, which is a testament to how easy this stuff is these days, I guess.</span></span></div>
<span style="font-family: inherit;"><br /></span>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Assuming you have a compatible phone, the process is basically: power off the phone, pop out the SIM tray with a paper clip, swap the SIMs, turn on the phone, and wait. For the </span>Telestial<span style="font-family: inherit;"> SIM, you probably really want to activate it and pre-pay for some amount of credit before your trip, which is easy to do on their website.</span></span></div>
<span style="font-family: inherit;"><br /></span>
<br />
<h4>
<span style="font-family: inherit;">What kind of plan did you get?</span></h4>
<span style="font-family: inherit;">We had a pre-paid fixed allowance for calls and text, and 2GB of data for each phone. Calls were $0.35 a minute, and texts were $0.35 each. Pre-loading $30 on the phone was enough for an hour and a half of phone calls, or a fairly large number of texts. When we had data coverage, we could use iMessage or WhatsApp for basically free text messages. I don't know whether Voice Over LTE actually worked, and if it </span>avoided<span style="font-family: inherit;"> the per-minute charge, since we just didn't call that much.</span><br />
<h4>
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Did you actually save any money?</span></span></h4>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Compared to what it cost to pay AT&T for an international roaming plan while Yvette was in the UK for a month, we definitely did save a substantial amount of money. This is true even with the crazy cruise ship issue (see below). Without that, it would have been massively less-expensive. And compared to AT&T’s “no plan” international rates (which I got to try out in Israel), there’s absolutely no comparison.</span></span></div>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;"><br /></span></span></div>
<h4>
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">What happened on the cruise ship?</span></span></h4>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Most of the time, the cruise ship did not have cell service. Which was pretty much fine - we had good coverage when we were in port, and there was WiFi on the ship, if we wanted to pay for it (we did not). We had, on two occasions, a weird thing where our phones managed to connect to a shipboard cell network (maybe on another passing ship?), and we were charged truly </span></span><span style="white-space: pre-wrap;">outrageous roaming data rates - several dollars a megabyte, which obviously burned through the $30 in prepaid credit really fast. On the other hand, prepaid means that we didn't lose more than $30 (twice, so $60 total). I still don't know exactly what happened there, but if I do this again sometime, I'm going to keep data turned off on the phone when not in port.</span></div>
<br />
<h4>
<b style="font-family: inherit; white-space: pre-wrap;">The good:</b></h4>
<br />
<ol style="margin-bottom: 0pt; margin-top: 0pt;">
<li dir="ltr" style="background-color: transparent; color: black; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; list-style-type: decimal; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Pre-paid, which limits crazy bills</span></span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; list-style-type: decimal; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Easy setup</span></span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; list-style-type: decimal; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Easy to recharge, either over the phone, or using the app</span></span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; list-style-type: decimal; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Per-minute and per-text rates not <i>too</i> terrible</span></span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; list-style-type: decimal; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Works pretty much anywhere in Europe</span></span></div>
</li>
</ol>
<span style="font-family: inherit;"><br /></span>
<br />
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;"><b>The bad:</b></span></span></div>
<ol style="margin-bottom: 0pt; margin-top: 0pt;">
<li dir="ltr" style="background-color: transparent; color: black; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; list-style-type: decimal; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Cruise ship roaming will use up your data allowance right quick</span></span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; list-style-type: decimal; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Fixed recharge sizes, and monthly expiration</span></span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; list-style-type: decimal; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Forwarding doesn’t work for texts</span></span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; list-style-type: decimal; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Some weirdness with “from” numbers on texts (apparently Austria-only)?</span></span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; list-style-type: decimal; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">No tethering</span></span></div>
</li>
<li dir="ltr" style="background-color: transparent; color: black; font-style: normal; font-variant-caps: normal; font-variant-east-asian: normal; font-variant-ligatures: normal; font-variant-position: normal; list-style-type: decimal; text-decoration: none; vertical-align: baseline;"><div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Email support non-responsive</span></span></div>
</li>
</ol>
<h4>
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Conclusion: would we do it again?</span></span></h4>
<div dir="ltr" style="line-height: 1.38; margin-bottom: 0pt; margin-top: 0pt;">
<span style="background-color: transparent; color: black; font-style: normal; text-decoration: none; vertical-align: baseline; white-space: pre-wrap;"><span style="font-family: inherit;">Overall, the process was fairly-painless, other than the cruise ship issue. If there’s a simple way to fix that, I’d have no problem doing this again. Otherwise, I’d have to recommend during cell data off when you’re not in port, to avoid accidentally costing yourself a bunch of money.</span></span></div>
<br />Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com2tag:blogger.com,1999:blog-14034177.post-16574886353141089872017-08-28T21:56:00.001-07:002017-08-28T21:56:29.323-07:00A brief history of the Future<h2>
<span style="font-family: inherit; font-size: x-large;">A brief history of the Future</span></h2>
<h3>
<span style="font-family: inherit; font-size: large;">Lessons learned from API design under pressure</span></h3>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;">It was August of 2009, and the WebOS project was in a bit of trouble. The decision had been made to change the underlying structure of the OS from using a mixture of JavaScript for applications, and Java for the system services, to using JavaScript for both the UI and the system services. This decision was made for a variety of reasons, primarily in a bid to simplify and unify the programming models used for application development and services development. It was hoped that a more-familiar service development environment would be helpful in getting more developers on board with the platform. It was also hoped that by having only one virtual machine running, we'd save on memory.</span><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">Initially, this was all built on top of a customized standalone V8 JavaScript interpreter, with a few hooks to system services. Eventually, we migrated over to Node.js, when Node was far enough along that it looked like an obvious win, and after working with the Node developers to improve performance on our memory-limited platform.</span></div>
<h4>
<span style="font-family: inherit;">The problem with going from Java to JavaScript</span></h4>
<div style="line-height: normal;">
<span style="font-family: inherit;">As you probably already know, despite the similarity in the names, Java and JavaScript are very different languages. In fact, the superficial similarities in syntax were only making things harder for the application and system services authors trying to translate things from Java to JavaScript.</span></div>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;"><br /></span></div>
<div style="line-height: normal;">
<span style="font-family: inherit;">In particular, the Java developers were used to a multi-threaded environment, where they could spin off threads to do background tasks, and have them call blocking APIs in a straightforward fashion. Transitioning from that to JavaScript's single-threaded, events and callbacks model was proving to be quite a challenge. Our code was rapidly starting to look like a bunch of "callback hell" spaghetti.</span></div>
<h4>
<span style="font-family: inherit;">The proposed solution</span></h4>
<div style="line-height: normal;">
<span style="font-family: inherit;">As one of the most-recent additions to the architecture group, I was asked to look into this problem, and see if there was something we could do to make it easier for the application and service developers to write readable and maintainable code. I went away and did some research, and came back with an idea, which we called a <b>Future</b>. </span><span style="font-family: inherit;">The Future was a construct based on the idea of a value that would be available "in the future". You could write your code in a more-or-less straight-line fashion, and as soon as the data was available, it'd flow right through the queued operations.</span><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">If you're an experienced JavaScript developer, you might be thinking at this point "this sounds a lot like a Promise", and you'd be right. So, why didn't we use Promises? At this point in history, the Proamises/A spec was still in active discussion amongst the CommonJS folks, and it was not at all obvious that it'd become a popular standard (and in fact, it took Promises/A+ for that to happen). The Node.js core had in fact just removed their own Promises API in favor of a callback-based API (this would have been </span>around<span style="font-family: inherit;"> Node.js v0.1, I think).</span></div>
<h4>
<span style="font-family: inherit;">The design of the Future</span></h4>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;">Future was based on ideas from SmallTalk(promise), Java(future/promise), Dojo.js(deferred), and a number of other places. The primary design goals were:</span></div>
<div style="line-height: normal;">
</div>
<ul>
<li><span style="font-family: inherit;">Make it easy to read through a piece of asynchronous code, and understand how it was supposed to flow, in the "happy path" case</span></li>
<li><span style="font-family: inherit;">Simplify error handling - in particular, make it easy to bail out of an operation if errors occur along the way</span></li>
<li><span style="font-family: inherit;">To the extent possible, use Future for <i>all</i> asynchronous control flow</span></li>
</ul>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;">You can see <a href="https://github.com/openwebos/foundation-frameworks/blob/master/foundations/javascript/control/future.js" target="_blank">the code for Future</a>, because it got released along with the rest of the WebOS Foundations library as open source for the Open WebOS project.</span></div>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;"><br /></span></div>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;">My design brief looked something like this:</span></div>
<blockquote class="tr_bq">
<span style="font-family: inherit;">A <b>Future</b> is an object with these properties and methods:<br /><b>.result </b>The <i>current value</i> of the Future. If the future does not yet have a value, accessing the result property raises an exception. Setting the result of the Future causes it to execute the next "then" function in the Future's pipeline. </span></blockquote>
<blockquote class="tr_bq">
<span style="font-family: inherit;"><b>.then(next, error) </b>Adds a stage to the Future's pipeline of steps. The Future is passed as a parameter to the function "next". The "next" function is invoked when a value is assigned to the future's result, and the (optional) "error" function is invoked if the previous stage threw an exception. If the "next" function throws an exception, the exception is stored in the Future, and will be re-thrown if the result of the Future is accessed.</span></blockquote>
<span style="font-family: inherit;">This is more-or-less what we ended up implementing, but the API did get more-complicated along the way. </span>Some<span style="font-family: inherit;"> of this was an attempt to simplify common cases that didn't match the initial </span>design well. Some of it was to make it easier to weld Futures into callback-based code, which was ultimately a waste of time, in that Future pretty much wiped out all competing approaches to flow control. And one particular set of changes was thrown in at the last minute to satisfy a request that should just have been denied (see <i>What went wrong</i>, below).<br />
<h3>
<span style="font-family: inherit; font-size: large;">What went right</span></h3>
<h4>
<span style="font-family: inherit;">We shipped a "minimal viable product" </span>extremely quickly</h4>
<div>
Working from the initial API design document, Tim got an initial version of Future out to the development team in just a couple of days, with all of the basics working. We continued to iterate for quite a while afterwards, but we were able to start the process of bringing people up to speed quickly.</div>
<h4>
<span style="font-family: inherit;">We did, in fact, eliminate "callback hell" from our code base</span></h4>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;">After the predictable learning curve, the former Java developers really took to the new asynchronous programming model. We went from "it sometimes kind of works", to "it mostly works" in an impressively-short time. Generally speaking, the Future-based code was shorter, clearer, and much easier to read. We did suffer a bit in ease of debugging, but that was as much due to the primitive debugging tools on Node as it was to the new asynchronous model.</span></div>
<h4>
<span style="font-family: inherit;">We doubled-down on our one big abstraction</span></h4>
<div>
Somewhat surprisingly to me, the application teams also embraced Futures. They actually re-wrote significant parts of their code to switch over to Future-based APIs at a deeper level, and to allow much more code sharing between the front end and back end of the Mail application, for example. This code re-use was on the "potential benefits" list, but it was much more of a win than anyone originally expected.</div>
<div>
<br /></div>
<div>
We wrote a bunch of additional libraries on top of Future, for all sorts of asynchronous tasks - for file I/O, database access, network and telecoms, for the system bus (dbus) interface, basically anything that you might have wanted to access on the platform, was available as a Future-based API.</div>
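<div>
<br /></div>
<div>
As an illustration of what those wrappers looked like in spirit (using the sketch from above - <span style="font-family: courier;">futureReadFile</span> is a made-up name, though <span style="font-family: courier;">fs.readFile</span> is the real Node API):</div>
<pre>// Illustrative wrapper over Node's callback-based fs.readFile,
// built on the simplified Future sketch from earlier in this post.
var fs = require("fs");

function futureReadFile(path) {
    var future = new Future();
    fs.readFile(path, "utf8", function (err, data) {
        if (err) {
            future.exception = err;  // fails the pipeline (sketch-only API)
        } else {
            future.result = data;    // kicks off the next queued stage
        }
    });
    return future;
}</pre>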
<h4>
<span style="font-family: inherit;">The Future-based code was very easy to reason about in the "happy path" case</span></h4>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;">One of the best things about all this, is that with persistent use of Futures everywhere, you could write code that looked like this:</span></div>
<blockquote class="tr_bq">
<span style="font-family: inherit;">downloadContacts().then(mergeContacts).then(writeNewContacts).then(success, error)</span></blockquote>
Most cases were a bit more-complicated than that (often using inline functions), but the pattern of only handling the success case, and just letting errors propagate, was very common. And in fact, the "error" case was, as often as not, logging a message and rescheduling the task for later.<br />
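<div style="line-height: normal;">
<span style="font-family: inherit;">A typical service entry point might have looked something like this - illustrative only, with invented function names, but it captures the "log it and retry later" flavor of most of our error handling:</span></div>
<pre>// Invented example of the common "any failure? just retry later" strategy.
function syncContacts() {
    downloadContacts()
        .then(mergeContacts)
        .then(writeNewContacts)
        .then(function success(future) {
            recordLastSyncTime(future.result);
        }, function error(future, err) {
            // a failure in any earlier stage lands here
            console.log("sync failed, rescheduling:", err);
            scheduleRetryLater(syncContacts);
        });
}</pre>
<br />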
<h4>
<span style="font-family: inherit;">The all-or-nothing error propagation technique fit (most of) our use cases really well</span></h4>
<div style="line-height: normal; min-height: 14px;">
The initial use case of the Future was for a WebOS feature called "Synergy". This was a framework for combining data from multiple sources into a single uniform format for the applications. So, for example, you could combine your contacts from Facebook, Google, and Yahoo into a single address book list, and WebOS would automatically de-duplicate and link related contacts, and sync changes made on the phone to the proper remote service that the data originally came from. Similarly, all of your incoming e-mail went into the same "Mail" database on-device.</div>
<div style="line-height: normal; min-height: 14px;">
<br /></div>
<div style="line-height: normal; min-height: 14px;">
In a multi-stage synchronization process like this, there are all sorts of ways that the operation can fail - the remote server might be down, or the network might be flaky, or the user might decide to put the phone into airplane mode in the middle of a sync operation. In the vast majority of cases, we didn't actually care what the error was, just that an error had occurred. When an error happened, the usual response was to leave the on-phone data the way it was, and try again later. In those cases where "fuck it, I give up" was not the right error handling strategy, the rough edges of the error-handling model were a bit easier to see.</div>
<h3>
<span style="font-family: inherit; font-size: large;">What went wrong</span></h3>
<h4>
<span style="font-family: inherit;">The API could have been cleaner/simpler</span></h4>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;">It didn't take long before we were adding convenience features to make some of the use cases simpler. Hence, the "whilst" function on Future, which was intended to make it easier to iterate over a function that returned Futures. There were a </span>couple of other additions that also got a very small amount of use, and could have easily been replaced by documentation of the "right" way to do things.<br />
<br />
<h4>
Future had more-complex internal state than was strictly needed</h4>
If you look at Promises, they've really only got the minimal amount of state, and you chain functions together by returning a Promise from each stage. Instead of having lots and lots of Futures linked together to make a pipeline of operations, Future <i>was the pipeline</i>. I think this both decreased heap churn, by not creating a bunch of small objects, and probably made it somewhat easier to debug broken pipelines (since all of the stages were visible). Obviously, if we'd known that Promises were going to become a big thing in JavaScript, we would have stayed a lot closer to the Promises/A spec.</div>
<h4>
<span style="font-family: inherit;">Error handling was still a bit touchy, for non-transactional cases</span></h4>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;">If you had to write code that actually cared about handling errors, then the "error" function was actually located in a pretty terrible place, you'd have all these happy-path "then" functions, and one error handler in the middle. Using named functions instead of </span>anonymous<span style="font-family: inherit;"> inline functions helped a bit with this, but I would still occasionally get called in to help debug a thrown exception that the developer couldn't find the source for.</span></div>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;"><br /></span></div>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;">It would have been really nice to have a complete stack trace for the exception that was getting re-thrown, but we unfortunately didn't have stack traces available in both the </span>application<span style="font-family: inherit;"> context and the service context. In the end, "thou shalt not throw an exception unless it's uniquely identifiable" was <i>almost</i> sufficient to resolve this.</span></div>
<h4>
<span style="font-family: inherit;">I caved on a change to the API that I should have rejected</span></h4>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;">Fairly late in the process, </span>someone<span style="font-family: inherit;"> came to me and said "I don't </span>like<span style="font-family: inherit;"> the 'magic' in the way the result property works. People don't expect that accessing a property will throw an exception, so you should provide an API to access the state of the </span>Future<span style="font-family: inherit;"> via function calls, rather than property access". At this point, we had dozens of people successfully </span>using<span style="font-family: inherit;"> the .result API, and very little in the way of complaints about that part of the design.</span></div>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;"><br /></span></div>
<div style="line-height: normal; min-height: 14px;">
<span style="font-family: inherit;">I agreed to make the addition, so we could "try it out" and see whether the functional API was really easier or clearer to use. </span>Nobody seemed to think so, except for the person who asked for it. Since they were using it, it ended up having to stay in the implementation. and since it was in the implementation, it got documented, which just confused later users (especially third parties), who didn't understand why there were two different ways to accomplish the same tasks.<br />
<br />
<br /></div>
<div style="line-height: normal; min-height: 14px;">
<h3>
How do I feel about this, 8 years later?</h3>
<div>
Pretty good, actually. Absent a way to see into the future, I think we made a pretty reasonable decision with the information we had available. The Bedlam team did an amazing bit of work, and WebOS got rapidly better after the big re-architecting. In the end, it was never <i>quite enough</i> to displace any of the major mobile OSes, but I still miss some features of Synergy, even today. After all the work Apple has done over the years to improve contact sync, it's still not quite as good (and not nearly as open to third parties) as our solution was.</div>
<div>
<br /></div>
</div>
Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-89215621219698824432017-08-21T11:00:00.000-07:002017-08-21T11:00:14.281-07:00What your Internet Of Things startup can learn from LockStateThe company LockState has been in the news recently for sending an over-the-air update to one of their smart lock products <a href="https://arstechnica.com/information-technology/2017/08/500-smart-locks-arent-so-smart-anymore-thanks-to-botched-update/" target="_blank">which "bricked" over 500</a> of these locks. This is a pretty spectacular failure on their part, and it's the kind of thing that ought to be impossible in any kind of well-run software development organization, so I think it's worthwhile to go through a couple of the common-sense processes that you can use to avoid being the next company in the news for something like this.<br />
<br />
The first couple of these are specific to the problem of shipping the wrong firmware to a particular model, but the others apply equally well to an update that's for the right target, but is fundamentally broken, which is probably the more-likely scenario for most folks.<br />
<br />
<b>Mark your updates with the product they go to</b><br />
The root cause of this incident was apparently that LockState had produced an update intended for one model of their smart locks, and somehow managed to send that to a bunch of locks that were a different model. Once the update was installed, those locks were unable to connect to the Internet (one presumes they don't even boot), and so there was no way for them to update again to replace the botched update.<br />
<br />
It's trivially-easy to avoid this issue, using a variety of different techniques. Something as simple as using a different file name for firmware for different devices would suffice. If not that, you can have a "magic number" at a known offset in the file, or a digital signature that uses a key unique to the device model. Digitally-signed firmware updates are a good idea for a variety of other reasons, <i>especially for a security product</i>, so I'm not sure how they managed to screw this up.<br />
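A model check doesn't need to be anything fancy, either. Here's a sketch of the device-side half, in JavaScript for illustration - the header layout, magic number, and model ID are all made up, and a signature check would be better still:<br />
<pre>// Sketch: refuse to apply an update image built for a different model.
var fs = require("fs");

var MY_MODEL_ID = 0x4c533130;  // hypothetical ID baked into this model's firmware

function updateIsForThisDevice(path) {
    var header = Buffer.alloc(8);
    var fd = fs.openSync(path, "r");
    fs.readSync(fd, header, 0, 8, 0);  // read the first 8 bytes of the image
    fs.closeSync(fd);
    // bytes 0-3: magic number; bytes 4-7: target model ID
    return header.readUInt32BE(0) === 0xf00dfeed &&
           header.readUInt32BE(4) === MY_MODEL_ID;
}

if (!updateIsForThisDevice("/tmp/update.bin")) {
    console.log("refusing update: wrong model (or corrupt image)");
    process.exit(1);
}</pre>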
<br />
<b>Have an automated build & deployment process</b><br />
Even if you've got a good system for marking updates as being for a particular device, that doesn't help if there are manual steps that require someone to explicitly set them. You should have a "one button" build process which allows you to say "I want to build a firmware update for <i>this model</i> of our device", and at the end you get a build that was compiled for the right device, and is marked as being for that device.<br />
<br />
<b>Have a staging environment</b><br />
Every remote firmware update process should have the ability to be tested internally via the same process end-users would use, but from a staging environment. Ideally, this staging environment would be as similar as possible to what customers use, but available in-company only. Installing the bad update on a single lock in-house before shipping it to customers would have helped LockState avoid bricking any customer devices. And, again - this process should be automated.<br />
<br />
<b>Do customer rollouts incrementally</b><br />
LockState might have actually done this, since they say only 10% of their locks were affected by this problem. Or they possibly just got lucky, and their update process is just naturally slow. Or maybe this model doesn't make up much of the installed base. In any case, rolling out updates to a small fraction of the installed base, then gradually ramping it up over time, is a great way to ensure that you don't inconvenience a huge slice of your user base all at once.<br />
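One common way to implement the ramp (a sketch, certainly not LockState's actual system) is to hash each device's serial number into a stable bucket, and then raise the rollout percentage over a few days:<br />
<pre>// Server-side sketch: deterministic percentage rollout by device ID.
var crypto = require("crypto");

function inRollout(deviceId, percent) {
    // hashing the ID spreads devices evenly, and keeps the bucketing stable
    var digest = crypto.createHash("sha256").update(deviceId).digest();
    var bucket = digest.readUInt32BE(0) % 100;  // 0-99
    return percent > bucket;
}

// day 1: offer the update to ~1% of devices; then 10%, 50%, 100%...
// (device and offerUpdate are hypothetical)
if (inRollout(device.serialNumber, 1)) {
    offerUpdate(device);
}</pre>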
<br />
<b>Have good telemetry built into your product</b><br />
Tying into the previous point, wouldn't it be great if you could measure the percentage of systems that were successfully updating, and automatically throttle the update process based on that feedback? This eliminates another potential human in-the-loop situation, and could have reduced the damage in this case by detecting automatically that the updated systems were not coming back up properly.<br />
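Even a crude version of that feedback loop beats having a human watch a dashboard. As a sketch (the stats shape here is invented):<br />
<pre>// Sketch: pause the rollout automatically if updated devices stop
// checking back in as healthy.
function shouldContinueRollout(stats) {
    // stats = { offered: N, checkedInHealthy: M } over, say, the last hour
    if (stats.offered >= 50) {  // don't judge on too small a sample
        var successRate = stats.checkedInHealthy / stats.offered;
        return successRate >= 0.95;  // the threshold is a judgment call
    }
    return true;  // not enough data yet to make a call
}</pre>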
<br />
<b>Have an easy way to revert firmware updates</b><br />
Not everybody has the storage budget for this, but these days, it seems like practically every embedded system is running Linux off of a massive Flash storage device. If you can, have two operating system partitions, one for the "current" firmware, and one for the "previous" firmware. At startup, have a key combination that swaps the active install. That way, if there is a catastrophic failure, you can get customers back up and running without having them disassemble their products and send them in to you, which is apparently how LockState is handling this.<br />
<br />
If your software/hardware allows for it, you can even potentially automate this entirely - have a reset watchdog timer that gets disabled at the end of boot, and if the system reboots more than once without checking in with the watchdog, switch back to the previous firmware.<br />
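The boot-side half of that can be just a few lines. Here's a sketch - the paths and the "two strikes" policy are made up for illustration:<br />
<pre>// Sketch: boot-counter fallback for an A/B firmware layout.
var fs = require("fs");
var COUNTER = "/boot/boot_attempts";

var attempts = 0;
try { attempts = parseInt(fs.readFileSync(COUNTER, "utf8"), 10) || 0; } catch (e) {}

if (attempts >= 2) {
    // two boots in a row never finished - fall back to the old firmware
    fs.writeFileSync("/boot/active_slot", "previous");
    fs.writeFileSync(COUNTER, "0");
    rebootIntoPreviousFirmware();  // hypothetical
} else {
    fs.writeFileSync(COUNTER, String(attempts + 1));
}

// At the very end of a successful boot, the software "checks in",
// which disables the fallback for this cycle:
function onBootComplete() {
    fs.writeFileSync(COUNTER, "0");
}</pre>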
<br />
<b>Don't update unnecessarily</b><br />
No matter how careful you are, there are always going to be some cases where a firmware update goes bad. This can happen for reasons entirely out of your control, like defective hardware that <i>just happens to work</i> with version A of the software, but crashes hard on version B.<br />
<br />
And of course the easiest way to avoid having to ship lots of updates is sufficient testing (so you have fewer critical product defects to fix), and reducing the attack surface of your product (so you have fewer critical security issues that you need to address on a short deadline).<br />
<br />
<br />Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-67658325839993232652017-08-13T05:30:00.001-07:002017-08-13T05:30:28.246-07:00Why I hate the MacBook Pro Touchbar<b>Why I hate the MacBook Pro Touchbar</b><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://support.apple.com/library/content/dam/edam/applecare/images/en_US/macbookpro/macbookpro-touch-bar-control-strip-expand.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="30" data-original-width="800" height="11" src="https://support.apple.com/library/content/dam/edam/applecare/images/en_US/macbookpro/macbookpro-touch-bar-control-strip-expand.png" width="320" /></a></div>
<br />
The Touchbar that Apple added to the MacBook Pro is one of those relatively-rare instances in which they have apparently struck the wrong balance between form and function. The line between “elegant design” and “design for its own sake” is one that they frequently dance on the edge of, and occasionally fall over. But they get it right often enough that it’s worth sitting with things for a while to see if the initial gut reaction is really accurate.<br />
<br />
I hated the Touchbar pretty much right away, and I generally haven’t warmed up to it at all over the last 6 months. Even though I’ve been living with it for a while, I have only recently really figured out why I don’t like it.<br />
<br />
<b>What does the Touchbar do?</b><br />
<br />
One of the functions of the Touchbar, of course, is serving as a replacement for the mechanical function keys at the top of the keyboard. It can also do other things, like acting as a slider control for brightness, or allowing you to quickly select elements from a list. Interestingly, it’s the “replacement for function keys” part of the functionality that gets most of the ire, and I think this is useful for figuring out where the design fails.<br />
<br />
<b>What is a “button” on a screen, anyway?</b><br />
<br />
Back in the dark ages of the 1980s, when the world was just getting used to the idea of the Graphical User Interface, the industry gradually settled on a series of interactions, largely based on the conventions of the Macintosh UI. Among other things, this means “push buttons” that highlight when you click the mouse button on them, but don’t actually take an action until you release the mouse button. If you’ve used a GUI that takes actions immediately on mouse-down, you might have noticed that it feels a bit “jumpy”, and one reason the Mac, and Windows, and iOS (mostly) perform actions on release is exactly because action on mouse-down feels “bad”.<br />
<br />
<b>Why action on release is good:</b><br />
<br />
<b>Feedback</b> — When you mouse-down, or press with your finger, you can see what control is being activated. This is really important to give your brain context for what happens next. If there’s any kind of delay before the action completes, you will see that the button is “pressed”, and know that your input was accepted. This reduces both user anxiety, and double-presses.<br />
<br />
<b>Cancelability</b> — In a mouse-and-pointer GUI, you can press a button, change your mind, and move the mouse off before releasing to avoid the action. Similar functionality exists on iOS, by moving your finger before lifting it. Even if you hardly ever use this gesture, it’s there, and it promotes a feeling of being in control.<br />
<br />
Both of these interaction choices were made to make the on-screen “buttons” feel and act more like real buttons in the physical world. In the case of physical buttons or keyswitches, the feedback and the cancelability are mostly provided by the mechanical motion of the switch. You can rest your finger on a switch, look at which switch it is, and then decide that you’d rather do something else and take your finger off, with no harm done. The interaction with a GUI button isn’t exactly the same, but it provides for “breathing space” in your interaction with the machine, which is the important thing.<br />
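In browser terms, the whole distinction comes down to which event you bind the action to. A trivial illustration (the names here are placeholders):<br />
<pre>// Feedback on press, action on release: the "click" event only fires
// if the press AND the release both happen on the same element, which
// is exactly what makes the gesture cancelable.
button.addEventListener("mousedown", showPressedState);  // feedback
button.addEventListener("click", performAction);         // action on release</pre>
<br />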
<br />
<b>The Touchbar is (mostly) action on finger-down</b><br />
<br />
With a very few exceptions [maybe worth exploring those more?], the Touchbar is designed to launch actions on finger-down. This is inconsistent with the rest of the user experience, and it puts a very high price on having your finger slip off of a key at the top of the keyboard. This is exacerbated by bad decisions made by third-party developers like Microsoft, who ridiculously put the “send” function in Outlook on the Touchbar, because if there was ever anything I wanted to make easier, it’s sending an email before I’m quite finished with it.<br />
<br />
<b>How did it end up working that way?</b><br />
<br />
I’m not sure why the designers at Apple decided to make things work that way, given their previous experience with GUI design on both the Mac and iOS. If I had to take a guess, the logic might have gone something like this:<br />
<br />
The Touchbar is, essentially, a replacement for the top row of keys on the keyboard. Given that the majority of computer users are touch-typists, then it makes sense to have the Touchbar buttons take effect immediately, in the same way that the physical keyswitches do. Since the user won’t be looking at the Touchbar anyway, there’s no need to provide the kind of selection feedback and cancelability that the main UI does.<br />
<br />
There are a couple of places where this logic goes horribly wrong. First off, a whole lot of people are not touch typists, so that’s not necessarily the right angle to come at this from. Even if they were, part of the whole selling point of the Touchbar is that it can change, in response to the requirements of the app. So you’re going to have to look at it some of the time, unless you’ve decided to lock it into “function key only” mode. In which case, it’s a strictly-worse replacement for the keys that used to be there, and you’re not getting the benefits of the reconfigurability.<br />
<br />
Even if you were going to use the Touchbar strictly as an F-key replacement, it still doesn’t have the key edges to let you know whether you’re about to hit one key or two, so you’ll want to look at what you’re doing anyway. I know there are people out there who use the function keys without looking at them, but the functions there are rarely-enough used that I suspect the vast majority of users end up having to look at the keyboard anyway, in order to figure out which one is the “volume up” key, or at least the keyboard-brightness controls.<br />
<br />
<b>How can Apple fix this?</b><br />
<br />
First, and primarily, make it less-likely for users to accidentally activate functions on the Touchbar. I think that some amount of vertical relief could make this failure mode less common, though I’m not sure if the Touchbar should be higher or lower than it is now. I have even considered trying to fabricate a thin divider to stick to the keyboard to keep my finger away from accidentally activating the “escape” key, which is my personal pet-peeve with the Touchbar.<br />
<br />
A better solution to that problem is probably to include some amount of pressure-sensitivity and haptic feedback. The haptic home button on the iPhone 7 does a really good job of providing satisfying feedback without any visuals, so we know this can work well. Failing that, at least some way to bail out of hitting the Touchbar icons would be worth pursuing - possibly making them less sensitive to grazing contact, though that would increase the cases where you didn’t activate a button while actually trying to.<br />
<br />
Another option would be bringing back the physical function keys. This doesn’t necessarily mean killing the Touchbar, but maybe just moving it away from the rest of the keyboard a bit. This pretty much kills the “you can use it without taking your eyes off the screen or your hands off the home row” argument, but I’m really not convinced that’s at all realistic, unless you only ever use one application.<br />
<br />
<b>So, is Apple going to do anything to address these issues?</b><br />
<br />
Honestly? You can never tell with them. Apple’s history of pushing customers forward against their will (see above) argues that they’ll pursue this for a good while, even if it’s clearly a bad idea. On the other hand, the pressure-sensitivity option seems like the kind of thing they might easily decide to add all by themselves. In the meantime, I’ll be working out my Stick On Escape Key Guard…<br />
<br />Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-11739124662477839812016-12-18T13:11:00.004-08:002016-12-18T13:11:57.222-08:00Rogue One: A Star Wars Review<h3>
Spoiler-free intro</h3>
If you haven't seen the movie yet, you might want to not read past the spoiler alert warning below.<br />
<br />
So, you've no doubt seen reviews that say <i>Rogue One</i> is the best Star Wars film, and reviews that claim it's utterly disappointing. I liked it a lot, and I think it's a movie that gets better the closer you look at it. I definitely plan to see it again, and see how much more I can pick out of it.<br />
<br />
I have a theory about how people's enjoyment of <i>Rogue One</i> related to their overall level of nerdiness, specifically their level of Star Wars nerdiness. The theory goes like this: I think that the graph of enjoyment vs Star Wars nerdiness has two peaks, one on the low end, and one on the high end, with a substantial dip in between.<br />
<br />
If you only know the Star Wars films a little bit, or just aren't that much of a nerd, in general, <i>Rogue One</i> is a pretty serviceable Sci-Fi action adventure, with some shootouts, some chases, and some amazing visual effects. If you're a super-fan, you get all of that, AND a truly prodigious number of cameos, offhand references, and call-outs to just about every Star Wars movie or TV series, from the original trilogy, to the prequels, to <i>The Force Awakens</i>, to <i>Rebels</i>. I wouldn't be surprised if there's a reference to the <i>Star Wars Christmas Special</i> hidden in there somewhere.<br />
<br />
In the middle, we run into the unfortunate people who just really liked the original 3 films (and maybe <i>The Force Awakens</i>), and are headed into <i>Rogue One</i> expecting more of the same. I expect these people to come out a little disappointed with <i>Rogue One</i>, because it's really quite different from the original Star Wars movies. This is, I believe, intentional, and brilliant in its own way, but it's definitely going to turn some people off.<br />
<br />
Here are some of the ways in which <i>Rogue One</i> carves out new territory in the Star Wars universe, and some of my favorite bits of clever film-making in it.<br />
<br />
<br />
<br />
<br />
<br />
<h2>
SPOILERS START HERE!!!!</h2>
<br />
<br />
<br />
<br />
<h3>
Mirrors and reflections</h3>
We all remember how the original <i>Star Wars</i> started, I hope. There's the opening text crawl, and then we jump right into the action. A small space ship, fleeing under heavy fire, retreats into the background, and then their pursuer comes into view - a mind-bogglingly-massive grey wedge of death, an Imperial Star Destroyer.<br />
<br />
<i>Rogue One</i> has no text crawl, but it does open on a scene in space, relatively peaceful, or so it seems. And then a great grey wedge starts to intrude into the scene. For just a fraction of a moment, your brain says "Oh, it's one of those giant Imperial ships", but then you realize it just doesn't look right, and the camera tilts and pans, revealing that you're looking at the edge of a planetary ring. As you're settling in, waiting to see what happens, a small ship (an Imperial shuttle, this time) crosses the frame, and heads into the background. Whereas in <i>Star Wars</i>, we're immediately ready to cheer on the rebels, in this scene, we don't yet know what to think of this spaceship, heading alone down to land on the planet. It's obviously not good news for the locals, though...<br />
<br />
So, there's an obvious echo here between the original and the prequel. There's just enough similarity to trip you up if you think you know what you're about to see. It's a bit like watching the original Star Wars through a blurry lens, or in a fun-house mirror. <i>Rogue One</i> does a lot of this.<br />
<br />
It's in the nature of sequels, and even more so in the nature of prequels, to be defined to some extent by how they fit together with, and how they differ from, the original film. It's a bit like how people who have a twin will often grow up to define themselves in terms of their differences from their sibling.<br />
<br />
As an immediate prequel to the original movie, <i>Rogue One</i> basically ends right where <i>Star Wars</i> begins. This single moment becomes the mirror in which the original Star Wars is reflected both back in time and backwards in outlook.<br />
<br />
When <i>The Empire Strikes Back</i> was released, Star Wars was re-titled as <i>Star Wars: A New Hope</i>. What comes before hope? Despair. <i>Rogue One</i>'s story picks up at probably the lowest point of the Rebellion against the Empire. The rebels can't decide on strategy, they've lost control over some of their own best operatives, and now they've learned of a super-weapon, the Death Star, which has been created specifically to crush the rebellion once and for all.<br />
<br />
<h3>
It's very dark in here</h3>
Some people have called <i>Rogue One</i> a "dark" film, and that's true on all sorts of different levels. First off, it's <i>actually literally dark</i> in places where the original is light, and vice-versa. Tatooine, the location of most of the beginning of Star Wars, is a blindingly-bright, glaring white sand desert. The first location we visit in <i>Rogue One</i> is a black sand beach, with clouds overhead. It's literally exactly the opposite of Tatooine.<br />
<br />
When we first see the main bad guy (Darth Vader) in <i>Star Wars</i>, he's in a black outfit, surrounded by foot soldiers wearing white armor. In <i>Rogue One</i>, the main villain appears wearing a white outfit, accompanied by a squad of stormtroopers in black armor.<br />
<br />
This is a fairly subtle bit of film-making, but I think it's brilliant. It could have easily gone some other way, with Orson Krennic, the weapons director, wearing any of the other Imperial uniforms we've seen before, and accompanied by the traditional white-armored Storm Troopers. But it's subtly "off", and sets up the expectation that this movie is not going to follow the familiar conventions and story arcs of the series.<br />
<br />
<h3>
War is hell, and hell is the Middle East (or Jedha)</h3>
Somewhat ironically, for a series of films called "Star Wars", the previous films never really touched on the chaos, terror, and moral grey areas of fighting a war. <i>Rogue One</i> finally does that, in a big way. There has been a lot of talk about the "it's a war movie" aspect of the film, with critics comparing it to WWII movies or Vietnam movies. There are obvious parallels there, but this <i>also</i> feels to me like a war movie with its roots in Afghanistan and Syria, a thoroughly-modern take on the nature of fighting a war.<br />
<br />
There's a scene where a group of rebels ambush an Imperial patrol, disabling their armored vehicle and killing the stormtroopers. It's the first real fighting we see in the movie, and it's important for setting the tone of the "rebellion on the ground", as opposed to the "rebellion in theory" that we're introduced to when Jyn meets Mon Mothma and the rest of the Rebel leadership after being rescued from imperial custody.<br />
<br />
The whole sequence is just fraught with references to recent Middle Eastern conflicts. There's the tank ambush itself, which feels like a reference to 1988's <i>The Beast</i>, a film about the Soviet invasion of Afghanistan. Then there are the rebels hiding in caves outside the city proper, a definite nod to the Mujahideen of Afghanistan, as well as the later Al-Qaeda in Pakistan. <br />
<br />
The war machines of the Empire are literally powered by the ancient religious treasures of the local population, a reference to both ISIL's destruction of historic sites, and their use of captured oil production infrastructure to finance their war against the governments of the region.<br />
<br />
In the middle of the chaos of the tank battle, the nominally-heroic Captain Cassian shoots one of the local rebels, in order to keep Jyn safe. He doesn't even hesitate, he just does it. That kind of divided loyalty is the very essence of modern coalition warfare.<br />
<br />
In the end, the Empire solves their rebel "problem" by blasting the entire city from space, with a "precision" shot from the Death Star. Just like the "smart bombs" used in recent real-world conflicts, the single-reactor shot from the Death Star does far less damage than the Empire is easily capable of, but it still causes massive amounts of "collateral damage". It's a very <i>Apocalypse Now</i>, "We had to destroy the village in order to save it" moment.<br />
<br />
And speaking of <i>Apocalypse Now</i>, what about Saw Gerrera? Here's a high-ranking Rebel commander, who's gone off the grid on Jedha, commanding his own personal band of fanatics in a mountain stronghold, leading guerrilla attacks on the Imperial troops, in a way that the "official" rebellion leadership doesn't approve of. The Rebel leadership dispatches an intelligence officer / field agent to track Gerrera down, but because this film doesn't perfectly copy any of its inspirations, Captain Cassian isn't ordered to kill Gerrera, but instead is supposed to use him to find Galen Erso, who he is ordered (secretly) to kill.<br />
<br />
<h3>
You call these people "heroes"?</h3>
We've got Galen Erso, a collaborator, complete with the classic "If I didn't do it, they'd just get someone else to" excuse. He tries to sabotage the Empire's war machine from the inside, perhaps in guilt over what he's helped to build. In the end, this man, who nobody in the rebellion will even remember, is in fact the only reason the Empire didn't manage to totally destroy the Rebellion, one planet at a time.<br />
<br />
Saw Gerrera, an insane ex-rebel, who's decided to fight the war on his own terms. Bodhi Rook, the Imperial shuttle pilot, defects on Galen Erso's orders. He's supposed to take the news of the Death Star's weakness to the rebellion. Unfortunately, Erso sends him to Gerrera, not knowing that his former friend has gone off the deep end. Gerrera's men blindfold Rook, in a scene that echoes the CIA's "extraordinary rendition" program, and deliver him to Gerrera, who tortures Rook, just to be sure of the truth of the message from his former friend. After torturing him, Gerrera just discards Rook, even though we later learn that it would take very little actual effort to help him recover from his experience.<br />
<br />
Let's hear it for Captain Cassian. He's the anti-Han Solo. Instead of being a smuggler who reluctantly gets pulled into the rebellion, he's a Special Ops soldier who's engaged in all sorts of dirty tactics for the rebellion, a man who's so completely bought in to the cause, that he's perfectly happy to lie to everybody who's working with him, and accept a mission to use a young woman to get to her father, then assassinate him. When he finally has that moment of clarity, and realizes that he can't just execute Erso in cold blood, it actually feels like it means something.<br />
<br />
And then there's the hero/viewpoint character, Jyn. She's an orphan, of course (what is it with orphans in Star Wars?), but instead of secretly being the child of the big bad guy, who's been raised by the rebels until she can challenge her parent and save the galaxy, she's...basically a nobody. Her father's kind of important to the Death Star project, but I'd bet that other than his boss (who has a much higher opinion of his skills than even Erso does), nobody in the Imperial hierarchy has even heard of him.<br />
<br />
<h3>
Miscellaneous cool bits</h3>
Darth Vader is back, and he's completely terrifying. The scene in Vader's castle establishes the quiet menace with which Vader keeps the Empire under tight control. And the scene at the end of <i>Rogue One</i>, where he's in full-on psycho killer mode, stabbing rebel soldiers left and right, slicing through bulkheads? That's the Vader we know from the end of <i>The Empire Strikes Back</i>, and from the climactic lightsaber duel at the end of <i>Return of The Jedi</i> - a dark Jedi who's perfectly willing to give in to his hate, to his savage bloodlust, and use that power to beat down whatever gets in his way. Also a great call-out to <i>The Force Awakens</i>' Kylo Ren, who has those same emotional tendencies, without the control to go with it.<br />
<br />
I love everything about Chirrut Îmwe, the blind monastic warrior. He's not a Jedi Knight, but he's everything that we loved about Ben Kenobi from the original, and he finally puts the "Midichlorian" nonsense from the prequels to rest for good. The Force isn't a bacterial infection, it's a mystical connection between every living thing, and if you believe, if you give yourself over to it, you can achieve amazing things.<br />
<br />
One subtle bit that I loved was his reciting his mantra, walking utterly unharmed through a killing field of covering fire, right up to the control console for the communication relay...and then totally fumbling for the switch. Because of course, he can't just see the switch - he's blind. He can "see" all of the soldiers via the Force, he can get all of them to miss him (or maybe not shoot in his particular direction?), but the switch is just a dead metal lever - it can't be easily seen or influenced through the Force.<br />
<br />
Alan Tudyk plays K-2SO, a sarcastic droid who dies on the final mission, trying to "get the signal out". That one's a bit heavy-handed by <i>Rogue One</i> standards, but I loved <i>Firefly</i>, so I'll let that one slide.<br />
<br />
And this is now two Star Wars films in a row with a female lead, and a major secondary male character, who don't fall in love, but do develop a strong friendship through their adventures. I get really sick of the idea that every movie needs to have a romantic sub-plot, and I'm glad to see the idea of platonic friendships between men and women being treated as just as relevant as romantic entanglements.<br />
<div>
<br /></div>
Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com0tag:blogger.com,1999:blog-14034177.post-19678003571350708672016-02-22T00:32:00.001-08:002016-02-22T08:18:23.690-08:00Apple vs the FBI<h2>
<span style="font-size: x-large;">
What's up with Apple and the FBI?</span></h2>
Several of my friends and family have asked me about <a href="http://www.wired.com/2016/02/apples-fbi-battle-is-complicated-heres-whats-really-going-on/" target="_blank">this case</a>, which has been in the news a lot recently. A whole lot of news stories have been written trying more-or-less successfully to explain what's going on here, often with ill-fitting analogies to locks and keys, and it seems like a lot of people (including some of our presidential candidates) are just as confused about what's going on now as they were when the whole thing started. The Wired article above is really very good, but it's long, fairly-technical, and doesn't cover the non-technical side of things particularly well.<br />
<br />
So, since y'all asked, here are some of my thoughts on the case. I'm going to be kind of all over the map here, because I've gotten questions about the moral side of things as well as the technical. I'm going to mostly skip over the legal side of things (because I'm unqualified to comment), except for a couple of specific points.<br />
<br />
On the off-chance that someone stumbles across this who doesn't already know who I am, I'm a computer programmer, and I have worked on encryption and digital security software for a number of different companies, including 3 of the 5 largest PC manufacturers.<br />
<br />
I'm going to try to stay away from using any analogies, and just explain the actual technology involved as simply as I can, since I know you can handle a bit of jargon, and the analogy-slinging I see on Facebook isn't making things any clearer for people, as far as I can see. There will be links to Wikipedia articles in here. You don't need to read them, but they are there in case you want to read more about those subjects.<br />
<br />
First, a very quick run-down of what this is all about:<br />
<ul>
<li>The FBI has an iPhone that was used by Syed Rizwan Farook, one of the shooters in the <a href="https://en.wikipedia.org/wiki/2015_San_Bernardino_attack" target="_blank">San Bernardino shootings</a> last December.</li>
<li>The phone is locked (of course), and the FBI wants Apple to help them unlock it, and in fact has a court order requiring Apple to do so.</li>
<li>Apple is refusing to do what the FBI wants, for some fairly-complicated reasons.</li>
<li>A whole lot of people, including information security experts, law experts, and politicians, have weighed in on how they think this should go.</li>
</ul>
<br />
So, what's my take on all this?<br />
<h3>
<span style="font-size: large;">
Encryption does not work the way you might think it does, from watching movies or TV.</span></h3>
<br />
In the movies, you always see "hackers" running some piece of software that puts up a progress bar, and the software makes gradual progress over the course of seconds or minutes, until the encryption is "broken", and the spy gets access to the data they need. In the real world, unless the encryption implementation is fundamentally-broken by design, the only way to break in is by trying every possible key (we call this a "brute force attack"), and there are an enormous number of possible keys. You could get in with the very first key you try, or you might end up checking every possible key before you find the right one. Nothing about this process gives you any information about whether you're "close" to getting the right key, or whether you've still got billions of keys to try.<br />
<br />
<h4>
The data on the iPhone is encrypted with a key long enough that trying to decrypt it through brute force is essentially impossible.</h4>
The data on the iPhone is encrypted using <a href="https://en.wikipedia.org/wiki/Advanced_Encryption_Standard" target="_blank">AES</a>, the Advanced Encryption Standard, which was standardized by the US government for companies like Apple to use to secure data for their customers. As far as anybody knows, brute force is the only way to attack AES, and with a 256-bit key (as is used on the iPhone), it'd take literally billions of years to try every possible key, even if you used all of the computing power in the world.<br />
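<br />
If you want to check that math for yourself, here's a quick back-of-envelope calculation in Javascript. The "guesses per second" figure is my own deliberately-generous assumption, not a real benchmark:<br />
<br />
<pre>
// Rough estimate: how long would it take to brute-force a 256-bit AES key?
// Assumption (mine, and absurdly generous): 10^18 guesses per second,
// which is more than every computer on Earth combined.
var totalKeys = Math.pow(2, 256);        // about 1.16e77 possible keys
var guessesPerSecond = 1e18;
var secondsPerYear = 60 * 60 * 24 * 365; // about 3.15e7
var years = totalKeys / guessesPerSecond / secondsPerYear;
console.log(years.toExponential(2) + " years"); // ~3.67e+51 years
</pre>
"Billions of years" actually understates it, by quite a few orders of magnitude.<br />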
<br />
<h4>
Apple doesn't have that key to hand it over to the FBI</h4>
The key used to encrypt data on the iPhone is derived from a combination of a device-specific key, and the pass-code which the user has set on the phone. There's no way to extract the device-specific key from the phone, and there's no record of which phone uses which device-specific key. This is done on purpose, because if you could get that data, it'd make it much easier for anyone to extract your personal data from your phone.<br />
<br />
Given that you can't get the device-specific key, then even if all of the data was extracted from the phone, you'd be faced with performing a brute-force attack on the encryption (which is impossible, see above).<br />
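<br />
To make the "derived key" idea concrete, here's a sketch of the general shape of this kind of key derivation, using Node.js's built-in crypto module. To be clear, this is my illustration of the concept, not Apple's actual algorithm - the real thing runs inside dedicated hardware, with different primitives:<br />
<br />
<pre>
var crypto = require("crypto");

// Hypothetical stand-ins: on a real iPhone, the device key is fused
// into the silicon, and can never be read out by software.
var deviceKey = crypto.randomBytes(32); // unique per device, unreadable
var passcode = "1234";                  // whatever the user typed

// Derive the actual AES key by tangling the pass-code with the device
// key. The high iteration count makes each guess deliberately slow.
var aesKey = crypto.pbkdf2Sync(passcode, deviceKey, 100000, 32, "sha256");

// Without the device key, an attacker can't even start guessing
// pass-codes on another machine - every guess has to run on the phone.
</pre>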
<br />
<h4>
You don't need the device-specific key if you can guess the pass-code to the phone</h4>
Obviously, if the phone has a 4-digit pass-code, you only need to try 10,000 different codes in order to unlock it (0000-9999). You could sit an FBI intern down in a cubicle with the phone, and a day or so later, it'd be unlocked. That'd be a really boring shift for them, but you could still do it. If the phone has a 6-digit lock code, that becomes substantially less-convenient, and you're into the range of a full-time job for a year or more.<br />
<br />
But you might not be able to do that either, depending on the phone's settings. One of the security settings you can set on the iPhone is for it to erase the data on the phone after 10 incorrect password attempts. The FBI seems to think that this option is enabled for Farook's iPhone.<br />
<br />
<h3>
<span style="font-size: large;">
Here's what the FBI says that they want Apple to do</span></h3>
The FBI wants Apple to produce a custom version of iOS (the iPhone software), and load it onto Farook's iPhone, to enable them to quickly try all of the possible pass-codes.<br />
<br />
This custom software would:<br />
<br />
<ol>
<li>Disable the "erase after 10 incorrect codes are entered" feature (of course)</li>
<li>Allow the FBI to feed possible pass-codes to the iPhone from a connected computer, rather than requiring some poor intern to enter each one by hand.</li>
<li>Reduce the amount of time required between entering each code, so they can check them faster. That wouldn't matter much if there was a 4-digit code set, so maybe Farook used a longer code (see the back-of-envelope math after this list).</li>
</ol>
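<br />
Here's the back-of-envelope math on #3, by the way. Apple's security documentation says the pass-code key derivation is tuned to take about 80 milliseconds per attempt on the phone's hardware; the rest of this assumes the extra software delays and the 10-try erase are turned off, which is exactly what's being asked for:<br />
<br />
<pre>
// Time to try every possible pass-code, at ~80ms per attempt.
function daysToTryAll(digits, msPerAttempt) {
  var codes = Math.pow(10, digits);
  return (codes * msPerAttempt) / 1000 / 60 / 60 / 24;
}

console.log(daysToTryAll(4, 80));  // ~0.01 days - about 13 minutes
console.log(daysToTryAll(6, 80));  // ~0.93 days - under a day
console.log(daysToTryAll(10, 80)); // ~9,259 days - about 25 years
</pre>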
<br />
<br />
<h4>
Can Apple do it?</h4>
Apparently so, or at least Apple CEO Tim Cook hasn't made the claim that they can't comply with the court order, just that they shouldn't be required to. It probably would not be that much work, actually. Items 1 and 3 up there should be trivially-easy to change, and #2 is probably not a huge amount of work for someone who's familiar with interfacing the iPhone to a PC. Somewhere between "one guy working over the weekend" and "two guys working on it for a week" is probably a reasonable guess.<br />
<br />
<h3>
<span style="font-size: large;">
Here's why Apple says that they shouldn't be forced to do this</span></h3>
<br />
<h4>
It's a violation of their customers' privacy</h4>
Tim Cook says in his open letter that the FBI's request amounts to:<br />
<blockquote class="tr_bq">
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. </blockquote>
Earlier models of the iPhone were much simpler for Apple to bypass the pass-code on, and they've expended substantial effort over the last few revisions to make it much harder for people to break into iPhones (and newer ones are even more-secure than the phone in this case). This is valuable protection for the individual customers' data, and has contributed in large part to reducing the number of phones stolen, since they can be locked in such a way that they can't be easily re-sold. This same cryptographic technology is also what keeps trade secret information that's stored on businesspeople's phones from being copied as a matter of course every time they travel to any foreign country.<br />
<br />
<h4>
This is not a normal subpoena, it's a special court order</h4>
Normally, law enforcement agencies will get a court order to compel a company or individual to turn over information or physical evidence that is relevant to a particular investigation. Apple has cooperated in previous investigations (and even in this specific case) with those sorts of orders. This is something else entirely.<br />
<br />
Using the <a href="https://en.wikipedia.org/wiki/All_Writs_Act" target="_blank">All Writs Act</a>, an obscure 18th-century law, the FBI is trying to force Apple to engage in an activity that they wouldn't otherwise do (and which will have a negative impact on their business and customers). The All Writs Act has some significant restrictions in terms of when it can be invoked, but there's remarkably-little restriction on what a court can use it to order.<br />
<br />
Once the FBI successfully uses the All Writs Act to force Apple to produce a custom version of iOS, they will have established a precedent where they can use it to compel Apple (or any other technology company) to take whatever actions they think might be helpful to aid any investigation they might launch. Lest you think I'm veering into conspiracy-theory territory here, consider the following:<br />
<br />
<h3>
<span style="font-size: large;">
Several statements that the FBI has made to the court and in the news are either extremely naive or deliberately misleading.</span></h3>
The FBI has made statements both in their court filings and in the press which are simply untrue. If it weren't for the fact that the people making these claims are actual forensics experts (or work with such experts), I'd be inclined to say that they just don't know what they're talking about. Given that <i>they do work for the FBI</i>, I think it's reasonable to hold them to a much higher standard of clueful-ness.<br />
<br />
<h4>
It's just for this one phone for this one investigation</h4>
I can't believe that anybody would think they could get this argument past a judge. Of course, if this tool exists, the FBI (and every other police/security agency in the US and every other country in the world) will require that this custom firmware version be loaded on whatever iPhones they might have picked up in the course of an investigation. And it'd be so much easier if they could just hold on to the firmware themselves, and apply it themselves to iPhones where they have a warrant. This isn't even a "slippery slope" argument, it's just what will obviously happen.<br />
<br />
Several news articles have mentioned China, but really any country that has a poor human rights record would obviously misuse this tool, if it was available. In particular, the Gulf states have an atrocious record on human rights, and a track record of forcing technology companies to compromise customer security to make things easier on their state security agencies (See: Saudi Arabia and Blackberry).<br />
<br />
<h4>
There may be critical information on this phone that leads to other terrorists that Farook was in contact with.</h4>
It's very unlikely that there's actually any information on this phone that'd be useful to the FBI investigation. First off, this isn't even Farook's personal phone. It's the work phone that was issued to him by his employer, the County of San Bernardino. I mean, you should never underestimate the stupidity of criminals, but what kind of idiot would plan their attack on a county facility using their county-supplied phone?<br />
<br />
In any case, Farook destroyed his own personal phone, as well as a laptop and several other devices, before carrying out the attack. If he went to all that trouble to destroy evidence, it seems unlikely that he just plain forgot to destroy his work phone. It's much more-likely that there was never anything remotely-incriminating on it to begin with.<br />
<br />
Secondly, the FBI already has access to backups of that phone, all the way up to about a month before the attack. So at most, they'd be getting information that was added to the phone in the last month before the attack.<br />
<br />
And finally, almost all of the relevant data you might get from that phone is already in the FBI's hands through other channels. They've already got access to the call records, emails, and other communications from that phone and Farook's other devices.<br />
<br />
<h4>
Apple can make this hack so that it only works on this one iPhone, eliminating any risk to Apple's other customers.</h4>
Well, sure, in a trivial sense. In a much more-significant sense, this is a content-free statement. In the trivial sense, Apple can of course add extra code to this custom version of iOS so that it only works on Farook's phone. But in practice, they can't limit it to a single phone - they have to test the software, which means it has to be installable on at least two phones. And it'd obviously be trivial to change which phones it works on later, which brings us back to the original "it's only for this one phone" nonsense above.<br />
<br />
Additionally, this runs into conflict with the requirements of the "All Writs Act", which is the justification for this order. They're not allowed to create an "undue burden" on Apple, and having Apple set up a whole new internal process for creating thousands of custom versions of iOS for every investigation in which it might be useful is not a trivial thing.<br />
<br />
Right now, Apple needs to be very careful about which OS updates it digitally "signs", which is the process that's needed to allow the software to be installed on customers' phones. There are hundreds or maybe thousands of Apple employees who have access to the tools and the source code to make changes in iOS. But that final step of signing an update is necessarily restricted, because the key for that process allows you to say to the world "this software is approved by Apple". They're presumably quite careful with that key. You can make the argument (and people have) that digitally-signing a file is essentially the same as a physical signature, and you shouldn't be able to compel someone to sign something digitally any more than you can legally compel them to sign a physical piece of paper.<br />
<br />
I don't know about Apple, but at one of my former employers, we kept our code-signing key, and the laptop with the software for signing our updates, physically locked up in a safe. The key was actually split into multiple parts, which were only accessible to certain people. Because if you can sign software, you can make it do anything you want. You can remove the DRM which is used to secure purchased content, steal all of a customer's personal data, anything.<br />
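<br />
For the curious, here's the basic shape of code-signing, sketched with Node.js's crypto module. Real firmware signing involves certificate chains and hardware security modules, but the core idea is just this: a signature that anyone can check, but that only the private-key holder can produce:<br />
<br />
<pre>
var crypto = require("crypto");

// One-time setup: the vendor's key pair. The private key is the crown
// jewel; the public key is baked into every device.
var keys = crypto.generateKeyPairSync("rsa", { modulusLength: 2048 });

// Signing an update (happens in the carefully-guarded room):
var update = Buffer.from("...firmware image bytes...");
var signature = crypto.createSign("sha256").update(update).sign(keys.privateKey);

// Verifying (happens on every device, before installing anything):
var ok = crypto.createVerify("sha256").update(update)
               .verify(keys.publicKey, signature);
console.log(ok); // true - and false for any modified image
</pre>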
<br />
<h3>
<span style="font-size: large;">
There's a larger issue at stake here - the very idea of privacy is under attack</span></h3>
Ever since the ratification of the Bill Of Rights, there has been a back-and-forth argument in this country over the right balance between the citizen's right to privacy, and the state's need for security. Since the development of effective commercial cryptography in the late 20th century, the debate has gotten significantly more-heated.<br />
<br />
<h4>
Privacy is a basic right, guaranteed by the Bill of Rights here in the US</h4>
The 4th Amendment to the US Constitution says:<br />
<blockquote class="tr_bq">
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.</blockquote>
<br />
This controls the sorts of searches that the police (FBI, etc) can perform. In particular, they need probable cause, and a court-issued warrant. Over the last few centuries, that's been dialed back a bit, and US citizens are routinely searched without a warrant, and without probable cause. But there are still limits, and if you, your home, or your stuff is unreasonably-searched, you can contest that in court (and you might even win).<br />
<br />
<h4>
When the constitution was written, the founding fathers could not have imagined the sort of surveillance technology we have today.</h4>
In 1789, if you wanted to have a private conversation with a friend or family member, you could take them aside into a room, or go for a walk in the woods, and if you couldn't see anybody else, chances are nobody would overhear what you had to say. With a wax seal on your mail, you could see whether it had been tampered with (or read by someone else) in transit.<br />
<br />
Modern communication systems (email, telephone, chat) are much easier to listen in on, and when new technology comes along, it has typically taken a while for the Supreme Court to come to the conclusion that whatever new-fangled communication system you use, it's essentially the same as a letter, for legal purposes. Tapping phone lines without a warrant used to be totally legal. Same with intercepting email and other electronic communications.<br />
<br />
The question of whether or not you can be compelled to unlock your own phone, even if it contains potentially-incriminating evidence, is possibly still open, despite the fact that that seems like an obvious violation of the 5th Amendment.<br />
<br />
<h4>
Strong encryption flips the balance of privacy back to the way things were in the 18th century</h4>
When you have access to strong encryption, you have privacy by default. This is as it should be. Until the early 1990s, most encryption that was available commercially was just terrible. Since the development of the World Wide Web, the level of sophistication of the cryptography available to individuals and commercial users has vastly improved.<br />
<br />
<h4>
The US government has fought the availability of effective encryption for decades</h4>
After World War II, a war which the Allies won in part <a href="https://en.wikipedia.org/wiki/Ultra" target="_blank">due to their ability to decrypt</a> German secret messages, the US government set up the <a href="https://en.wikipedia.org/wiki/National_Security_Agency" target="_blank">NSA</a> to ensure that they had a lead in cryptographic technology. And until the growth of academic cryptographic research in the 1980s and 1990s, their expertise was unmatched. The NSA has a weird double mission. On the one hand, they're supposed to protect US military and civilian communications from foreign spies. On the other side, they're supposed to develop ways to break encryption used by other organizations, to support US intelligence-gathering. When it comes to commercial encryption, these goals are directly in conflict.<br />
<br />
When the first truly effective encryption systems began to become commercially available, the NSA tried to keep their ability to listen in on communications by restricting the length of keys that could be used in software that was being exported. Eventually, it became obvious that that was only going to disadvantage US software makers, and the restriction was lifted.<br />
<br />
During the Clinton administration, the NSA proposed <a href="https://en.wikipedia.org/wiki/Clipper_chip" target="_blank">Clipper</a>, a cryptography system that would make it easy for law enforcement to listen in on communications (with a warrant, at least in principle), but would be very difficult for foreign governments, hackers, and others to break. It turned out to have a number of fundamental flaws, and was pretty quickly killed.<br />
<br />
More-recently, the NSA has possibly been caught <a href="https://en.wikipedia.org/wiki/Dual_EC_DRBG" target="_blank">inserting a flaw</a> into a security standard that they helped develop.<br />
<br />
<h3>
<span style="font-size: large;">
Law enforcement and security agencies now have much greater ability to collect data that's not specifically protected with encryption</span></h3>
Despite better security of communications overall, the security apparatus has continued to press the boundaries of what information they can gather without a proper warrant. Here are a few recent(ish) examples.<br />
<br />
<h4>
The FISA court</h4>
In order to allow Federal law enforcement and intelligence agencies to obtain search warrants, without having to publicly disclose what they're searching for, and who they're searching, Congress created a parallel court system, the <a href="https://en.wikipedia.org/wiki/United_States_Foreign_Intelligence_Surveillance_Court" target="_blank">Foreign Intelligence Surveillance Court</a>. This court provides search warrants, and has been involved in issuing court orders to compel commercial companies to cooperate with the NSA in collecting data, including information on US citizens, which the NSA is explicitly barred from collecting.<br />
<br />
<h4>
Telephone metadata collection</h4>
The NSA has been, for many years, collecting telephone meta-data (who's calling whom) for essentially every telephone call placed inside the United States (and several other countries). This only came to light because of Edward Snowden's whistle-blowing, because <i>of course</i> they got the authority for that from the secret FISA court.<br />
<br />
<h4>
StingRay</h4>
The <a href="https://en.wikipedia.org/wiki/Stingray_phone_tracker" target="_blank">StingRay</a> system is basically a "fake" cell tower that broadcasts a signal that causes every mobile phone within range to report its location. They can be used to track the location of mobile-phone users in bulk, and can also apparently be used to intercept calls. These systems have been provided to local police forces via grants from the Department of Homeland Security, and they're used in a variety of ways that are at best quasi-legal (in that they haven't been specifically declared illegal <i>yet</i>).<br />
<br />
<h4>
Automated number plate readers</h4>
<a href="https://en.wikipedia.org/wiki/Automatic_number_plate_recognition" target="_blank">These machines</a> are just about everywhere. They're used to automatically collect tolls, the police use them to search for cars that are associated with wanted criminals, repo men use them to look for cars that the owners have stopped making payments on, etc, etc. My local city has them mounted on the parking enforcement golf-carts, so they can just cruise down the street and collect the location and license plate numbers of every single car parked anywhere in the city.<br />
<br />
And again, there's no law telling either the police or the private companies what they can or can't do with this information, so it gets used (and mis-used) for all sorts of things. The police have no need and no right to know where my car is at all times, as long as I'm actually following the parking rules.<br />
<br />
<h3>
<span style="font-size: large;">What happens now?</span></h3>
I think there's a good chance that the court will make the "right" decision here, and side with Apple after considering their response. Either way, you should expect that Apple (and other manufacturers) will make additional efforts to ensure that they themselves cannot circumvent their own security systems. If the court makes the "wrong" decision, then there will be a whole lot more of these court orders issued in the near future, and that's bad news for privacy.<br />
<br />
<br />Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com1tag:blogger.com,1999:blog-14034177.post-41988794845936372832014-09-08T12:23:00.001-07:002014-09-08T12:29:33.607-07:00Predictions for Apple's big announcement event tomorrow<div style="font-family: Helvetica; font-size: 12px;">
So, Apple has scheduled some new product announcements tomorrow, which will certainly include a new iPhone (it’s the right time of year for that). There’s a lot of buzz on the internet about the event, based on oblique references from various Apple employees that this event is about much more than just a new iPhone.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
Despite the fact that I haven’t worked there in a decade, some people have asked me what I think Apple’s going to announce. For everybody’s amusement, here are my predictions, so we can all have a good laugh about them tomorrow. But first, some background:</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<h2>
I’m really bad at this</h2>
<div style="font-family: Helvetica; font-size: 12px;">
As many of my friends and family already well know, I have a history of really, really bad predictions of what Apple will and won't do. A couple of notable failures from the past include:</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
<b>“Apple wouldn’t <i>buy</i> NeXT. That would make no sense. They might license some of the technology”</b></div>
<div style="font-family: Helvetica; font-size: 12px;">
When I said this, Apple was actually in negotiations to purchase NeXT, which ended up being their largest acquisition value-wise, until they acquired Beats Electronics this year.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
<b>“Mac OS X will never ship. It’s a doomed project”</b></div>
<div style="font-family: Helvetica; font-size: 12px;">
This was while I was working on the OS X team, and more than a little depressed at the level of infighting and backstabbing going on between various teams. It took almost another year, but Mac OS X 10.0 did actually ship.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
<b>“Clearly, the Mac will be transitioning to a new architecture again. It won’t be X86, though”</b></div>
<div style="font-family: Helvetica; font-size: 12px;">
I had assumed X86-64 on AMD processors was the new target. I take some satisfaction from the fact that Apple relatively-quickly replaced the 32-bit X86 processors in Macs with 64-bit-capable ones. I *almost* got this one right, but I underestimated how much influence non-technical factors would have on the decision.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
That’s a common theme amongst many of the times that I mis-predict what Apple is going to do - because I’m this hyper-logical engineer-type person, it always surprises me when they do something that’s not the “right” decision technically, but makes sense economically or in some other way.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<h2>
Predictions</h2>
<div style="font-family: Helvetica; font-size: 12px;">
Okay, so here are my logical predictions, almost none of which will likely come to pass.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<h3>
What I think of the popular rumors</h3>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
<b>iPhone 6</b></div>
<div style="font-family: Helvetica; font-size: 12px;">
No doubt that this is going to be announced. It’ll be lighter, better battery life, faster. Rumors are that there will be a physically much-larger model, with a 5.5 inch screen. That’s totally ridiculous. We’ve all seen someone using one of those massive Android phones, and I think we can all agree that they look like total dorks. No way that Apple is going to make an iPhone that you have to use both hands to use.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
<b>iWatch</b></div>
<div style="font-family: Helvetica; font-size: 12px;">
Not a chance in hell that Apple will produce a smart watch like the Galaxy Gear or Moto 360. Again with the "dork" factor - who even wears a watch these days? I haven't worn a watch since I got my first Palm Pilot, back in the day. My iPhone goes with me nearly everywhere, already. I look at higher-end wristwatches, and I can appreciate the craftsmanship, but I have no more interest in wearing them than any other piece of jewelry. If Apple does introduce a piece of "wearable technology", then it won't be a conventional watch. I could see something playing up the health-monitor angle, but a wristwatch? No way. A $300 accessory for my iPhone that saves me the effort of pulling my phone out of my pocket to read the calendar notifications? Ridiculous.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<h3>
“Obvious” things, which I haven’t seen rumors about</h3>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
<b>New Macs</b></div>
<div style="font-family: Helvetica; font-size: 12px;">
Weirdly, there’s not much buzz about this in the rumor-sphere. There was a little bit of buzz about that early on, given that the event is at the Flint Center, where the introduction of the original Macintosh was held, as well as the iMac, the machine that saved the whole Macintosh line. But the rumor mill died out, partly due to lack of information, and I think partly due to people being unable to figure out <i>how</i> a new Mac development would be any kind of big deal.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
What kind of announcements could they make about the Mac that’d revitalize that line, and the company, again? There are a couple of “obvious” things they could do, based on the technology that Apple’s products are built on, and recent changes in their products.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
<b>A new Macbook Air, based on a 64-bit ARM processor</b></div>
<div style="font-family: Helvetica; font-size: 12px;">
The 64-bit ARM processor in the iPhone 5s and iPad Air is <i>this close</i> to being a capable desktop replacement, and we already know that OS X runs on ARM (after all, iOS is basically OS X with a few additions/deletions, and maintaining processor-neutrality is something Apple’s been focused on since the Intel transition).</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
What would you get with this new Mac? All-day battery life, at least, but given that you could then run both iOS and OS X on the same hardware, it would make even more sense to unify them. There are already *far more* applications for iOS than for OS X, and integrating iOS app support would tie in nicely with the changes to full-screen mode that we’ve seen in recent versions of OS X.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
The Mac App Store already exists, so for people writing OS X apps, it’d be a simple re-compile to target the new architecture. Also, the most-recent Mac Pro was a design focused much more on exploiting the GPU, rather than being dependent on the CPU. Any apps that are optimized for the Mac Pro will run great on an ARM machine with a proper GPU. All in all, the pain of moving to a new architecture would be much lower now than it was for the Intel transition in 2005/2006.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
<b>30th Anniversary Mac</b></div>
<div style="font-family: Helvetica; font-size: 12px;">
Oh, and it’s the 30th anniversary of the Mac, so a “30th Anniversary Mac” seems like a good bet. Not sure whether that’d be a new iMac (to keep with the all-in-one form factor), or a laptop (the most-popular Mac form factor these days). Unlike the much-mocked “20th Anniversary Mac”, I expect this to actually be a product that the average Mac user would want, and that they’ll actually be able to buy.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
<b>Apple TV finally “grows up” and stops being a “hobby” for Apple</b></div>
<div style="font-family: Helvetica; font-size: 12px;">
As someone who’s worked on Smart TV software, I can tell you that the problems with the TV watching experience today aren’t really the sorts of things you can solve with better hardware and software. They’re structural problems in the way TV content is produced, delivered, and consumed. Why does your fancy digital cable box take 2 seconds to switch channels? Why is your DVR’s interface so ugly? Why can’t you watch back episodes of your favorite show for any reasonable price, until they’re released on DVD? Why is it so *much* more convenient to pirate content than it is to pay for it?</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
It’s all down to *lack of integration* - the cable company, the folks making the set-top box, and the people making the content that makes having cable worthwhile all work for different companies, with different goals.</div>
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<br /></div>
<div style="font-family: Helvetica; font-size: 12px;">
Apple has enough money in the bank to outright buy AMC, CBS, HBO and Netflix with cash. They’d have to borrow a little bit to buy Comcast, but not actually all that much. That’d change the TV landscape a bit, I think.</div>
<br />
<div style="font-family: Helvetica; font-size: 12px; min-height: 14px;">
<h2>
Conclusions</h2>
<div>
Based on my previous track record, here's what I think you should expect:</div>
<div>
<ul>
<li>iPhone 6, with a comically-large screen. In a year, I'll deny I ever mocked "phablets" as a bad idea.</li>
<li>An iWatch (not with that name), which is a "me too" smart watch. People will buy it, because it'll be oh-so pretty. But in a year, nobody will be wearing one anymore.</li>
<li>No new Macs, except for an utterly-unremarkable "30th Anniversary Macintosh", which will be a gold-plated turd, just like the 20th Anniversary Mac</li>
<li>Apple TV continues to be that weird box that your Apple fanboy friend / relative has, that you just don't "get" why they have it.</li>
</ul>
</div>
</div>
Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com2tag:blogger.com,1999:blog-14034177.post-45481122782615005362013-02-05T00:35:00.002-08:002013-02-06T21:37:40.254-08:00One down, 11 to go<h2>
January OneGameAMonth post-mortem</h2>
January is over, and I'm done working on Rocks! (for now, at least), and it's time to go over what worked, what didn't, and what I'll do differently for February.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://3.bp.blogspot.com/-ChUlYphjmvc/URC-YRf-3EI/AAAAAAAAALU/F2tcVeC-a9k/s1600/rocks3.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="242" src="http://3.bp.blogspot.com/-ChUlYphjmvc/URC-YRf-3EI/AAAAAAAAALU/F2tcVeC-a9k/s320/rocks3.jpg" width="320" /></a></div>
<div>
<br /></div>
<div>
First, here's a link to the current version:</div>
<div>
<a href="http://www.mesasteps.com/Rocks" target="_blank">Rocks!</a></div>
<div>
<br /></div>
<div>
And here's the Github repository with the source code:</div>
<div>
<a href="https://github.com/mbessey/Rocks" target="_blank">Repo!</a></div>
<div>
<br /></div>
<div>
<b>What I was trying to do:</b></div>
<div>
This was the first month of the <a href="http://onegameamonth.com/" target="_blank">One Game A Month</a> challenge, and I really wanted to make sure I finished something, so I'd get started off on the right foot. To that end, I tried to shrink the scale of what I was trying to do for January to something I was sure I'd be able to finish. Rather than design a game from scratch, I started with a well-known design, and implemented it on an unknown (to me) technology stack. So, I decided to do a clone of Asteroids, running in the web browser, using the <a href="http://www.whatwg.org/specs/web-apps/current-work/multipage/the-canvas-element.html" target="_blank">canvas</a> element for graphics, and the <a href="https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html" target="_blank">Web Audio API</a> for sound.</div>
<div>
<br /></div>
<div>
I wanted to produce something with a retro feel, true to the spirit of the original, even if it wasn't exactly the same in execution. And I decided to do the whole thing without the use of any frameworks or libraries, both because I thought that the game was simple enough that I could just bang it out without much help, and because I wanted to actually learn the browser APIs, not some third-party library.</div>
<div>
<br /></div>
<div>
<b>What went right:</b></div>
<div>
<i>Got something working very fast, then iterated</i></div>
<div>
By the end of the first week, I had a playable game, if not a very interesting one. That took a lot of the pressure off, knowing that even if I ran out of time, I'd have *something* to show for it.</div>
<div>
<br /></div>
<div>
<i>Scope creep (mostly) avoided</i></div>
<div>
Although lots of really great ideas came to me while working on Rocks!, I managed to avoid the temptation to add in a bunch of extra features. I feel especially good about this given that I didn't quite meet the initial goals - I'd have felt a lot worse if I didn't manage to make a complete game, because I'd gotten distracted by doing something cool, but not part of the core gameplay.</div>
<div>
<br /></div>
<div>
<i>Proper "retro-twitch" feel</i></div>
<div>
I spent a fair amount of time tweaking the controls, to get ship movement that felt "right". I think this is something that really distinguishes my game from the other Asteroids-like games that were submitted to OneGameAMonth last month. My ship is very responsive, it turns and accelerates quickly enough to get out of trouble, which makes the player feel like they're in control of their own fate.</div>
<div>
<br /></div>
<div>
<i>No Art</i></div>
<div>
I didn't want to spend a lot of time drawing terrible art that I then hated. I figured that going with the vector approach would encourage (enforce?) a simple graphical design, and save me from spending hours tweaking art trying to make it look less goofy. My inability to draw well is going to be an ongoing issue for the next 11 games, too.</div>
<div>
<br /></div>
<div>
<i>I "Finished" on time</i></div>
<div>
Actually a bit ahead of time. Which is good, because a bunch of "real world" stuff came up in the last few weeks of January.</div>
<div>
<br /></div>
<div>
<b>What went wrong:</b></div>
<div>
<i>Spent much more time on art & sound than expected</i></div>
<div>
Despite the fact that I went with a totally minimalist look & sound, I still had to do a fair amount of tweaking. But with everything defined in code (see next item), it was pure tedium to make any changes in the graphics or sound.</div>
<div>
<br /></div>
<div>
<i>No creative tools</i></div>
<div>
I ended up doing the entire art design by sketching things out on graph paper and manually copying down the coordinates into my code. This wasn't *terrible*, but it was tedious and error-prone. I didn't produce an editor for shapes and sounds because that sounded like more work than actually finishing the game. For *this* game, that was arguably true - but a couple of features got left out, rather than going through the process of manually designing graphics & sound for them. I'm planning on using the same technologies in future games, so I'll be able to amortize the effort to produce the tools over several projects. Conveniently enough, the optional theme element for OneGameAMonth February is "sound", so I'll have good incentive to build (or find) at least a rudimentary sound editor.</div>
<div>
<br /></div>
<div>
<b>What ended up on the cutting-room floor:</b></div>
<ul>
<li>High score board</li>
<li>Touch controls</li>
<li>Enemies</li>
<li>Hyperspace</li>
</ul>
<div>
<br /></div>
<div>
These are all things I intended to do, but just didn't get around to. Technically, there is a high-score board - it just doesn't allow you to put in your initials. That's because I didn't feel like I could implement initial entry without making some major changes elsewhere.<br />
<br />
I didn't do touch controls for keyboard-less tablets and phones because I wanted to present the controls as part of a kind of virtual arcade-cabinet panel. I never did come up with a design for that panel that I liked, so you still can't play the game on your iPad.<br />
<br />
Enemies were going to be UFOs like in Asteroids, with an occasional power-up coming from each enemy shot down. I think I could get a fairly rudimentary version of alien AI done in a couple of days, but I just ran out of time.<br />
<br />
<b>What about February?</b><br />
February will be crazy busy for me, so I'll be setting my sights low for this month as well. The massively-multiplayer infinite-world Sci-Fi adventure game will have to wait a month or two.<br />
<br />
Amongst other things, I will be adopting some helpful libraries and/or frameworks, rather than trying to do everything myself. In particular, it'd be an interesting exercise to build a videogame using the <a href="http://www.enyojs.com/">Enyo.js</a> framework, since we've never really pushed that particular use case, Enyo being more focused toward native-equivalent mobile productivity apps.</div>
Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com3tag:blogger.com,1999:blog-14034177.post-73263657438959824172013-01-11T16:09:00.002-08:002013-01-11T16:20:43.532-08:00Rocks! Update #2 - it's a game<h2>
It's an actual game now!</h2>
So, first things first - here's the current version of Rocks!<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<img border="0" height="304" src="http://3.bp.blogspot.com/-W8BHe353B_s/UPCijWbGmjI/AAAAAAAAALA/YPbFK_HFcNg/s400/rocks2.jpg" width="400" /></div>
<a href="http://www.mesasteps.com/Rocks_v2/" target="_blank">Rocks!</a><br />
<br />
New features include:<br />
<br />
<ul>
<li>updated graphics - random rock shapes, and a progression of sizes</li>
<li>on-screen instructions</li>
<li>better sounds</li>
<li>proper collision detection</li>
<li>particle effects when things are destroyed</li>
<li>more than one level</li>
<li>a "shield" that will prevent rocks from running into you</li>
</ul>
<br />
It's looking a lot more like a real game now.<br />
<h3>
</h3>
<h3>
Sound design is hard</h3>
Oddly enough, the hardest thing for me so far has been making those decidedly "retro", simple sound effects. The <a href="https://dvcs.w3.org/hg/audio/raw-file/tip/webaudio/specification.html" target="_blank">Web Audio API</a> is very powerful, but it's also very much focused on doing sophisticated manipulation of sampled sound. I certainly could have grabbed appropriate sampled sounds, or built some in Audacity, but I wanted to push the "classic" feel of the thing, and I thought: "I've done this sort of thing before - how hard can it be?" Besides, attaching a couple of huge sample files to a game that's currently under 20kb total in size felt a bit like the tail wagging the dog.<br />
<br />
Of course, the last time I tried to create synthesized sounds from scratch was probably 30 years ago, on an 8-bit home computer with a <a href="http://en.wikipedia.org/wiki/SN76489" target="_blank">fixed-voice synthesizer chip</a>. There's something to be said for the existence of fewer choices helping to focus your efforts. When you're faced with an API that supports multi-channel surround sound, arbitrary frequency- and time-domain manipulation, 3-D positional audio, dynamics compression, and all the rest, it's a little difficult to figure out how to just make a simple "beep".<br />
<h3>
</h3>
<h3>
Here's what I've learned so far about using the Web Audio API:</h3>
<b>Web Audio is based on a connected graph of nodes, leading from one or more sources through the graph to the ultimate audio output</b><br />
This is enormously-flexible, and each of the individual node types is just about as simple as it can be to do the thing it's designed for. There's a "gain" node that just multiplies the input by a constant and feeds it to the output, for instance. The source nodes don't have individual volume controls (because there's the gain node for that).<br />
<br />
There's one weird quirk to my old-school sensibilities, which is that every note requires making another source node and connecting it to the node graph. When a note stops playing, the source node is automatically removed and garbage collected. If you want to play the same sound over and over, you're continuously creating and destroying nodes and connecting them to the graph.<br />
<b><br /></b>
<b>There's a simple oscillator source node that's very flexible</b><br />
You can easily create an oscillator that uses an arbitrary waveform (square, triangle, sine, or user-defined), plays at a specific frequency, and starts and stops at a specific time. This is about 80% of what you need to make a "beep", but:<br />
<br />
<b>Oddly, there's no built-in ADSR envelope support</b><br />
Back in the day, we'd set up ADSR (attack, decay, sustain, release) parameters for a sound, which would control how quickly it came up to full volume, how loud it was as the note progressed, and how quickly it faded. There are probably about 10 different ways to do the same thing in Web Audio, but nothing with the same simplicity.<br />
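<br />
For reference, here's the simplest ADSR substitute I've found: scheduling ramps on a gain node's AudioParam. (I'm using the newer start()/stop() method names here; this API is still being renamed out from under us, so treat this as a sketch.)<br />
<br />
<pre>
// Play a "beep" with a simple ADSR volume envelope, built from
// scheduled ramps on a gain node. All times are in seconds.
function beep(ctx, freq, attack, decay, sustain, release, hold) {
  var osc = ctx.createOscillator();
  var env = ctx.createGain();
  osc.type = "square";
  osc.frequency.value = freq;
  osc.connect(env);
  env.connect(ctx.destination);

  var now = ctx.currentTime;
  env.gain.setValueAtTime(0, now);
  env.gain.linearRampToValueAtTime(1.0, now + attack);             // Attack
  env.gain.linearRampToValueAtTime(sustain, now + attack + decay); // Decay to Sustain
  env.gain.setValueAtTime(sustain, now + hold);                    // hold the note
  env.gain.linearRampToValueAtTime(0, now + hold + release);       // Release

  osc.start(now);
  osc.stop(now + hold + release);
}
// e.g. beep(audioContext, 440, 0.01, 0.05, 0.6, 0.2, 0.3);
</pre>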
<br />
<b>There's no simple white-noise source</b><br />
This is a bit of a weird omission, in that noise sources are the basic building blocks of a lot of useful effects, including explosions, hissing, and roaring noises. And again, there's probably 10 different ways to solve this with the existing building blocks, each with their own limitations and quirks. I ended up using Javascript to create a buffer of random samples, which I could then feed through filters to get the appropriate noises for thrust and explosions.<br />
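<br />
Here's roughly what that looks like - fill a buffer with random samples, then shape it with a filter (the cutoff value below is just whatever sounded right to me):<br />
<br />
<pre>
// One second of white noise, colored by a low-pass filter to get a
// rumbling "thrust"-style sound.
function playNoise(ctx, seconds, cutoffHz) {
  var frames = ctx.sampleRate * seconds;
  var buffer = ctx.createBuffer(1, frames, ctx.sampleRate);
  var data = buffer.getChannelData(0);
  for (var i = 0; i &lt; frames; i++) {
    data[i] = Math.random() * 2 - 1;  // uniform noise in [-1, 1]
  }

  var source = ctx.createBufferSource();
  source.buffer = buffer;

  var filter = ctx.createBiquadFilter();
  filter.type = "lowpass";
  filter.frequency.value = cutoffHz;  // lower cutoff = more rumble

  source.connect(filter);
  filter.connect(ctx.destination);
  source.start(0);
}
// e.g. playNoise(audioContext, 1, 400);
</pre>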
<br />
<b>The API is very much a work in progress</b><br />
Despite the fact that I wasn't trying to do anything particularly sophisticated, I ran into a few bugs in both Safari and Chrome. I imagine a certain amount of this is to be expected with an in-development API that hasn't been standardized yet.<br />
<h3>
<b><br /></b></h3>
<h3>
<b>Next Up: Enemies!</b></h3>
The next big feature for Rocks! is to have some enemies to chase you around and shoot at you.<br />
<br />Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com4tag:blogger.com,1999:blog-14034177.post-59083523870046002102013-01-05T10:17:00.001-08:002013-01-05T10:46:17.149-08:00One Game a Month, One Blog a Month?<h1>
A New Year Brings a Fresh Start</h1>
I swear, I'm not going to start this post out with how disappointed I am at my lack of writing output over the last year. Oops...<br />
<h3>
<br />
</h3>
<h3>
The Problem</h3>
<div>
No matter how much I promise myself I'm going to update my blog more often, it tends to languish. I have a bunch of half-written articles waiting to be published, but in the absence of any compelling deadline, I can continue to look at them as "not quite ready for public view" for forever.</div>
<h3>
<br />
</h3>
<h3>
A possible solution</h3>
Something I've seen work really well for other people who struggle with producing consistent output is what I think of as a "creative challenge" - things like the "take a picture every day for a year" challenge that a lot of people are doing to improve their photography.<br />
<br />
I just can't face the idea of a "blog a day" challenge, though - I like the idea of something a little more long-form, and a daily deadline would force me to cut corners to an extent I'm not ready for yet.<br />
<br />
So instead, I signed up for the <a href="http://onegameamonth.com/" target="_blank">OneGameAMonth</a> challenge. Game design is one of my non-programming passions, so I feel like I'll be able to stay motivated and really try to see this through. A month is a long-enough deadline that I feel like I can produce something worth examining, and the practical problems and "stuff I learned along the way" should provide ample material for *at least* one blog entry a month.<br />
<h3>
<br />
</h3>
<h3>
The Plan</h3>
I haven't planned the whole 12 months out yet, but here's what I do know about my plans:
<br />
<ul>
<li>I will create a variety of games in different formats, including video games, board games, and card games</li>
<li>I will explore different genres in each format</li>
<li>Everything I do will be open-source on <a href="http://github.com/mbessey" target="_blank">my Github account</a></li>
<li>I will write <u>at least</u> one blog entry every month, about the current game</li>
<li>If I don't finish a game in a particular month, I will not give up - I'll just do something less ambitious for the next month</li>
</ul>
<h3>
<br />
</h3>
<h3>
The Proof</h3>
<div>
And to prove that I'm not completely full of it, here's the in-progress game for January, after two days of after-hours hacking:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-nVj12iFeUH8/UOh0nMhaTSI/AAAAAAAAAKw/6ZXAwnL4Oyc/s1600/rocks.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="242" src="http://1.bp.blogspot.com/-nVj12iFeUH8/UOh0nMhaTSI/AAAAAAAAAKw/6ZXAwnL4Oyc/s320/rocks.jpg" width="320" /></a></div>
It's named <a href="http://www.mesasteps.com/Rocks_v1/" target="_blank">Rocks!</a></div>
<div>
<br />
And here's <a href="https://github.com/mbessey/Rocks" target="_blank">the GitHub repository</a> for it.<br />
<div>
<br /></div>
<div>
This is an HTML5 Canvas & WebAudio version of the old Asteroids arcade game. Because it uses some cutting-edge web features, it only runs properly in recent WebKit-based browsers. That's Google Chrome and Safari. Future games will likely be more cross-platform, but I wanted to learn a bit about the Web Audio API.</div>
<h3>
<br />
</h3>
<h3>
What I've learned on this project so far</h3>
This first version is very limited, and frankly pretty buggy:<br />
<ul>
<li>There's no proper collision detection - it's hard to die, unless you <b>try</b> to hit a rock with the ship</li>
<li>The asteroids don't start larger and break up into smaller ones</li>
<li>There's no level progression, and no game-over when you die 3 times</li>
<li>No enemy UFOs yet</li>
<li>There are missing sound & visual effects</li>
</ul>
And the code is, frankly, a mess. But on the other side, there's a lot I've learned over the last two days:<br />
<ul>
<li>All of the rendering is done using the Canvas line-drawing primitives</li>
<li>The sounds are synthesized on-the-fly using Web Audio units instead of sampled sounds</li>
<li>The animation is driven using requestAnimationFrame, so it should throttle back when in the background (see the sketch after this list)</li>
<li>The whole thing is less than 11k in size, and there's about 400 lines of Javascript in the main game file. That's smaller than a typical iOS app icon...</li>
</ul>
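<div>
<br /></div>
<div>
For anyone curious, here's the stripped-down shape of that requestAnimationFrame loop. This is not the actual Rocks! code (that's in the repository) - the update() and draw() functions, and the "game" canvas id, are placeholders for the game's own pieces:</div>
<div>
<br /></div>
<pre>
// Minimal requestAnimationFrame game loop: update by elapsed time,
// then redraw. The browser stops scheduling frames for hidden tabs.
var canvas = document.getElementById("game"); // your canvas element
var ctx2d = canvas.getContext("2d");
var lastTime = null;

function frame(timestamp) {
  if (lastTime !== null) {
    var dt = (timestamp - lastTime) / 1000; // seconds since last frame
    update(dt);  // move the ship, rocks, and bullets
  }
  lastTime = timestamp;

  ctx2d.clearRect(0, 0, canvas.width, canvas.height);
  draw(ctx2d);   // stroke the vector shapes
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
</pre>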
</div>
Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com1tag:blogger.com,1999:blog-14034177.post-54327149267096790512012-02-10T09:46:00.000-08:002012-02-10T10:10:33.005-08:00The simplest possible computer<p style="text-align: left;margin-top: 0px; margin-right: 0px; margin-bottom: 0px; margin-left: 0px; font: normal normal normal 12px/normal Helvetica; "><b><span class="Apple-style-span" style="font-size:130%;">The simplest possible computer</span></b></p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica">So, if we were going to build a model of the simplest possible computer, where would we start? As it turns out, you probably have such a model in your home already. </p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica; min-height: 14.0px"><br /></p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica">Many homes have what's known as a "three-way" switch, which is a light switch that you can turn on and off from two different locations. This circuit can be used as a simple digital computer.</p><br /><a href="http://1.bp.blogspot.com/-svNmpEGngx4/TzVdEG41_CI/AAAAAAAAAKM/GtFFdJzlLNo/s1600/Switches.png"><img style="display:block; margin:0px auto 10px; text-align:center;cursor:pointer; cursor:hand;width: 320px; height: 128px;" src="http://1.bp.blogspot.com/-svNmpEGngx4/TzVdEG41_CI/AAAAAAAAAKM/GtFFdJzlLNo/s400/Switches.png" border="0" alt="" id="BLOGGER_PHOTO_ID_5707570427911863330" /></a><p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica; min-height: 14.0px"><br /></p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica">By properly labeling the switch positions and the light bulb, we can use them to solve a logic problem.</p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica; min-height: 14.0px"><br /></p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica">Let's say that you need a system to tell you whether to have dessert with your lunch, but you have some specific rules to follow:</p> <p style="margin: 0.0px 0.0px 0.0px 49.0px; text-indent: -12.0px; font: 12.0px Helvetica">1. If you have a salad for lunch, you'll have dessert.</p> <p style="margin: 0.0px 0.0px 0.0px 49.0px; text-indent: -12.0px; font: 12.0px Helvetica">2. If you have soup for lunch, you'll have dessert.</p> <p style="margin: 0.0px 0.0px 0.0px 49.0px; text-indent: -12.0px; font: 12.0px Helvetica">3. If you have both soup and salad for lunch, you'll skip dessert (since you'll be over-full).</p> <p style="margin: 0.0px 0.0px 0.0px 49.0px; text-indent: -12.0px; font: 12.0px Helvetica">4. If you haven't had anything for lunch, you won't have dessert (because dessert on an empty stomach will make you sick).</p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica; min-height: 14.0px"><br /></p>
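<p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica">Those four rules describe what logicians call "exclusive or" (XOR): you get dessert if you had soup or salad, but not both, and not neither. Just to connect this back to software (the switch circuit itself needs no code, of course), here's the entire rule in Javascript:</p> <pre>
// The dessert rule is exclusive-or: have dessert when exactly one
// of the two courses was eaten.
function haveDessert(hadSoup, hadSalad) {
  return hadSoup !== hadSalad;
}

haveDessert(false, false); // false - rule 4
haveDessert(true,  false); // true  - rule 2
haveDessert(false, true);  // true  - rule 1
haveDessert(true,  true);  // false - rule 3
</pre> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica; min-height: 14.0px"><br /></p>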
<p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica">Here's how to solve this problem with the three-way switch:</p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica; min-height: 14.0px"><br /></p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica">If necessary, flip one of the switches so that the light is off. Label the positions that the switches are currently in. Label one "had soup", and the other "had salad". Label the other two positions "no soup" and "no salad", respectively. Hang a sign on the light bulb that reads "have dessert". </p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica; min-height: 14.0px"><br /></p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica">Congratulations! You now have a computer that will tell you, based on whether you've had soup and/or salad, whether you should have dessert. Try it out, and you'll find that it follows the rules given above, and the light will only come on if you've had either soup or salad (but not both).</p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica; min-height: 14.0px"><br /></p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica">This isn't all that exciting by itself, but this same circuit can be used to solve an entire family of related logic problems, just by changing the labels on the switches and the light bulb. This ability to use the same logic to solve many different problems is the source of the flexibility of computers, and is what enables them to be useful for so many different things.</p> <p style="margin: 0.0px 0.0px 0.0px 0.0px; font: 12.0px Helvetica; min-height: 14.0px"><br /></p>Mark Besseyhttp://www.blogger.com/profile/12091448340989293403noreply@blogger.com4