Apple's chip development and what it could mean for macOS
After reading this article, I thought it would be interesting to read thoughts from this forum, as posts speculating on the future of macOS, FCP, Mac Pro, etc. are frequent here.
"macOS is, in many ways, legacy software just waiting for the right moment to be deprecated. It’s getting a fresh lick of paint now and then, but most of its novelties now relate to how it links back to Apple’s core iOS and iPhone business."
Steve Jobs' dream was a factory where sand was loaded in one side, and computers were shipped out the other.
Vertical integration at its best.
[David Sikes] (quoting TheVerge.com article): "macOS is, in many ways, legacy software just waiting for the right moment to be deprecated"
The article makes a big deal about the iPhone 7's single-threaded CPU performance being faster than one core of a 12-core nMP. So what? My 2015 MacBook Pro also has faster single-threaded Geekbench numbers than 1/12th of a nMP. Does that mean the MBP will replace the Mac Pro?
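The arithmetic behind that objection is worth spelling out: beating one core says nothing about aggregate throughput. A minimal sketch, using hypothetical scores (not real Geekbench numbers):

```python
# Illustrative only -- hypothetical per-core scores, not actual Geekbench results.
phone_single = 3400        # assumed single-core score of a phone SoC
workstation_single = 2800  # assumed single-core score of one Xeon core
workstation_cores = 12

# A single phone core can beat a single workstation core...
assert phone_single > workstation_single

# ...while the workstation's aggregate throughput is still far higher
# (ignoring memory bandwidth, thermals, and imperfect scaling).
aggregate = workstation_single * workstation_cores
print(aggregate / phone_single)   # roughly 9.9x the phone's single-core score
```

The gap only closes if the phone-class core can be replicated a dozen times over and fed with workstation-class memory and cooling, which is exactly the open question in the article.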
Re his statement of "Adobe’s Photoshop...being ported to iOS in almost their full functionality", is he talking about Photoshop Express for iOS? Or some future full-featured Photoshop for iOS?
If that's what he means, this was mentioned in a May 2015 CNET article, which said it would become available later that year (2015): https://www.cnet.com/news/adobe-discontinues-photoshop-touch-app-previews-i...
I don't recall seeing a full-featured version of Photoshop for iOS or Android. I also don't see how this is possible from a UI standpoint. Professional apps like Photoshop, Premiere, etc. have hundreds of menu options -- pull-downs, fly-outs, picker screens, etc. In the background these apps do heavy multithreaded and memory-allocation work involving specific APIs for thread synchronization, data sharing, exception handling, and so on.
Such complex apps can be ported between OS X and Windows since those are similarly robust operating systems from both an internal standpoint and UI sophistication. It is unclear if iOS has the back-end sophistication to run a high-end app like these, but from a pure UI standpoint there is no place to put those UI design elements. An app like Photoshop or Premiere Pro cannot be "ported" to iOS and retain anything approaching full functionality. It would have to be totally redesigned.
There was discussion of this point when Microsoft developed their "Metro" tile-based UI. While promoting that as the preferred new platform for apps, the designers themselves admitted such an OS and UI system could not support complex full-featured apps like Photoshop. The "paradigm mismatch" between the two differing UI systems was too great.
If it's possible to port a full-featured complex app like Photoshop, Premiere, Excel, etc. to a mobile OS and UI, the proponents are free to demonstrate this. Despite all the talk I have never seen this achieved. In Microsoft's case, the recent Windows versions of Excel and Word were given some "flat" stylistic UI elements, but this didn't port the apps to a mobile platform at full functionality. Basically the mobile and full-featured Windows versions were whitewashed to have superficial aesthetic similarity, but the mobile version does not remotely approach the desktop feature set.
The writer of the Verge article built an entire viewpoint based on one misleading benchmark program. While it's inevitable that mobile devices will gradually absorb new workloads as they continue to grow in power and sophistication, I don't see myself transcoding and editing hundreds of gigabytes of 4k video on a tablet anytime soon.
The author was saying that Intel should be worried, not that they are in imminent danger.
Yes, the A10 Fusion is not a replacement for a Xeon, yet. The danger is if Apple starts to build bigger chips with more of these cores, unencumbered by the thermal performance needs of a handheld device.
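"Just add more cores" runs into Amdahl's law: the serial fraction of a workload caps the speedup no matter how many cores a bigger, thermally unconstrained chip provides. A minimal sketch with hypothetical parallel fractions:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when only part of a workload parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical figures: even a workload that is 90% parallel tops out
# well below linear scaling, so core count alone doesn't close every gap.
print(round(amdahl_speedup(0.90, 12), 2))   # 5.71
print(round(amdahl_speedup(0.90, 64), 2))   # 8.77
```

That said, the video-transcode workloads discussed later in this thread are close to embarrassingly parallel, which is why the "bigger ARM chip" scenario is at least plausible for that market.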
[Scott Thomas] "The author was saying that Intel should be worried, not that they are in imminent danger."
Actually I don't think the author knew what he wanted to say. He metaphorically described the ARM threat to Intel as "...systematic downfall is all but assured...As The Matrix’s Agent Smith put it to an overweening police lieutenant, 'your men are already dead.' We may be experiencing such a moment in the tech industry today".
Then he turned around and said "I’m not saying that Intel will be crippled or surpassed anytime soon".
So which is it? Is it "all but assured" or not? He is not the first to notice ARM's potential to compete with Intel in higher-end applications; this has been extensively discussed in the industry. Although the author concentrated exclusively on client-side computers, in general the risk is greater on the server side, simply because what OS or CPU a server uses is mostly invisible to the client.
A client-side scenario with extensive legacy software is much more difficult. For one thing it would mean the end of using Boot Camp, Parallels, VMWare, etc. In that case the choice would be between changing the CPU for all macOS devices and writing off Windows compatibility, or splintering the product line between ARM-based and x86-based computers.
The rapid performance progress of ARM-based CPUs like the A10 can be misleading. It's easy to make quick initial progress when you're starting from a low point. The author of The Verge article showed no awareness that this conflict has played out many times before in the computer industry. For decades there have been repeated attempts to promote new CPUs with special instruction sets as vastly superior. In the 1970s the trend was toward high-level instruction sets, culminating in "high-level language" machines that could directly execute a programming language in hardware -- IBM's Future Systems project, the Burroughs B1700, and Data General's Fountainhead Processor (described in Soul of a New Machine) were examples. Then the pendulum swung to the other extreme with RISC. Along the way there were attempts to use VLIW (Very Long Instruction Word) techniques to bypass some performance limitations of superscalar designs. In general the success of those approaches was limited relative to improvements in fabrication technology and to microarchitectural improvements for existing instruction set architectures.
There is no simple solution to this and I don't see it as "all but assured".
[Joe Marler] "Actually I don't think the author knew what he wanted to say."
Well, we are talking about The Verge.
[Joe Marler] "The rapid performance progress of ARM-based CPUs like the A10 can be misleading."
ARM began as a desktop CPU in the late 1980s. It's been in constant use ever since.
[Joe Marler] "In that case the choice would be between changing the CPU for all MacOS devices and writing off Windows compatibility, or splintering the product line between ARM-based and x86-based computers."
There's also speculation that Apple could go with AMD. Who knows? They could put both an ARM and an x86 processor in the machines. We could go back to having Amiga Bridgeboards!