The inside story of the iPhone X ‘brain,’ the A11 Bionic chip

You could be forgiven if, after Apple’s giant, two-hour product launch event on Tuesday, the only thing you remembered was the stunning iPhone X. It is special. It’s also nothing without the A11 Bionic CPU.

And the A11 is nothing without Apple’s insanely focused silicon team behind it.

“We’re clearly on a path now where, with generations of our products, one of the core elements is the chips in them that, to us, they’re intrinsically part of the definition of the product,” said Apple Senior Vice President of Worldwide Marketing Phil Schiller, who, along with SVP of Hardware Technologies Johny Srouji, sat down with me 24 hours after the big unveil for an intense chat about silicon, the Apple way.

I had many questions about the A11 Bionic, Apple’s fifth-generation CPU that sits inside not only the iPhone X, which ships in November, but also the iPhone 8 and 8 Plus — mostly about just how many things this new system on a chip (SoC) could do. Srouji, who runs the silicon team, and Schiller were taking me deep, or at least as deep as Apple is comfortable going on its proprietary technology.

The A11 Bionic is yet another important example of the incredible control Apple wields over the entire device-creation process. It’s not only about the new iPhone’s gleaming glass-covered chassis or the upgraded iOS 11 software. Apple thinks and works at a much deeper level, and whether it’s building pieces itself or working with partners to create its SoC, Apple is in total control.

“This is something we started 10 years ago, designing our own silicon, because that’s the best way to truly customize something that’s uniquely optimized for Apple hardware and software,” said Srouji.

The silicon heart of the iPhone X is Apple’s custom handiwork.

For Apple, silicon development is an intrinsic part of the iPhone creation process. “It’s not just something you drop in or build around,” said Schiller.

Apple takes deep pride in its homegrown silicon, even if it isn’t always clear exactly what the company has managed to build onto those 4.3 billion transistors.

During Schiller’s presentation at the product launch keynote, he announced a series of new iPhone features, like a homegrown graphics processing unit, updated image processing, and the underappreciated neural engine. With each announcement, Schiller showed, briefly, a slide image of a chip with some portion highlighted in green. I soon realized it was all just the same image of the A11 Bionic and wondered if people watching around the globe understood that Schiller was, essentially, showing off different rooms inside the same giant processor house.

Here’s Apple’s new, custom GPU on the A11 Bionic.

Credit: Apple

The two updated performance cores.

Credit: Apple

Could one chip do so much? Could any one company develop so many parts: the phone design, its various new components (updated cameras, the TrueDepth module, an operating system), all while designing and building a single piece of silicon to support them and satisfy the needs of all the various development and design teams?

Srouji told me that when Apple architects silicon, the team starts by looking three years out, which means the A11 Bionic was under development when Apple was shipping the iPhone 6 and its A8 chip. Back then we weren’t even talking about AI and machine learning at a mobile level, and yet, Srouji said, “The neural engine embed, it’s a bet we made three years ahead.”

It is virtually impossible, though, to make bets like this unless you build silicon the Apple way. To be clear, Apple isn’t manufacturing the CPUs. It still works with foundries, which Apple will not name. In truth, the foundries work for Apple and follow its instructions to the letter. To ensure the communication is clear, Srouji has a small technology team that works directly with the foundry on things like schedules and choice of transistors.

‘The neural engine embed, it’s a bet we made three years ahead.’

Communication is also the key ingredient inside Apple. Schiller and Srouji described disparate teams that somehow take a collaborative approach. So that three-year road map is subject to change, within reason.

Teams like Schiller’s marketing group and the display team come to Srouji with requirements, essentially ideas about what they think they’ll need in three years (How can we support a Super Retina display?).

“The process is flexible to changes,” said Srouji, who’s been with Apple since the first iPhone. If a team comes in with a request that wasn’t part of the original plan, “We need to make that happen. We don’t say, ‘No, let me get back to my road map and, five years later, I’ll give you something.’”

The A11 Bionic’s ISP (Image Signal Processor).

Credit: Apple

Schiller and Srouji wouldn’t get into specific requests, but Schiller admitted to me, “There have been some critical things in the past few years, where we’ve asked Johny’s team to do something on a different schedule, on a different plan than they had in place for years, and they moved heaven and earth and done it, and it’s remarkable to see.”

Apple is not, of course, always starting with a clean slate. “Every generation, we take the previous architecture, and, depending on the building blocks, we decide either to improve or to start from scratch,” said Srouji. Even with the new name, a reference to the focus on AI-influenced technology, Schiller and Srouji confirmed to me that the A11 Bionic builds on many of the performance gains and technologies first introduced in the A10 Fusion.

Schiller described the A11 Bionic as a mix of design, architecture, and technology changes — some completely new, some updates to existing processor designs.

The high-performance cores and efficiency cores introduced with the A10 Fusion CPU got an iterative update, including the addition of two more cores and the ability to handle asymmetric multiprocessing, which means the chip can run any number of its six cores at once. Managing core use on the now-10-nanometer CPU is one of the reasons the A11 Bionic, according to Apple, is 70 percent more energy efficient (even while being 25 percent faster than the A10). How the system decides which cores to use (high performance or high efficiency), and how many, is not always obvious.
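
Apps never address those cores directly; iOS schedules the work. As a minimal Swift sketch of the developer-facing side, here are Grand Central Dispatch quality-of-service classes, the standard way to hint whether work belongs on the fast or the frugal cores (the two task functions are hypothetical placeholders, not Apple APIs):

    import Dispatch

    // Hypothetical workloads, named only for illustration.
    func renderGameFrame() { /* heavy, latency-sensitive work */ }
    func syncPhotoLibrary() { /* deferrable housekeeping */ }

    // High quality-of-service work is a candidate for the
    // high-performance cores...
    DispatchQueue.global(qos: .userInteractive).async {
        renderGameFrame()
    }

    // ...while low quality-of-service work can stay on the
    // high-efficiency cores.
    DispatchQueue.global(qos: .utility).async {
        syncPhotoLibrary()
    }

The QoS class is only a hint; which cores actually light up, and how many, is the scheduler’s decision.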

‘Every generation, we take the previous architecture, and, depending on the building blocks, we decide either to improve or to start from scratch.’

Gaming might use more cores, said Srouji, but something as simple as predictive texting, where the system suggests the next word to type, can tap into the high-performance CPUs, as well.

The image signal processor, which works with the cameras, got an update for improved color and low-light performance. And it’s helping power the new Portrait Lighting mode (front and back cameras), which uses two different forms of 3D face mapping to create on-the-fly studio lighting effects.

Video encoding has also been updated to handle higher frame rates and better slow motion.

The secure element has been redesigned. “Without going into detail, we take security very seriously,” said Schiller.

However, the Neural Engine and Graphics Engine are entirely new.

I asked Srouji why Apple decided now, after years of using third-party GPUs, most recently the PowerVR GT7600, to build and integrate its own. I should have anticipated his answer.

“If you look at our system on a chip, CPU, ISP, display, where we believe we can differentiate and provide an optimized value custom to Apple, we go out and own it. We’ve done that consistently for 30 years.”

The iPhone X cameras are powered by the A11 Bionic’s ISP.

Credit: Lance Ulanoff/Mashable

Building its own GPU is another key to owning the entire stack. Schiller noted that Apple now controls everything from the graphics hardware to the compilers, programming languages, and OS, including frameworks and libraries.

“It’s not just Lego pieces stacked together,” said Schiller. “The team designed them to work together.”

Team efficient

The more Apple owns across the iPhone and silicon development, the more efficiencies it can create.

Apple’s silicon team is, for example, obsessed with energy efficiency, but never at the expense of responsiveness.

“How we treat silicon when it’s asleep, when your device is not active. We don’t want the battery to drain when you’re not using it. We call this low leakage; when you’re not using it, you’re not using it,” said Srouji. But Srouji’s silicon doesn’t put the iPhone in some kind of deep sleep. If you lift the iPhone, it wakes up instantly.

Of course, there’s custom silicon inside the Apple Watch Series 3.

Credit: Apple

It’s not just the iPhone. Srouji’s team designed the silicon inside the Apple Watch Series 3 where, according to Schiller, the silicon team is “talking in square millimeters, like ‘How many square millimeters can I save on power?’” They managed to gain new efficiencies despite adding LTE and doubling the number of cores. Part of the solution was the new, more energy-efficient W2 Wi-Fi and Bluetooth chip (the Watch, according to Apple, still promises 18 hours of battery life).

The AI brain inside the brain

The attention to silicon detail also lets Apple micro-manage fresh SoC features like the Neural Engine.

This intriguing innovation may be the most interesting piece of the A11 Bionic. It’s artificial intelligence on a mobile CPU, a part of the chip that thinks differently from everything else.

The creation of a Neural Engine is, of course, also connected, in part, to the silicon team’s never-ending quest for greater system efficiency.

You see the iPhone X and it sees you, thanks to the A11 Bionic.

Credit: Apple

“When you look at applications and software, there are certain algorithms that are better off using a functional programming model,” said Srouji.

This includes the iPhone X’s new face tracking and Face ID, as well as the augmented-reality-related object detection. All of them use neural networks and machine learning or deep learning (a subset of machine learning). This kind of neural processing could run on a CPU or, preferably, a GPU. “But for these neural networking kinds of programming models, implementing custom silicon that’s targeted for that application, that will perform the exact same tasks, is much more energy efficient than a graphics engine,” said Srouji.

The secret sauce of a Neural Engine, what makes it different from other parts of the A11 Bionic, is its ability to handle matrix multiplications and floating-point processing.
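
To make that concrete: the workhorse operation of a neural network is small and repetitive. A dense layer is essentially one matrix-vector multiply plus a bias, the pattern sketched below in plain Swift (purely illustrative, since Apple has not published the Neural Engine’s programming model):

    // y = Wx + b: the multiply-accumulate pattern the Neural Engine
    // is built to run in hardware (this naive loop is for illustration).
    func denseLayer(weights: [[Float]], input: [Float], bias: [Float]) -> [Float] {
        var output = bias
        for (row, w) in weights.enumerated() {
            for (col, x) in input.enumerated() {
                output[row] += w[col] * x
            }
        }
        return output
    }

A face-tracking model runs millions of these multiply-accumulates per frame, which is exactly the kind of workload worth dedicating silicon to.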

Apple is not, however, opening this neural brain to everyone.

“The Neural Engine is there for a specific set of tasks, not general purpose,” said Schiller. One such task is the face tracking used in the engaging and adorable (at least adorable to some) animojis.

These adorable animojis track your face in real time thanks to the A11 Bionic and its new Neural Engine.

Developers can, however, tap into the engine in a tangential way through any facial-recognition work they do with Apple’s augmented reality toolbox, ARKit.
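
As a minimal sketch of what that looks like, the Swift below runs ARKit’s face-tracking configuration and reads back per-expression values (the class and blend-shape names are ARKit’s; the surrounding code is illustrative and omits error handling):

    import ARKit

    class FaceTracker: NSObject, ARSessionDelegate {
        let session = ARSession()

        func start() {
            // Face tracking requires the TrueDepth camera and the A11.
            guard ARFaceTrackingConfiguration.isSupported else { return }
            session.delegate = self
            session.run(ARFaceTrackingConfiguration())
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let face as ARFaceAnchor in anchors {
                // Blend shapes report expressions from 0.0 to 1.0;
                // these are the signals that drive animoji.
                let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
                print("jawOpen:", jawOpen)
            }
        }
    }

The app never touches the Neural Engine itself; it simply receives the tracking results the system computes.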

There are other things the A11 Bionic controls that Apple doesn’t often talk about, including the storage controller, with its custom error-correcting code (ECC) algorithms. “When the user buys the device, the endurance and performance of our storage is going to be consistent across the product,” Srouji told me.
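
Apple won’t detail those algorithms, but the principle behind any ECC scheme is to store redundant bits so that flipped bits can be detected and repaired. A deliberately toy Swift illustration, a 3x repetition code that is nothing like Apple’s actual method:

    // Encode one bit as three copies; a majority vote on read
    // corrects any single flipped copy.
    func encode(_ bit: UInt8) -> [UInt8] {
        return [bit, bit, bit]
    }

    func decode(_ copies: [UInt8]) -> UInt8 {
        return (copies[0] + copies[1] + copies[2]) >= 2 ? 1 : 0
    }

Real flash controllers use far denser codes, but the goal Srouji describes is the same: the user never sees the underlying errors.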

It’s also home to the digital signal processor, which is responsible for audio quality. “That’s an area we work really hard at,” said Schiller, adding, “I have friends who are extreme audiophiles who love testing the cleanliness of the audio signal out for these digital ports.”

Over the course of a decade, Apple has made remarkable progress in silicon, going from a 65-nanometer process to, now, 10 nanometers, and from roughly 100 million transistors to 4.3 billion.

Even Srouji marvels at the feat. “Doing this year over year and pushing complexity to the limit… I believe we have a world-class team.”

Silicon, though, is reaching its physical limits, which has many in the industry looking at new materials and technologies, including quantum computing.

I asked Srouji if Apple is considering the next generation of silicon (or non-silicon-based) solutions.

“We’re thinking ahead, I’ll tell you that, and I don’t think we’ll be limited,” he said, and then added, almost as a postscript, “It’s getting harder.”
