Why Intel got inside BMW-Mobileye deal

Describing their collaboration as a “commitment to strive for an industry standard and define an open platform for autonomous driving,” the BMW Group, Intel and Mobileye announced Friday (July 1) that they are joining forces to bring fleets of fully self-driving vehicles to the streets by 2021.
By Christoph Hammerschmidt


While the BMW-Mobileye connection has been known to industry observers, Intel was a surprise element in the three-way deal.
It’s not all that clear if the BMW-Mobileye-Intel agreement extends to plans of “Intel inside” the autonomous vehicle. Even if it does, analysts are skeptical that Mobileye – known as a jealous defender of its algorithms – would gladly share with anyone, let alone Intel. 

Nonetheless, in a climate where autonomous car RFQs are flying around and no car company CEOs can avoid talking about driverless vehicles in quarterly financial calls, it’s not surprising to see technology suppliers rushing into the spotlight.

Accelerate autonomous car development
Following the announcement of the three-way deal, EE Times asked automotive industry analysts a few questions: 1) Is it time for Mobileye’s competitors — NXP, Nvidia, Qualcomm et al. — to start freaking out? 2) Are Mobileye’s current partners — such as Volkswagen Group, General Motors, Nissan and Tesla Motors — likely to revolt? 3) Or should everyone just chill and take this new development in stride?
Egil Juliussen, director of research for infotainment & ADAS at IHS Automotive, offered the latter opinion.
He told EE Times, “It makes a lot of sense. Autonomous cars are very difficult to do. You need so many different technologies – deep learning, sensor fusion and others – to replace a human driver in a car.”

On a higher level, he concluded, “A joint effort like this will move things forward, and faster.”
BMW, Intel and Mobileye hope to establish a big lead in the nascent self-driving car segment. Many other automakers are also racing to get there, but in reality, smaller companies probably don’t have deep enough pockets to pull it off.

Google factor
Then, “there is a Google factor,” Juliussen added. The automotive industry can’t afford to ignore Google, whose self-driving cars are viewed as “far ahead in some areas,” he noted.
Although Mobileye’s competitors may be concerned about the BMW/Mobileye/Intel alliance, autonomous cars are “still in a pre-competitive stage,” stressed Juliussen.
He pointed out, “Think about complexities that will come with standards for testing and verifying autonomous cars.” Individual technology suppliers should benefit from a shared autonomous car platform, he said, because it would let them avoid duplicating development effort.

Consider the GENIVI Alliance, a non-profit automotive industry alliance, Juliussen added. The industry group – originally started by Intel, BMW, GM and many more – has succeeded with its public open-source software project. “Carmakers [and chip vendors] can still compete on implementation levels.”
Whether the newly announced BMW/Mobileye/Intel deal will become the one to define an autonomous car platform is a tougher question.

Jeremy Carlson, a senior analyst with IHS Automotive, pointed out that he’s aware of at least two other companies working on a central computing architecture for the autonomous car. One is zFAS, on which Audi has worked with Delphi (integrating chips from Nvidia and Mobileye). Mercedes-Benz has been working with a Tier One supplier on a different autonomous car platform, he added.


Intel’s Xeon Phi chip
Carlson said he’s surprised to see Intel in the three-company deal.
Of course, we may be looking at a platform as simple as Intel’s CPU serving as the “computer” in an autonomous car, working in conjunction with Mobileye’s processor, as Carlson explained. “But we don’t know.” There just isn’t enough information to confirm or deny that.
The joint press release never spelled out Intel’s specific role. It vaguely mentioned that Intel “brings a comprehensive portfolio of technology to power and connect billions of smart and connected devices, including cars.”

A more convincing theory is that Intel will play a big role in the infrastructure side, rather than inside autonomous cars.
Luca De Ambroggi, principal analyst, automotive semiconductor at IHS Technology, suspects, “Intel might provide and take care of the connectivity/telematics link, as well as connect the vehicle to the ‘cloud,’ and the rest of the IoT to store and elaborate data, whether maps or data point & pattern for AI applications.” In his view, Intel is there to leverage its “infrastructure knowhow.”
More specifically, De Ambroggi was referring to the Knights Landing version of the Xeon Phi processor, which Intel recently announced at the International Supercomputing Conference in Frankfurt, Germany.
The Xeon Phi chip comes with 72 cores running at 1.5GHz, alongside 16GB of integrated stacked memory. It has already been designed into several supercomputers.

Intel has been relatively quiet on the topic of deep learning thus far. But the company is clearly hoping that the new Xeon Phi will finally open the door for it to compete in the fast-growing market, which has so far been dominated by Nvidia. To make that point, Intel shared that four new Xeon Phi processors completed the training of the Caffe AlexNet imaging neural network on 1.33 billion images in 10.5 hours, compared with four Maxwell GPUs, which Intel pegged at 25 hours (see below).

How Intel is positioning its Knights Landing version of Xeon Phi processors for deep learning (Source: Intel)
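Taking Intel’s quoted figures at face value (they are vendor-supplied, not independently verified), the claimed advantage works out as follows, a quick back-of-the-envelope check using only the numbers in the announcement:

```python
# Sanity-check of the training figures Intel quoted: 1.33 billion images,
# 10.5 hours on four Xeon Phi chips vs. 25 hours on four Maxwell GPUs.
# These are Intel's own numbers, not independent benchmarks.

IMAGES = 1.33e9
PHI_HOURS = 10.5
GPU_HOURS = 25.0

# Relative speedup Intel is claiming over the GPU setup
speedup = GPU_HOURS / PHI_HOURS

# Throughput implied by the Xeon Phi run, in images per second
phi_throughput = IMAGES / (PHI_HOURS * 3600)

print(f"claimed speedup: {speedup:.2f}x")
print(f"implied throughput: {phi_throughput:,.0f} images/s")
```

In other words, Intel’s numbers amount to a claim of roughly a 2.4x training-time advantage over the four-GPU configuration it chose for comparison.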

Mike Demler, senior analyst at The Linley Group, agreed. Under the new partnership, “I think the bigger role that Intel envisions is for development systems (like Nvidia’s Digits workstations) and in the data center,” Demler noted. “Mobileye carefully worded the sentence in the press release, saying algorithms will be deployed on EyeQ processors, but algorithms will be collaboratively developed on Intel platforms.”

Demler also added that Mobileye’s Road Experience Management (REM) “requires a real-time connection to the cloud, so Intel hopes to play there.”


Intel’s shopping spree
Let’s face it. Intel’s outsized ambition for the automotive market is no secret.
Intel already has underlying automotive technologies such as Wind River’s embedded operating system, software foundation and security expertise.
Further, in the last several months alone, Intel has been busy shopping around for more automotive-related technologies.
Earlier this year, Intel acquired Yogitech, which makes safety tools for autonomous car chips. In parallel, Intel’s Wind River unit bought Arynga, which offers GENIVI-compliant CarSync software for enabling Over-the-Air updates in automotive computers. Common to the two acquisitions is that both will be used by Intel’s future chips and reference designs aimed at fully autonomous cars. 

Then, in late May, Intel announced the acquisition of Itseez Inc., a company armed with computer vision algorithms and implementations for embedded and specialized hardware.
De Ambroggi wonders if Intel’s recent acquisitions – taken all together – might spell trouble down the line, as Intel becomes a competitor to Mobileye rather than a partner. If that’s the case, how much can Mobileye trust Intel?
Then, there’s the nagging mystery of whether Intel — under this deal with Mobileye and BMW – will push its own processor as a decision-making CPU.
Intel said in the press release, “To handle the complex workloads required for autonomous cars in urban environments Intel provides the compute power that scales from Intel Atom to Intel Xeon processors delivering up to a total of 100 teraflops of power efficient performance without having to rewrite code.”

Going against the Intel-inside-in-autonomous-car theory is Intel’s poor record of automotive design wins – thus far.
The Linley Group’s Demler can’t recall many Intel victories. He explained, “First, the vast majority of processors in automotive applications are MCUs, so there’s no Intel play there. Intel claims a few design wins for in-vehicle infotainment systems with Nissan (Infiniti) and Hyundai (Kia), but those are basically repurposed Bay Trail tablet processors.” In Demler’s view, “Intel doesn’t have a chip for embedded-vision processing like Mobileye’s, or even something equipped to run deep neural networks. They have Wind River software, and they’ve made some acquisitions for in-vehicle software, but nothing for ADAS hardware.”

Mobileye’s upcoming EyeQ5 block diagram (Source: Mobileye)

Jim McGregor, a principal analyst at Tirias Research, added that Intel doesn’t have “a strong track record at this point.” In the current market, “the leaders in the command and control systems are NXP and Renesas, the leader in communications is Qualcomm, and the leader in computer vision is Mobileye,” he noted.


Foundry deal in the offing?
Intel’s role in the deal could be explained by the theory that Intel might become a foundry for Mobileye’s EyeQ5, which is expected to be ready in two years and is scheduled to be manufactured using a 10nm-or-below FinFET technology node.

In fact, although Mobileye’s EyeQ4 is being produced by STMicroelectronics using ST’s 28nm FD-SOI process technology, both Mobileye and ST have acknowledged that future EyeQ5 production must rely on a finer-node process technology – which ST doesn’t have.
Mobileye has not disclosed a foundry for EyeQ5.  The Linley Group’s Demler remains skeptical of Intel getting the deal.  “Mobileye uses Imagination’s MIPS CPUs, which I doubt have been run in any Intel processes,” he said.

What open platform?
Although the joint announcement talks about the three companies’ commitment to “define an open platform for autonomous driving,” what “open” means is anybody’s guess.
Tirias Research’s Jim McGregor said, “I doubt that Intel and Mobileye’s definition of ‘open’ will be the same as the rest of the industry’s. It will likely mean that anyone can use their technology if they choose to do so, and they will try to push that as an industry standard.”

The Linley Group’s Demler said, “Every company has its own definition of ‘open,’ but that would be a huge change for Mobileye.” Demler said Mobileye has “hinted at opening up when they announced EyeQ5 in their Q1 earnings call, but right now everything they do is proprietary.”
Nvidia’s Drive system is open in the sense that it supplies not the ADAS software itself but a hardware-software development platform, he added. “Mobileye does it all in house, and nobody knows how they develop their algorithms. Nvidia supports open-source neural-network frameworks like Caffe. Google has an open platform for CNN development with TensorFlow.”

In short, Demler sees current autonomous-vehicle development echoing the smartphone platform wars between the closed Apple iOS (Mobileye) and “open” Google-Android (Nvidia and everybody else). “I believe that there’s going to be no way for autonomous vehicles to proliferate on closed, proprietary systems. There’s too much at risk to rely on one company’s secret sauce.”

While it’s an easy-to-understand analogy, autonomous cars today may be far from the smartphone’s level of maturity. Self-driving cars involve too many technologies, and those need to be sorted out before the industry aligns.

Mapping – HERE and REM
IHS’ Juliussen senses that the automotive industry is finally moving into that much needed “open” development phase.
Mapping is an example. HERE, a company co-owned by German automotive companies Audi, BMW, and Daimler, developed Sensoris, an open spec for vehicle sensor data to be collected and transmitted to the cloud by connected vehicles. Sensoris is billed as a common language for all autonomous vehicles.

How Mobileye’s REM works in the backend (Source: Mobileye)

HERE this week took a step toward pushing Sensoris as a de-facto standard by submitting the spec to ERTICO – ITS Europe, a public/private partnership for intelligent transportation systems in Europe.

HERE believes Sensoris is critical for combining data from all vehicles on the road. Once combined, data can be used in a system tracking traffic patterns, making predictions about potential bottlenecks and adjusting a self-driving car’s path automatically, even intelligently updating mapping data.
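The aggregation idea HERE describes can be sketched in a few lines. This is a toy illustration of the concept only, not the actual Sensoris message schema; the segment IDs, speed fields and congestion threshold are all invented for the example:

```python
# Toy sketch of fleet-data aggregation as the article describes it:
# many vehicles report observations in a common format, and the cloud
# combines them into a live picture of traffic per road segment.
# This is NOT the real Sensoris schema, just an illustration.
from collections import defaultdict

def aggregate(reports):
    """reports: list of (segment_id, speed_kmh) tuples from many cars.
    Returns the fleet-average speed per road segment."""
    by_segment = defaultdict(list)
    for segment, speed in reports:
        by_segment[segment].append(speed)
    return {seg: sum(v) / len(v) for seg, v in by_segment.items()}

def congested(avg_speeds, threshold_kmh=30):
    """Flag segments whose fleet-average speed falls below a threshold,
    i.e. potential bottlenecks a self-driving car could route around."""
    return [seg for seg, v in avg_speeds.items() if v < threshold_kmh]

# Five reports from three cars on two (hypothetical) segments
reports = [("A1", 110), ("A1", 95), ("B7", 20), ("B7", 25), ("B7", 18)]
avg = aggregate(reports)
print(congested(avg))  # segment B7 averages 21 km/h, below the threshold
```

A real system would of course carry far richer sensor payloads and run continuously, but the core pattern — per-segment aggregation feeding prediction and rerouting — is the one the spec is meant to standardize across vendors.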

The Linley Group’s Demler explained that REM and HERE are two components that could be used for autonomous-vehicle navigation.
Earlier this year, when Mobileye introduced REM, the company talked extensively about REM’s advantage over Google’s approach.

As Demler pointed out, it’s too early to say how autonomous-vehicle navigation will play out. But BMW, a key owner of HERE, is now working with Mobileye, which invented its own REM. Juliussen remains optimistic, noting that this indicates the automotive industry is migrating to a more convergent approach that combines technologies.

 

Related news:

BMW, Intel, Mobileye to announce “future of driving”

Mobileye to supply cameras, software for Valeo’s future ADAS

Video: Mobileye CTO on deep learning and automotive sensing

Got Self-Driving Architecture? Show Me

How will deep learning change SoCs?
