Mobileye ADAS products to enter mass production next year following acquisition by Intel

(Original title: David Oberman, Global Sales Director of Mobileye, the Israeli driver-assistance systems developer: EYEQ4 will enter mass production in 2018)

Unlike other companies, we chose an unusual path. On the technical side, we bet on vision-based ADAS, because we believe that information obtained through cameras is the most essential and important input for intelligent driving in the future.

On March 13 this year, Intel announced the acquisition of Mobileye, an Israeli developer of driver-assistance systems, for US$63.54 per share in cash. The total came to US$15.3 billion, a 35% premium over Mobileye's market value and the largest acquisition of an Israeli technology company to date.

"This transaction has not yet closed; it is still awaiting approval from the US Securities and Exchange Commission," David Oberman, Global Sales Director of Mobileye, told 21st Century Business Herald, adding that Intel valued the leading position of Mobileye's ADAS products.

ADAS (Advanced Driver Assistance Systems) is one of the smart-car technologies that automakers have actively developed in recent years, intended as a stepping stone toward future driverless technology. The main function of ADAS is not to control the car. Instead, it analyzes information about the vehicle's operating state and changes in the environment around it, and warns the driver in advance of potentially dangerous situations so that measures can be taken early to avoid traffic accidents.

Mobileye's pre-installed (OEM) products are the EYEQ series of SoC chips and the software that runs on them. By the end of 2016, Mobileye's OEM solutions had been applied to 237 models across 20 OEMs; automakers that have reached agreements with the company include BMW, Chrysler, Ford, and General Motors. Mobileye's current aftermarket product is mainly a camera-based visual ADAS.

David Oberman revealed that the pre-installed product currently on sale is mainly EYEQ3, and that the next-generation EYEQ4 is expected to enter mass production in 2018. In China, Mobileye is currently developing both the OEM and aftermarket segments.

Three pillars of autonomous driving technology

"21st Century": Intel paid a high premium for this acquisition. What does it value most about Mobileye?

David Oberman: Mobileye is a recognized industry leader in ADAS and intelligent driving. Our technology has been in development for 18 years, and our visual-ADAS R&D center is the largest of its kind in the world.

Unlike other companies, we chose an unusual path. On the technical side, we bet on vision-based ADAS, because we believe that information obtained through cameras is the most essential and important input for intelligent driving in the future.

Our products are mature and market-proven. Vehicle manufacturers in many countries around the world, including China, have conducted extensive road tests of our products, so we can adapt to different markets.

Mobileye holds a very large share of the global ADAS market. We cooperate with 28 vehicle manufacturers worldwide, and a cumulative 17 million vehicles have been equipped with our systems.

Our autonomous driving technology rests on three pillars. The first is perception: Mobileye will provide an eight-camera solution in which the cameras monitor the vehicle's entire surroundings, combined with other sensors such as radar and lidar to form a more complete solution.

The second pillar is map information. Beyond conventional GPS navigation data, we need to develop a class of maps specifically usable for autonomous driving. This in-house capability is called Road Experience Management (REM), a road-experience data-generation function.

REM detects driving-related road information such as road signs, turn directions and arrows painted on the road, road surface material, traffic lights, and traffic signals. This information is collected on top of the traditional map to create a new map layer called REM.

This information will be shared among different automakers and then used on their autonomous vehicles. The amount of data the REM map requires is not large, about 10 kilobytes per kilometer, but it must be highly accurate.

We collect precise driving-related information through crowdsourcing: every vehicle on the road equipped with our system can collect relevant data and send it to the cloud. Although the data contributed by each individual car or driver is very small, the large installed base means such a map can be formed quickly.
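As a rough illustration of this crowdsourcing idea (a hypothetical sketch, not Mobileye's actual REM pipeline; all names here are invented), each vehicle uploads a handful of compact landmark observations, and the cloud side averages them per landmark, so positional accuracy improves as reports accumulate while each contribution stays tiny:

```python
from collections import defaultdict

# Hypothetical sketch of crowdsourced landmark aggregation (not Mobileye's
# actual REM implementation). Each report is (landmark_id, x, y): a sparse,
# compact observation sent from a vehicle to the cloud.

def aggregate_reports(reports):
    """Average all observations of each landmark to refine its position."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for landmark_id, x, y in reports:
        s = sums[landmark_id]
        s[0] += x
        s[1] += y
        s[2] += 1
    return {lid: (sx / n, sy / n) for lid, (sx, sy, n) in sums.items()}

# Three vehicles each report the same stop sign with slight GPS noise;
# the averaged estimate converges toward the true position.
reports = [
    ("stop_sign_17", 100.2, 50.1),
    ("stop_sign_17", 99.8, 49.9),
    ("stop_sign_17", 100.0, 50.0),
]
print(aggregate_reports(reports))
```

A real system would weight reports by confidence and fuse them with map-matching, but the principle is the same: many small, cheap observations combine into one accurate layer.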

The third pillar is driving policy. All vehicles on the road, whether autonomous or human-driven, must coordinate with one another. For example, when a car needs to merge into a lane of traffic to change lanes, what should its driving policy be? Or if two cars are each about to make a turn in their own direction, what policy keeps them both safe and efficient?

These two examples are very simple, but real-world scenes are very complicated. What we need to do is make our system as intelligent as the human brain, able to recognize all kinds of road conditions and make correct driving decisions.

Mobileye to run Intel's autonomous driving business after the acquisition

"21st Century": From Mobileye's perspective, why did you choose Intel rather than another chip vendor such as Nvidia?

David Oberman: That decision was not made at my level, so I do not know the reasons. I would note that Mobileye and Intel had already cooperated on many projects as early as 2016. For example, the cooperation with BMW and Intel announced in 2016 will have products ready for road testing in the second half of this year.

"21st Century": After the acquisition by Intel, will your business model and external cooperation strategy change?

David Oberman: Intel has made it clear that it will not change much. What we know so far is that Intel has acquired us, and Intel's Automated Driving Group (ADG) has been folded into Mobileye.

So all of the company's autonomous-driving business will be handled by Mobileye. Joining Intel is definitely a good thing, because Intel brings in additional resources that help us do what we were already doing even better.

"21st Century": After the acquisition, what new plans do you have in this area?

David Oberman: No, not so far.

Sticking with the monocular camera

"21st Century": Some argue that monocular-camera technology suits the stages before L4 autonomous driving, while binocular (stereo) cameras are better suited to L4 and beyond. What is your view, and what are your plans for binocular technology?

David Oberman: We capture images with a monocular camera and superimpose countless successive frames to reconstruct a 3D scene. That is our technique.
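The geometry behind recovering depth from a single moving camera can be sketched as follows (a simplified pinhole-model illustration, not Mobileye's actual algorithm): two frames captured as the car moves a known distance act like a stereo pair whose baseline is the ego-motion between the frames.

```python
def depth_from_motion(focal_px, baseline_m, disparity_px):
    """Triangulate the depth of a point seen in two views.

    Simplified pinhole model: two frames from one moving camera act like
    a stereo pair whose baseline is the known ego-motion between frames.
    (Illustrative only; real structure-from-motion handles arbitrary
    motion and fuses many frames, as the interview describes.)
    """
    if disparity_px <= 0:
        raise ValueError("the point must shift between frames to triangulate")
    return focal_px * baseline_m / disparity_px

# A feature shifts 8 px between two frames taken 0.4 m apart,
# with a focal length of 1000 px:
print(depth_from_motion(1000.0, 0.4, 8.0))  # 50.0 (meters)
```

This also hints at why a single far-seeing lens can substitute for a stereo rig: the "baseline" grows with the car's motion instead of being fixed by the spacing of two mounted cameras.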

The biggest advantage of the monocular camera is that it can see far enough. A binocular camera has two problems: first, it can only resolve depth out to a certain distance; second, the two cameras must be precisely aligned on the same point to form an image, and achieving that alignment is technically very demanding. That is why we insist on the monocular camera.

The autonomous driving system we are using now actually employs a trinocular camera, a combination of three: one conventional; one fisheye lens with a wide enough field of view, like the lens on a dashcam; and one narrow-field lens that sees the farthest. The combination of the three helps us adapt to all situations.

In the future, on a number of projects, we are considering returning to a single-camera solution, but it would have to meet the requirements of all three of these cameras: a high-definition camera.

"21st Century": How does your autonomous driving solution make the leap from L3 (conditional automation, limited to certain scenarios) to L4 (high automation)?

David Oberman: For the leap from L3 to L4, Mobileye will provide autonomous driving solutions built on the three technical pillars described above. Of course, other technologies such as V2X and V2V must also be involved, and government participation and regulation are very important as well.

Actively deploying in the Chinese market

"21st Century": What are your plans in the Chinese market?

David Oberman: Our strategy currently runs in two directions. The first is the OEM market, supplying our EYEQ chips and software to automakers through Tier 1 suppliers. SAIC, FAW, and Southeast Automotive already have specific cooperation projects with us, and some production models have been released.

The second direction is the aftermarket. Three vehicle categories are our targets: long-distance passenger transport, hazardous-chemicals transport, and tourist coach transport. Moreover, sectors like buses and hazardous-chemicals transport are tightly regulated by the Chinese government, and related policies and regulations have been introduced to promote the adoption of ADAS.

In addition, we are discussing cooperation with China's 4S stores, dealers, distributors, insurance agencies, and governments.

On the government side, we cooperate with the Highway Research Institute of the Ministry of Transport, as well as with public transport groups in Beijing, Shanghai, Shenzhen, and elsewhere.

"21st Century": Many Chinese companies are also working on autonomous driving. Are you familiar with them? What is your view?

David Oberman: I have seen the news, but I do not know much about their specific projects or details, so I really cannot comment.

What we see is that driverless technology, at least so far, is a relatively new concept; every company has its own definitions, practices, ideas, and strategies. The field needs shared concepts, which is why the platform we built with Delphi is a shared one. If other manufacturers are interested, we are willing to provide solutions.

"21st Century": You once said that in driving, cars will eventually be smarter than people. When do you expect that day to arrive?

David Oberman: I believe no one can give a precise answer, because the challenges are too many, spanning everything from the technology itself to the broader environment, including ethics. Even if one day cars really are smarter than people, they can still make mistakes and cause casualties. How will the public view that, and how will governments handle such incidents? So there is no precise point in time; it is something all parties must work on together and push forward simultaneously.
