On May 16, 2019, Nissan held a press conference for its next-generation driver assistance system, ProPilot 2.0, at its headquarters in Yokohama, Japan. Shortly afterward, Reuters published a report titled "Nissan stays cool on lidar tech, siding with Tesla."
On closer inspection, we believe that Nissan ProPilot 2.0 differs significantly from Tesla Autopilot in technical architecture. Still, judging from Nissan's launch event, ProPilot 2.0 already offers a substantial feature set, and the camp it represents may become a strong competitor to Tesla Autopilot in the future.
Tetsuya Iijima, Nissan's General Manager of Advanced Technology Development for Automated Driving and ADAS, commented on lidar in the Reuters interview:
At present, lidar does not offer performance beyond the latest radar and camera technology. If lidar reached a level where it could be applied in our (automotive) system, that would of course be great. But that is not the case; there is a mismatch between its cost and its performance.
As he spoke, the slide behind Tetsuya Iijima showed the cameras perceiving lane lines and detecting obstacles ahead. From this scene, it would be easy to conclude that Nissan is siding with Tesla against lidar.
But in an interview with The Drive, Kunio Nakaguro, senior vice president of product development at the Renault-Nissan-Mitsubishi Alliance, gave a further explanation: "We don't think today's lidar can deliver better performance, and we don't think it is necessary to deploy lidar today."
Here, in fact, Nissan's attitude toward lidar differs from Tesla's consistently anti-lidar stance, and Nissan's subsequent public relations response further clarified the difference between the two companies.
Nakaguro's assessment of lidar applies only to ProPilot 2.0, the system delivered on production models; he was not dismissing lidar for all levels of automated driving.
This suggests that as lidar costs fall and its performance and reliability improve, Nissan will quite probably apply lidar in future L3/L4/L5 automated driving systems.
Meanwhile, Nissan's announcement also included a schematic diagram of the ProPilot 2.0 sensor layout, which is as follows:
The Tesla Autopilot sensor layout is as follows.
At first glance, ProPilot 2.0 also appears heavily vision-based; it simply adds four corner radars that Autopilot lacks, which should make it safer.
So, taken as a whole, is Nissan in the same camp as Tesla, perceiving the world primarily through vision at the ADAS stage?
Not quite. The key lies in the four AVM cameras on ProPilot 2.0. AVM stands for Around View Monitor, the 360-degree surround-view system Nissan developed in-house. In other words, these four cameras are wide-angle surround-view cameras; it is difficult for them to contribute to perception in high-speed assisted driving scenarios (automatic parking aside), and even where they do, their capability is limited.
The following two diagrams clearly show the huge difference in detection range between a surround-view camera and the forward perception cameras Tesla uses.
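The range gap follows from basic pinhole-camera geometry: for a fixed sensor resolution, the wider the field of view, the fewer pixels each degree gets, and the shorter the distance at which an object still spans enough pixels to be detected. A back-of-envelope sketch, with assumed numbers (the 1280-pixel sensor width, the 20-pixel detection threshold, and the FOV values are illustrative, not Nissan's or Tesla's specifications):

```python
import math

def detection_range_m(fov_deg: float, h_res_px: int,
                      object_width_m: float = 1.8,
                      min_px: int = 20) -> float:
    """Rough range at which an object still spans `min_px` pixels.

    Pinhole approximation: focal length in pixels f = (W/2) / tan(FOV/2);
    an object of width w at distance d spans about f * w / d pixels.
    """
    f_px = (h_res_px / 2) / math.tan(math.radians(fov_deg) / 2)
    return f_px * object_width_m / min_px

# Same 1280-px-wide sensor, very different usable range:
wide = detection_range_m(150, 1280)   # surround-view style, very wide FOV
narrow = detection_range_m(35, 1280)  # narrow long-range forward camera
print(f"wide-FOV camera:  ~{wide:.0f} m")
print(f"narrow-FOV camera: ~{narrow:.0f} m")
```

Under these assumptions the narrow camera's usable detection range comes out more than ten times that of the wide surround-view camera, which is the gap the two diagrams illustrate.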
According to Tetsuya Iijima, ProPilot 2.0 has made Nissan a leader in the field of automatic assisted driving, which is the focus of Nissan's product branding strategy.
All of the functions are at the world's highest level. It is going to be very difficult for others to top this and overtake us. We have integrated the most advanced-level technologies.
What is "the world's highest level" and "the most advanced technology"? According to the official statement, ProPilot 2.0 is the world's first assisted driving system to combine highway navigated driving with hands-off single-lane driving.
Here is what "highway navigated driving" means specifically.
The system handles automatic entry and exit of highway ramps and, in combination with the vehicle's navigation system, assists the vehicle in traveling along a pre-set route. On multi-lane roads, it assists with overtaking, lane changes, and exiting.
And this is how "hands-off single-lane driving" works.
When driving in a single lane, the driver only needs to keep their attention on the road ahead; as long as they are ready to take over the steering wheel at any time in an emergency, they may drive hands-off within the lane.
When the driver wants to change lanes, they place their hands on the steering wheel and switch on the turn signal. Once the system confirms that current road conditions support the maneuver, it assists the vehicle in changing lanes automatically.
Now compare Tesla's definition of Navigate on Autopilot: automatically enter and exit highway ramps or interchanges, and overtake slower vehicles.
In other words, ProPilot 2.0 also implements automatic assisted navigated driving, the feature Tesla Autopilot introduced first and its most significant differentiator over other ADAS systems. Considering that Autopilot had been unique in the industry in this respect, ProPilot 2.0 is quite impressive.
The question, then, is how Nissan does it.
Didn't we just say that the detection range of surround-view cameras falls far short of forward perception cameras? Don't forget: ProPilot 2.0 carries four more corner radars than Autopilot, and this is the biggest difference between the two companies' perception routes. Elon Musk is a pure and radical believer in the potential of computer vision plus AI perception, while Nissan strikes a more balanced, conservative trade-off between camera and radar perception.
Does that mean ProPilot 2.0 does not lean heavily on visual perception? The answer has two sides.
Compared with Tesla, Nissan does not look especially vision-focused; but among traditional carmakers, Nissan is the one that relies on vision the most.
Tetsuya Iijima said at the press conference that the three front cameras of ProPilot 2.0 form an integrated trifocal camera: a fisheye short-range camera, a main mid-range camera, and a narrow-field long-range camera, together providing long-distance, three-lane perception coverage. The supplier is the visual perception leader Mobileye.
This trifocal camera solves multi-lane perception coverage ahead of the vehicle and is the basis for the automatic lane-change function.
Although the 2016 launch of Tesla's in-house Autopilot caused Tesla and Mobileye to part ways, Mobileye did not stop advancing visual perception. After Tesla, carmakers continued to use EyeQ-series chips to power the visual perception of their assisted driving systems. Among them, the first traditional giant to use a Mobileye chip for trifocal camera perception processing is Nissan.
Nissan mentioned another detail in the announcement: in addition to cameras, radar, ultrasonic sensors, and GPS, ProPilot 2.0 uses high-definition maps to provide 360-degree real-time information and precise localization of the vehicle on the road. Nissan's HD map supplier is Zenrin, a veteran Japanese map maker.
Combined with the following news, all the clues come together.
In April 2017, Nissan announced it would join Mobileye's REM crowdsourced mapping program. Before that, the Japanese map maker Zenrin had already joined.
REM, short for Road Experience Management, is Mobileye's program for collecting road data in a crowdsourced fashion through the vehicle cameras of participating carmakers, jointly building and maintaining high-definition maps. Road markings and landmarks captured by the cameras are matched against the map data to achieve high-precision localization.
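The core idea of crowdsourced mapping is that many noisy sightings of the same landmark, reported by different vehicles, can be fused into one confident map point. A minimal sketch of that aggregation step, with assumed parameters (the 5 m grid cell and the three-sighting confirmation threshold are illustrative choices, not REM's actual algorithm):

```python
from collections import defaultdict
from statistics import mean

def aggregate_landmarks(observations: list[tuple[float, float]],
                        cell_m: float = 5.0,
                        min_sightings: int = 3) -> list[tuple[float, float]]:
    """Fuse noisy (x, y) landmark sightings from many vehicles into map points.

    Sightings are grouped into coarse grid cells; a cell's sightings are
    averaged into one landmark only once enough independent reports agree,
    which filters out one-off misdetections.
    """
    cells = defaultdict(list)
    for x, y in observations:
        cells[(round(x / cell_m), round(y / cell_m))].append((x, y))
    return [
        (mean(p[0] for p in pts), mean(p[1] for p in pts))
        for pts in cells.values()
        if len(pts) >= min_sightings
    ]

# Three cars report roughly the same sign; one car misdetects far away.
sightings = [(10.1, 20.0), (9.9, 19.8), (10.0, 20.2), (500.0, 500.0)]
print(aggregate_landmarks(sightings))  # the outlier is dropped
```

The averaging also explains why camera-only maps can still be precise: individual sightings may be off by meters, but the mean of many independent reports converges on the true position.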
When the partnership began in April 2017, Mobileye spoke of "commercializing high-definition maps in Japan by mid-2018." Although the actual rollout came later, the combination of Mobileye's visual perception, Nissan's decisiveness and supplier-integration ability, and Zenrin's HD map data has produced a flagship Japanese driver-assistance system outside of Autopilot.
Back to the beginning of the article: why do we say that "another camp represented by ProPilot 2.0 may become a strong competitor of Tesla Autopilot in the future"? Unlike Tesla's radical, fully vertically integrated, in-house R&D strategy, Nissan has pulled together the strongest resources across the industry and taken a balanced route, creating a system that is both theoretically competitive and practically feasible.
Elon Musk once said that Tesla has a data advantage 100 times greater than its competitors'. Tesla's global fleet collecting road data will soon reach 500,000 vehicles, and the improvement of its neural-network algorithms relies heavily on massive training data.
So, if Mobileye teams up with carmakers and map suppliers to push ProPilot 2.0-style systems into every market segment worldwide, will Tesla's data advantage still be so absolute?
At the CES 2018 show, Mobileye announced new partnerships for its REM crowdsourced mapping program.