Mercedes-Benz owner Daimler is teaming up with Bosch to launch a fleet of driverless taxis in California’s Silicon Valley next year.
It is part of a program to test vehicles designed for city driving in an attempt to keep up with the likes of Waymo and Uber.
The world’s largest maker of premium cars and biggest automotive supplier gave few details about their robo-taxi program, described as a passenger shuttle service, and did not reveal which city would host it.
Mercedes-Benz owner Daimler is teaming up with Bosch to launch a fleet of driverless taxis in California’s Silicon Valley next year. Pictured is the Mercedes-Benz F015 Luxury in Motion autonomous concept car
Negotiations with a municipality in the sprawling technology hub of Silicon Valley were still underway, spokespersons for the companies said on a conference call with journalists.
Global automakers, suppliers and tech companies like Alphabet’s Waymo and Uber are pouring resources into the development of autonomous driving systems and launching networks of test fleets around the globe aimed at pushing the technology forward.
The extremely complex challenges of self-driving, and the expense of research and development, have led to a host of partnerships between automakers, suppliers and others.
Bosch and Daimler, with its passenger car brand Mercedes-Benz, first joined forces in a self-driving alliance in April 2017.
Teams from both companies are working together in Stuttgart and Silicon Valley.
Executives from the companies would not comment on the potential size of the pilot program, nor how many vehicles would be used or which customers would be served.
Although California would be the first pilot site, others could follow, they said.
The vehicles will include a safety driver and a steering wheel, they said.
Using an app-based mobility service operated by Daimler, customers will be offered free rides on selected routes within the city during the pilot, a spokesperson from Daimler said.
The service will be built atop the artificial intelligence platform Nvidia DRIVE Pegasus, supplied by chipmaker Nvidia.
Bosch is currently developing its own electronic control unit – the main computer ‘brain’ that controls functions within a self-driving car – which will use Nvidia’s hardware chips and base software technology.
Until that unit is ready, the pilot will rely on existing Nvidia technology, such as the Pegasus platform, said Michael Fausten, Bosch’s head of urban autonomous driving.
John Krafcik, CEO of Waymo, is seen in a 2017 photo with a customized Chrysler Pacifica that is being used in the Google-owned firm’s autonomous car project
Under the deal, Daimler will supply the vehicles and test facilities, while Bosch will provide the many sensors, actuators and control units used in the development process.
Daimler’s head of autonomous driving, Uwe Keller, said Mercedes S- and B-class vehicles would be used in development but the vehicle to be used during the pilot had not yet been chosen.
Daimler and Bosch said the pilot will help provide information about how self-driving vehicles can be integrated into a complex transport network offering multiple choices.
Considered by many to be ahead in the race to autonomy, Waymo is due to launch its fully driverless ride-hailing service in the Phoenix, Arizona, area soon, following a public trial.
HOW DO SELF-DRIVING CARS ‘SEE’?
Self-driving cars often use a combination of normal two-dimensional cameras and depth-sensing ‘LiDAR’ units to recognise the world around them.
In LiDAR (light detection and ranging) scanning – which is used by Waymo – one or more lasers send out short pulses, which bounce back when they hit an obstacle.
These sensors constantly scan the surrounding areas looking for information, acting as the ‘eyes’ of the car.
While the units supply depth information, their low resolution makes it hard to detect small, faraway objects without help from a normal camera linked to them in real time.
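The ranging principle behind those laser pulses can be illustrated with a short calculation – a simplified sketch only, since real LiDAR units also correct for beam angle, atmospheric effects and sensor noise:

```python
# Simplified sketch of LiDAR time-of-flight ranging: a laser pulse
# travels out, bounces off an obstacle and returns, so the distance
# is half the round trip at the speed of light.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_pulse(round_trip_seconds: float) -> float:
    """Distance to the obstacle in metres, given the pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that comes back after 200 nanoseconds hit something roughly 30 m away.
print(round(distance_from_pulse(200e-9), 1))  # ~30.0
```

The tiny round-trip times involved are why LiDAR units need very precise timing electronics: at light speed, each nanosecond of delay corresponds to only about 15 cm of range.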
In November last year Apple revealed details of its driverless car system that uses lasers to detect pedestrians and cyclists from a distance.
The Apple researchers said they were able to get ‘highly encouraging results’ in spotting pedestrians and cyclists with just LiDAR data.
They also wrote they were able to beat other approaches for detecting three-dimensional objects that use only LiDAR.
Other self-driving cars generally rely on a combination of cameras, sensors and lasers.
Volvo’s self-driving cars, for example, use around 28 cameras, sensors and lasers.
A network of computers processes this information which, together with GPS, generates a real-time map of moving and stationary objects in the environment.
Twelve ultrasonic sensors around the car are used to identify objects close to the vehicle and support autonomous drive at low speeds.
A wave radar and camera placed on the windscreen read traffic signs and the road’s curvature, and can detect objects on the road such as other road users.
Four radars behind the front and rear bumpers also locate objects.
Two long-range radars on the bumper are used to detect fast-moving vehicles approaching from far behind, which is useful on motorways.
Four cameras – two on the wing mirrors, one on the grille and one on the rear bumper – monitor objects in close proximity to the vehicle and lane markings.