navigation technology avoids the cost of physical guides such as rail or cables.

The pods already operate on public roads, but research continues to extend their use to all kinds of roads and in different conditions for fully autonomous operation. The €4m i-CAVE (integrated Cooperative Automated Vehicles) research program, led by the Technical University of Eindhoven, has been looking at how to link the 2getthere pods together to create a ‘virtual train’, with pods 0.3s apart using wireless links for a Cooperative Adaptive Cruise Control (CACC) system.
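
To make the 0.3 s figure concrete, the sketch below shows one common way a CACC loop can be written: feedback on spacing and relative speed, plus a feed-forward term using the lead vehicle's acceleration received over the wireless link. The gains, gap values and function names are illustrative assumptions, not details of the i-CAVE system.

# Minimal CACC sketch: hold a 0.3 s time gap behind a lead pod using its
# wirelessly broadcast state (speed and acceleration) plus the locally
# measured range. Gains and structure are illustrative only.

TIME_GAP = 0.3        # desired headway in seconds
STANDSTILL_GAP = 2.0  # minimum spacing in metres when stopped

KP = 0.45             # gain on spacing error (assumed value)
KV = 0.25             # gain on relative speed (assumed value)
KA = 1.0              # feed-forward on lead acceleration (assumed value)

def cacc_acceleration(gap_m, ego_speed, lead_speed, lead_accel):
    """Return a commanded acceleration in m/s^2 for the following pod."""
    desired_gap = STANDSTILL_GAP + TIME_GAP * ego_speed
    spacing_error = gap_m - desired_gap      # positive = too far back
    speed_error = lead_speed - ego_speed     # positive = lead pulling away
    # Feedback on spacing and speed, feed-forward on the lead's acceleration
    # received over the wireless link (the 'cooperative' part of CACC).
    return KP * spacing_error + KV * speed_error + KA * lead_accel

if __name__ == "__main__":
    # Example: 12 m behind a lead pod, both at 10 m/s, lead braking gently.
    cmd = cacc_acceleration(gap_m=12.0, ego_speed=10.0,
                            lead_speed=10.0, lead_accel=-0.5)
    print(f"commanded acceleration: {cmd:.2f} m/s^2")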

One of the biggest shakeups for driverless technology this year was the entry of Uber. It acquired the entire research group from Carnegie Mellon University and started to roll out driverless taxis in Pittsburgh. The vehicles, the Fusion from Ford and XC90 from Volvo, use the Carnegie Mellon software and, while they still have a ‘driver’ who can take control in the event of an emergency, they operate autonomously. The Pittsburgh roll-out follows Uber’s other entry into the autonomous vehicle market with Otto. The 91-person start-up develops systems and software for self-driving trucks, with staff from the self-driving development teams at Google, Apple, and Tesla.

The team is competing with Daimler-Benz, who have also demonstrated a self-driving truck. The future truck uses radar sensors linked to the throttle and braking systems to allow the trucks to follow each other as closely as a few metres, reducing drag from the air and boosting fuel efficiency. The front radar sensor has a range of 250 m and scans an 18-degree segment, while a short-range sensor has a range of 70 m and scans a 130-degree segment. A stereo camera installed above the instrument panel has a range of 100 m and scans an area of 45 degrees horizontally and 27 degrees vertically. This monitors both single- and two-lane roads, pedestrians, moving and stationary objects, information on traffic signs and even the road surface. The camera recognises everything that contrasts with the background, and so it can measure clearances at the top and sides of the truck precisely.
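
One way to read those figures is as simple range and field-of-view checks. The sketch below tests whether a detected object falls inside each sensor's coverage using the ranges and angles quoted above; the sensor names, the assumption that the sensors are co-located and forward-facing, and the coordinate convention are illustrative, not taken from the Daimler system.

import math

# Sensor coverage figures quoted in the article; geometry and names are
# illustrative assumptions (sensors assumed co-located, facing forward).
SENSORS = {
    "long_range_radar":  {"range_m": 250.0, "fov_deg": 18.0},
    "short_range_radar": {"range_m": 70.0,  "fov_deg": 130.0},
    "stereo_camera":     {"range_m": 100.0, "fov_deg": 45.0},
}

def covered_by(x_m, y_m):
    """Return the sensors whose range and horizontal field of view
    contain an object at (x, y), with x forward and y to the left."""
    dist = math.hypot(x_m, y_m)
    bearing = abs(math.degrees(math.atan2(y_m, x_m)))  # 0 deg = dead ahead
    hits = []
    for name, spec in SENSORS.items():
        if dist <= spec["range_m"] and bearing <= spec["fov_deg"] / 2:
            hits.append(name)
    return hits

if __name__ == "__main__":
    # A car 60 m ahead and 5 m to the side: bearing ~4.8 deg, range ~60 m,
    # so it sits inside all three fields of view.
    print(covered_by(60.0, 5.0))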

The US is not the only place where autonomous taxis are on the streets. NuTonomy in Singapore is rolling out driverless taxis using LiDAR, CMOS camera and radar sensors.

One of the things that had been holding back the testing and roll-out of driverless cars has been the lack of legislation to support the technology. Up until now, a car or truck has required a driver. The Federal Automated Vehicles Policy from the Department of Transportation in the US now allows for vehicles that can take full control of the driving task in at least some circumstances. Portions of the policy for highly automated vehicles (HAVs) also apply to lower levels of automation, including some of the driver-assistance systems already being deployed by automakers today.

The guidance for manufacturers, developers and other organizations outlines a 15-point "Safety Assessment" for the safe design, development, testing and deployment of automated vehicles. The Guidance covers any organization testing, operating and/or deploying automated vehicles, which includes traditional car makers and component suppliers as well as technology companies, start-ups or fleet operators who are customers of Autonomous Stuff.

The 15-point Safety Assessment outlines objectives on how to achieve a robust design. It allows for varied methodologies, covering areas from Object and Event Detection and Response and roadway safety to the response and robustness of the HAV upon system failure. It also covers the validation methods for testing, validation and verification of an HAV system, data recording and sharing requirements, post-crash behaviour and vehicle cybersecurity.
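
As a rough illustration only, the checklist sketch below models the assessment areas the article names (the full policy lists fifteen) as a simple data structure an organisation might use to track its own progress; the structure and naming are assumptions, not the policy's own format.

from dataclasses import dataclass, field

# Assessment areas named in the article (the full policy lists 15);
# the data structure itself is an illustrative assumption.
AREAS = [
    "Object and Event Detection and Response",
    "Fallback (response and robustness on system failure)",
    "Validation methods (testing, validation and verification)",
    "Data recording and sharing",
    "Post-crash behaviour",
    "Vehicle cybersecurity",
]

@dataclass
class SafetyAssessment:
    organisation: str
    status: dict = field(default_factory=lambda: {a: "not started" for a in AREAS})

    def mark(self, area: str, state: str) -> None:
        """Record progress against one assessment area."""
        if area not in self.status:
            raise KeyError(f"unknown assessment area: {area}")
        self.status[area] = state

    def outstanding(self):
        """List the areas not yet marked complete."""
        return [a for a, s in self.status.items() if s != "complete"]

if __name__ == "__main__":
    sa = SafetyAssessment("Example HAV developer")
    sa.mark("Vehicle cybersecurity", "complete")
    print(len(sa.outstanding()), "areas still open")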

All the driverless cars at the announcement of these regulations in September used one particular technology supplier, called Autonomous Stuff. Its ‘Automated Research Development Platform’ was used for the University of Michigan’s Mcity car, and it supplied technology for the driverless cars from Carnegie Mellon University, MIT, Stanford, University of California Berkeley, University of Michigan and the Virginia Tech Transportation Institute. It supplies sensors and middleware software such as Polysync, and this is being used by Kia for a self-driving Soul model.

Conclusion

2017 promises to be even more significant as the sensor and software technology matures. Apple has been developing technology for self-driving cars, and whether it will move into hardware or focus on software remains to be seen. Self-driving taxis and trucks will be rolling out across the world, with real-world uses. That of course has led to problems. Google’s self-driving car has already had several crashes, and Tesla’s Autopilot, while not a fully autonomous control system, has also had problems with sensors leading to accidents.

However, the fact that a wide range of different autonomous platforms were mature enough in 2016 to be used commercially on public roads is a huge shift. More will roll out in 2017, especially for mass transit, ready for autonomous cars and trucks to be available on the road in the 2020s.
