Could The Celex Ultrafast High-Contrast Camera Change Autonomous Cars?

A late-stage prototype from NTU Singapore operates at nanosecond intervals and requires less processing power than conventional optical cameras.

NTU Singapore

A self-driving car's ability to "see" the world around it, and to recognize how that environment is changing in real time, is perhaps the most important factor in determining how safe that autonomous vehicle is.

Cameras are just one of the technologies self-driving systems use right now—radar and lasers are also popular—but those camera systems are simply very advanced optical cameras, which are themselves an old technology. A team of engineers at Nanyang Technological University Singapore (NTU Singapore), however, is in the late prototype stages of a new type of camera that operates far faster than optical cameras and requires less processing power for an autonomous system to interpret what it captures, both major advantages for self-driving cars.

A Different Type of Camera

The Celex prototype high-contrast camera, developed by the team under Assistant Professor Chen Shoushun, records changes in light intensity between scenes at ultrafast nanosecond intervals. Using this data, an autonomous system can detect changes such as movement or new objects in real time. Celex uses a built-in processor circuit that measures changes in light intensity at the level of individual pixels on the sensor, rather than across a whole image as an optical camera does. This reduces the amount of data an autonomous system needs to analyze, which shortens the computer's reaction time. It also allows the high-contrast camera to differentiate between objects in the background and the foreground, and it is not confused by bright lights, weather, or visually complex scenes, according to New Atlas.

The prototype began life back in 2009, but Chen says it may be commercially available by the end of the year.

"Our new camera can be a great safety tool for autonomous vehicles, since it can see very far ahead like optical cameras but without the time lag needed to analyze and process the video feed. With its continuous tracking feature and instant analysis of a scene, it complements existing optical and laser cameras and can help self-driving vehicles and drones avoid unexpected collisions that usually happen within seconds." —Assistant Professor Chen Shoushun

Check out the Celex in action, below.

The full release from NTU Singapore

Scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed an ultrafast high-contrast camera that could help self-driving cars and drones see better in extreme road conditions and in bad weather.

Unlike typical optical cameras, which can be blinded by bright light and unable to make out details in the dark, NTU’s new smart camera can record the slightest movements and objects in real time.

The new camera records the changes in light intensity between scenes at nanosecond intervals, much faster than conventional video, and it stores the images in a data format that is many times smaller as well.

With a unique in-built circuit, the camera can do an instant analysis of the captured scenes, highlighting important objects and details.

Developed by Assistant Professor Chen Shoushun from NTU’s School of Electrical and Electronic Engineering, the new camera named Celex® is now in its final prototype phase.

“Our new camera can be a great safety tool for autonomous vehicles, since it can see very far ahead like optical cameras but without the time lag needed to analyse and process the video feed,” explained Asst Prof Chen.

“With its continuous tracking feature and instant analysis of a scene, it complements existing optical and laser cameras and can help self-driving vehicles and drones avoid unexpected collisions that usually happen within seconds.”

Asst Prof Chen unveiled the prototype of Celex® last month at the 2017 IS&T International Symposium on Electronic Imaging (EI 2017) held in the United States.

It received positive feedback from conference attendees, many of whom were academics and top industry players.

How it works
How it works
A typical camera sensor has several million pixels: sensor sites that record light information and are used to form the resulting picture.

High-speed video cameras that record up to 120 frames per second generate gigabytes of video data, which must then be processed by a computer for a self-driving vehicle to “see” and analyse its environment.

The more complex the environment, the slower the processing of the video data, leading to lag times between “seeing” the environment and the corresponding actions that the self-driving vehicle has to take.
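For a sense of scale, a back-of-envelope calculation shows why frame-based video is so demanding. The sensor resolution and pixel depth below are assumptions for illustration (a generic 1920×1080 camera at 3 bytes per pixel), not Celex or competitor specifications:

```python
# Rough raw data rate of a conventional 120 fps camera
# (hypothetical 1920x1080 sensor, 3 bytes per pixel -- illustrative only).
width, height, bytes_per_pixel, fps = 1920, 1080, 3, 120
rate = width * height * bytes_per_pixel * fps  # bytes per second
print(f"{rate / 1e9:.2f} GB/s")  # → 0.75 GB/s
```

Even before any analysis, a self-driving system would have to move and inspect on the order of three-quarters of a gigabyte of pixel data every second under these assumptions.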

To enable instant processing of visual data, NTU’s patent-pending camera records changes in the light intensity of individual pixels at its sensor, which reduces the data output. This avoids the need to capture the whole scene as a photograph, increasing the camera’s processing speed.
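Celex's on-sensor circuit is proprietary, but the per-pixel change detection described here can be sketched in a few lines. The sketch below is illustrative only: the function name, the threshold, and the event format are assumptions, not the actual Celex design.

```python
import numpy as np

def pixel_events(prev, curr, threshold=0.15):
    """Emit (row, col, polarity) events only for pixels whose
    intensity changed by more than `threshold`, instead of
    re-transmitting every pixel of every frame.
    `prev` and `curr` are 2-D arrays of intensities in [0, 1]."""
    diff = curr.astype(float) - prev.astype(float)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    polarity = np.sign(diff[rows, cols]).astype(int)  # +1 brighter, -1 darker
    return list(zip(rows.tolist(), cols.tolist(), polarity.tolist()))

# A static scene produces no events; only the changed pixel is reported.
prev = np.zeros((4, 4))
curr = prev.copy()
curr[2, 1] = 0.8  # one pixel gets brighter
print(pixel_events(prev, curr))  # → [(2, 1, 1)]
```

The payoff is in the output size: a static scene yields an empty event list, so downstream processing scales with how much the scene changes rather than with the sensor's resolution.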

The camera sensor also has a built-in processor that can analyse the flow of data instantly to differentiate between foreground objects and the background, a technique known as optical flow computation. This innovation gives self-driving vehicles more time to react to any oncoming vehicles or obstacles.
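Optical flow computation proper is more involved, but the underlying intuition (pixels that keep changing belong to moving foreground objects, while static pixels are background) can be approximated with simple frame differencing. This is a crude stand-in for illustration, not the Celex method; the function name and thresholds are assumptions:

```python
import numpy as np

def foreground_mask(frames, change_threshold=0.1, min_changes=2):
    """Mark a pixel as foreground if its intensity changed in at least
    `min_changes` of the consecutive frame pairs; all other pixels
    are treated as static background."""
    frames = np.asarray(frames, dtype=float)
    changed = np.abs(np.diff(frames, axis=0)) > change_threshold
    return changed.sum(axis=0) >= min_changes

# Static background stays dark; one pixel flickers as an object passes.
frames = np.zeros((3, 3, 3))       # three 3x3 frames
frames[1, 1, 1] = 1.0              # bright in the middle frame only
mask = foreground_mask(frames)
print(mask[1, 1])  # → True (it changed twice: dark->bright->dark)
```

Because the changed-pixel data is already sparse, this kind of foreground test can run on far less data than segmenting full frames, which is the advantage the press release attributes to the on-sensor processor.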

The research into the sensor technology started in 2009, and it has received $500,000 in funding from a Ministry of Education Tier 1 research grant and a Singapore-MIT Alliance for Research and Technology (SMART) Proof-of-Concept grant.

The technology was also described in two academic journals published by the Institute of Electrical and Electronics Engineers (IEEE), the world’s largest technical professional organisation for the advancement of technology.

Commercialisation potential
With keen interest from the industry, Asst Prof Chen and his researchers have spun off a start-up company named Hillhouse Tech to commercialise the new camera technology. The start-up is incubated by NTUitive, NTU’s innovation and enterprise company.

Asst Prof Chen expects the new camera to be commercially ready by the end of this year, as the team is already in talks with global electronics manufacturers.