
Event-based Vision Sensor—“Metavision”—Promises Impressive Specs

source link: https://www.allaboutcircuits.com/news/event-based-image-sensor-metavision-boasts-impressive-specs/



By Jake Hertz

PROPHESEE has teamed up with iCatch to provide the industry with the "world's first event-based vision sensor" in a 13 x 15 mm mini PBGA package.

As computer vision applications gain momentum, companies are continually investing in new ways to improve the technology. At the same time, as imaging performance improves, power efficiency and data management become significant challenges at the hardware level.

An example computer vision block diagram. Image used courtesy of Cojbasic et al

One proposed solution to this challenge is ditching conventional imaging techniques in favor of event-based vision. Aiming to capitalize on this type of technology, PROPHESEE, in collaboration with iCatch, this week released a new event-based vision sensor that boasts some impressive specs.

This article will discuss the concept of event-based vision and the benefits it offers, then dissect PROPHESEE's newest offering.

Challenges in Conventional Vision

One of the significant challenges in imaging systems is that, as they improve by conventional measures, they tend to put more stress on the supporting hardware. Notably, as resolution and field of view increase, so does the amount of raw data produced by the camera.

While this may be a positive thing in terms of imaging quality, it creates a plethora of challenges for supporting hardware.
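To put rough numbers on this, the short Python sketch below estimates the raw throughput of a conventional frame-based sensor; the resolution, bit depth, and frame rate are illustrative assumptions rather than figures from the article.

```python
# Back-of-the-envelope raw data rate of a conventional frame-based sensor.
# Resolution, bit depth, and frame rate are assumed values for illustration,
# not specifications quoted by PROPHESEE or iCatch.

width, height = 1920, 1080   # assumed 1080p sensor
bits_per_pixel = 10          # assumed ADC bit depth
frames_per_second = 60       # assumed frame rate

bits_per_second = width * height * bits_per_pixel * frames_per_second
print(f"Raw throughput: {bits_per_second / 1e9:.2f} Gbit/s "
      f"(~{bits_per_second / 8 / 1e6:.0f} MB/s)")
# ~1.24 Gbit/s (~156 MB/s); every increase in pixel count or frame rate scales
# this figure linearly, which is the growing hardware burden described above.
```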

An example block diagram of a general image sensor. Image used courtesy of Microsoft and LiKamWa et al

This increase in data traffic places a greater burden on computing resources, which must process more data at higher speeds to maintain real-time operation. On top of this, conventional imaging systems apply the same frame rate to every object in the scene, so moving objects may end up undersampled and important information in the scene can be lost.

When applied to machine learning, this increase in data traffic translates into higher latency and more power consumed to complete a task. At the same time, much of the data being processed may not even be the essential information within a scene, which further adds to the system's wasted energy and latency.

These problems become even more concerning when coupled with the increasing demand for low-power, low-latency systems.

Solutions With Event-Based Vision?

One promising solution that attempts to alleviate these issues is event-based vision.

Event-based vision (right) aims to remove redundant information from conventional vision (left) systems. Image used courtesy of PROPHESEE

The concept of event-based vision rejects traditional frame-based imaging approaches, where every pixel reports back everything it sees at all times. 

Instead, event-based sensing relies on each pixel to report what it sees only when it senses a significant change in its field of view. By producing data only when an event occurs, event-based sensing significantly reduces the raw amount of data created by imaging systems while also ensuring that the data it does produce is full of useful information.

Overall, the direct result of this type of sensing technology is that machine learning algorithms have less data to process, meaning lower power consumption and lower latency overall.
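As a purely conceptual illustration (this is not PROPHESEE's pixel circuit or the Metavision data format), the Python sketch below models the core idea: a pixel emits an event only when its intensity changes by more than a threshold, so a mostly static scene produces very few events compared with full frames.

```python
import numpy as np

# Conceptual model of event-based sensing (illustrative only; not PROPHESEE's
# pixel design or the Metavision data format). Each pixel emits an event only
# when its intensity has changed by more than a contrast threshold since the
# last event it produced.

def events_from_frames(frames, threshold=0.15):
    """Return (t, x, y, polarity) events derived from a stack of frames."""
    reference = frames[0].astype(float)      # last intensity each pixel reported
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        diff = frame.astype(float) - reference
        changed = np.abs(diff) > threshold   # only these pixels "fire"
        ys, xs = np.nonzero(changed)
        for x, y in zip(xs, ys):
            events.append((t, x, y, 1 if diff[y, x] > 0 else -1))
        reference[changed] = frame[changed]  # update only the pixels that fired
    return events

# A mostly static 64x64 scene containing one small moving bright square.
frames = np.full((30, 64, 64), 0.2)
for t in range(30):
    frames[t, 10:14, t:t + 4] = 0.9          # object moves one pixel per frame

events = events_from_frames(frames)
frame_samples = 29 * 64 * 64                 # pixel values a frame camera would output
print(f"{len(events)} events vs {frame_samples} pixel samples "
      f"(~{frame_samples / len(events):.0f}x less data in this toy example)")
```

In a real event-based sensor this change detection happens in circuitry at each pixel rather than in software, but the data-reduction principle is the same.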

The Metavision Sensor

This week, PROPHESEE, in collaboration with iCatch, announced the release of its brand new event-based imaging sensor.

Dubbed the "Metavision sensor," the new IC leverages specialized pixels that respond only to changes in their field of view, activating themselves independently when triggered by events. While the underlying technology is not entirely novel, PROPHESEE claims that Metavision is the world's first event-based vision sensor available in an industry-standard package, a 13 x 15 mm mini PBGA.

The new Metavision sensor. Image used courtesy of PROPHESEE

From a hardware perspective, the new sensor appears to be very impressive, offering specs such as a >10k fps time-resolution equivalent, >120 dB dynamic range, and 3 nW/event power consumption.

From a compute perspective, Metavision promises anywhere from 10 to 1000x less data than frame-based solutions, a throughput of more than 1,000 objects per second, and detection of motion-period irregularities down to 1%.
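To see how a claim like "10 to 1000x less data" can arise, the quick comparison below pits an assumed event stream against an assumed frame stream; the bytes-per-event, event rate, and frame-camera settings are illustrative guesses, not published Metavision figures.

```python
# Rough data-volume comparison between an event stream and a frame stream.
# All numbers below are illustrative assumptions, not Metavision specifications.

bytes_per_event = 8                      # assumed: coordinates, timestamp, polarity
event_rate = 300_000                     # assumed events/s for a moderately active scene
event_stream = bytes_per_event * event_rate

frame_stream = 1280 * 720 * 1.25 * 60    # assumed 720p, 10-bit pixels, 60 fps

print(f"Event stream: {event_stream / 1e6:.1f} MB/s")
print(f"Frame stream: {frame_stream / 1e6:.1f} MB/s")
print(f"Reduction:    ~{frame_stream / event_stream:.0f}x")
# ~29x here; a nearly static scene pushes the ratio far higher, while a very
# busy scene pushes it lower, hence the wide 10-1000x range quoted above.
```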

Push for More Event-based Vision

With Metavision, PROPHESEE and iCatch appear to have brought an exciting and promising new technology to an industry-standard format, making it more accessible for engineers everywhere. 

Thanks to this, the companies are hopeful that event-based vision can begin to permeate the industry and bring its benefits along with it.

