An eye for an AI: Optic device mimics human retina
source link: https://www.sciencefocus.com/news/an-eye-for-an-ai-optic-device-mimics-human-retina/
The device responds to changes in what it sees, which could help it spot objects much more quickly.
10th December, 2020 at 08:33
If our artificial intelligence is able to think like a human brain, why do we feed it data like a normal computer? Scientists are addressing this question by considering the sensory input we receive: researchers in Oregon have developed an optical device inspired by the workings of the human eye, a sensor that could make robotic components far more efficient.
Built from ultrathin layers of photosensitive perovskite, a material more commonly used in solar cells, the device adapts its signal as it senses different intensities of light. Perovskites are crystalline materials composed of positively charged metal cations and negatively charged oxygen or halide anions, arranged in a distinctive lattice.
It is this charged lattice structure that gives perovskites their unique properties, as atomic-level changes to the structure can alter their electrical behaviour. These properties make perovskites excellent semiconductors, able to switch from insulating electricity to conducting it.
Unlike solar cells, these devices do not store the light they receive and use it as energy; instead, they respond to changes in illumination. In doing so, these new ‘retinomorphic’ sensors send signals that allow the scene in front of them to be processed on the basis of changes in light.
Dr John Labram, Assistant Professor of Electrical and Computer Engineering, was initially inspired by a biology lecture he played in the background, which detailed how the human brain and eyes work. Our eyes contain photoreceptors that are sensitive to changes in light, but less responsive to constant illumination. From this, he started sketching devices that could mimic the processing behaviour of these photoreceptors.
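That photoreceptor behaviour — a strong response to changes in light that fades under steady illumination — can be illustrated with a toy model. The first-order "leaky differentiator" below is an illustrative assumption for this article, not the circuit described in Labram's paper: the pixel's output is the deviation from an adapted baseline, and the baseline slowly catches up with the input.

```python
def retinomorphic_response(intensity, tau=5.0):
    """Toy model of a 'retinomorphic' pixel.

    Returns a voltage-like output for a series of light-intensity
    samples: the output jumps when the intensity changes, then
    relaxes back toward zero as the pixel adapts, with time
    constant `tau` (in samples).
    """
    output = []
    baseline = intensity[0]  # the level the pixel has 'got used to'
    for i in intensity:
        response = i - baseline           # respond only to deviation
        baseline += (i - baseline) / tau  # slowly adapt to the new level
        output.append(response)
    return output

# Constant light, then a sudden step (e.g. an object entering the view):
light = [1.0] * 10 + [5.0] * 10
signal = retinomorphic_response(light)

print(round(signal[5], 3))   # → 0.0   (constant light: no response)
print(round(signal[10], 3))  # → 4.0   (spike at the change)
print(round(signal[19], 3))  # → 0.537 (response fades as the pixel adapts)
```

Under constant illumination the model outputs nothing at all; only the transition produces a signal, which is exactly the property that makes motion stand out.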
Such changes are often associated with motion, making this an important development for the field of artificial intelligence. Looking out across a beach, our eyes are drawn to changes: a huge, curling wave, or a seagull swooping down to steal our chips. By prioritising information in this way, we take less time to interpret our surroundings.
For artificial intelligence, this translates to simpler, more efficient processing at the visual-input level, meaning AI systems could bring together different types of information much more quickly than they currently do.
“You can imagine these sensors being used by a robot tracking the motion of objects. Anything static in its field of view would not elicit a response, however a moving object would be registering a high voltage. This would tell the robot immediately where the object was, without any complex image processing,” said Dr Labram.
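Labram's robot example can be sketched in a few lines. The frame differencing below is a stand-in, assumed for illustration, for the change detection the sensor would perform in hardware: with per-pixel outputs that are only non-zero where the scene changes, locating a moving object reduces to finding the pixels whose response crosses a threshold, with no conventional image processing.

```python
def locate_motion(prev_frame, frame, threshold=0.5):
    """Return indices of pixels whose change in light exceeds a threshold.

    Stands in for a retinomorphic sensor array: static regions give no
    output, so only the moving object's pixels are reported.
    """
    return [i for i, (a, b) in enumerate(zip(prev_frame, frame))
            if abs(b - a) > threshold]

# A static background with one bright object moving from pixel 2 to pixel 3:
prev_frame = [0.1, 0.1, 0.9, 0.1, 0.1]
frame      = [0.1, 0.1, 0.1, 0.9, 0.1]

print(locate_motion(prev_frame, frame))  # → [2, 3]
```

The static pixels never appear in the output, so the robot gets the object's location directly rather than having to segment a full image first.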
Currently, computers receive information in a step-by-step way, processing inputs as a series of data points, whereas this technology helps build a more integrated system. Researchers in artificial intelligence are attempting to emulate the human brain, which contains a network of neurons: communicating cells able to process information in parallel. Labram’s research is an essential step in this direction, with the potential to be scaled up for robotics, image recognition and self-driving cars.
Why do we make robots look like humans?
Asked by: Roberta Wild, Lincoln
We’ve always been fascinated by the idea of creating autonomous machines that resemble us, and if they need to interact closely with us, we prefer them to look familiar.
Human-like robots such as Honda’s ASIMO, Boston Dynamics’ Atlas, and the childlike iCub built by the Italian Institute of Technology are amazing demonstrations of our technology, but they still have a long way to go – and when they look nearly human but not quite, they end up looking seriously freaky to us.
Perhaps we should just let robots be the shape they need to be, in order to best carry out their function.