
Everyday Objects Can Run Artificial Intelligence Software

source link: https://science.slashdot.org/story/22/01/29/049206/everyday-objects-can-run-artificial-intelligence-software

Everyday Objects Can Run Artificial Intelligence Software (science.org)

Posted by EditorDavid

on Saturday January 29, 2022 @11:34AM from the object-oriented-programming dept.

Slashdot reader sciencehabit quotes Science magazine: Imagine using any object around you—a frying pan, a glass paperweight—as the central processor in a neural network, a type of artificial intelligence that loosely mimics the brain to perform complex tasks. That's the promise of new research that, in theory, could be used to recognize images or speech faster and more efficiently than computer programs that rely on silicon microchips.

To demonstrate the concept, the researchers built neural networks in three types of physical systems, each of which contained up to five processing layers. In each layer of a mechanical system, they used a speaker to vibrate a small metal plate and recorded its output with a microphone. In an optical system, they passed light through crystals. And in an analog-electronic system, they ran current through tiny circuits.
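Conceptually, each physical layer takes an incoming signal, transforms it according to the system's fixed physics, and passes the result to the next layer. The toy sketch below models that idea in NumPy; the class name, layer sizes, and the `tanh` stand-in for the physical nonlinearity are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

class PhysicalLayer:
    """Toy stand-in for one physical processing layer (vibrating plate,
    optical crystal, or analog circuit). The fixed mixing matrix models
    the system's intrinsic physics; `params` are the trainable numbers
    encoded into the input alongside the data."""

    def __init__(self, n_in, n_params, n_out):
        self.physics = rng.standard_normal((n_in + n_params, n_out))
        self.params = np.zeros(n_params)

    def __call__(self, signal):
        # Data and control parameters enter the system together,
        # mirroring how both were encoded in sound, light, or voltage.
        combined = np.concatenate([signal, self.params])
        return np.tanh(combined @ self.physics)  # saturating response

# Chain five layers, the maximum depth used in the demonstrations.
layers = [PhysicalLayer(n_in=8, n_params=4, n_out=8) for _ in range(5)]
signal = rng.standard_normal(8)
for layer in layers:
    signal = layer(signal)
print(signal.shape)  # the signal keeps its width as it passes through
```

The key design point this captures is that the "weights" of the network live partly in uncontrollable physics (the fixed matrix) and partly in a small set of adjustable parameters fed in with the data.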

In each case, the researchers encoded input data, such as unlabeled images, in sound, light, or voltage. For each processing layer, they also encoded numerical parameters telling the physical system how to manipulate the data. To train the system, they adjusted the parameters to reduce errors between the system's predicted image labels and the actual labels.
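The article does not spell out the exact update rule, only that the parameters were adjusted to shrink the gap between predicted and actual labels. The sketch below illustrates that idea with a simple gradient-free perturbation scheme on a toy simulated system; the dataset, system, and step size are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "physical system": a fixed random transform of [data, params],
# standing in for the plate, crystal, or circuit. Only `params` can
# be tuned; the transform itself is set by physics.
W = rng.standard_normal((6, 2))

def system_output(x, params):
    return np.tanh(np.concatenate([x, params]) @ W)

def loss(params, X, y):
    # Squared error between the system's predicted scores and
    # one-hot encodings of the true labels.
    preds = np.array([system_output(x, params) for x in X])
    targets = np.eye(2)[y]
    return np.mean((preds - targets) ** 2)

# Tiny two-class dataset, purely for illustration.
X = rng.standard_normal((20, 4))
y = (X[:, 0] > 0).astype(int)

# Nudge each parameter up or down and keep only changes that reduce
# the error -- a crude stand-in for the training the article describes.
params = np.zeros(2)
step = 0.1
for _ in range(200):
    for i in range(len(params)):
        base = loss(params, X, y)
        params[i] += step
        if loss(params, X, y) >= base:
            params[i] -= 2 * step          # try the other direction
            if loss(params, X, y) >= base:
                params[i] += step          # neither helps; revert

print(loss(params, X, y))  # error after tuning, at most the untrained error
```

Because only error-reducing moves are kept, the final loss can never exceed the untrained loss; real physical systems would additionally have to contend with measurement noise on every evaluation.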

In one task, they trained the systems, which they call physical neural networks (PNNs), to recognize handwritten digits. In another, the PNNs recognized seven vowel sounds. Accuracy on these tasks ranged from 87% to 97%, they report in this week's issue of Nature. In the future, researchers might tune a system not by digitally tweaking its input parameters, but by adjusting the physical objects—warping the metal plate, say.

The team is most excited about PNNs' potential as smart sensors that can perform computation on the fly. A microscope's optics might help detect cancerous cells before the light even hits a digital sensor, or a smartphone's microphone membrane might listen for wake words. These "are applications in which you really don't think about them as performing a machine-learning computation," they say, but instead as being "functional machines."
