For the first time, researchers at Penn State University have combined sensors to mimic the neural circuitry the brain uses to integrate multiple senses. The group, led by Indian-origin scientist Saptarshi Das, paired a tactile sensor with a visual sensor so that the output of one could influence the other, with the aid of visual memory.
The researchers built the multisensory neuron by wiring together a tactile sensor and a phototransistor made from a monolayer of molybdenum disulfide, a compound with unusual electrical and optical characteristics that make it well-suited for use in light detection and as a substrate for transistors.
To process both visual and tactile information, the device produces electrical spikes, much as neurons do. The tactile sensor relies on the triboelectric effect, in which two layers sliding against one another generate electricity, to encode touch input.
To simulate visual input, the researchers shone light onto the phototransistor, which can remember what it has seen, much as a person can recall the layout of a room after a quick flash illuminates it.
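To make those two ideas concrete, here is a minimal toy sketch in Python. It is purely illustrative, not the team's device model: the threshold and decay values, and the function names, are assumptions chosen for clarity. It encodes a touch signal as spikes and lets a flash of light leave a slowly fading "memory" trace.

```python
# Purely illustrative toy model (not the Penn State device physics):
# touch becomes spikes when a signal crosses a threshold, and a light
# flash leaves a decaying "memory" trace, loosely mimicking the
# phototransistor's ability to remember what it has seen.

SPIKE_THRESHOLD = 0.6   # assumed value, arbitrary units
MEMORY_DECAY = 0.9      # assumed fraction of the light trace kept per step

def encode_touch_spikes(signal):
    """Emit a spike (1) whenever the touch signal crosses the threshold."""
    return [1 if s >= SPIKE_THRESHOLD else 0 for s in signal]

def visual_memory_trace(flashes, steps):
    """Track the light 'memory': a flash charges the trace, which then fades."""
    trace, history = 0.0, []
    for t in range(steps):
        trace = trace * MEMORY_DECAY + flashes.get(t, 0.0)
        history.append(round(trace, 3))
    return history

print(encode_touch_spikes([0.2, 0.7, 0.5, 0.9]))   # [0, 1, 0, 1]
print(visual_memory_trace({1: 1.0}, steps=6))      # flash at t=1, then a fading trace
```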
The researchers found that the neuron's sensory response increased when weak tactile and visual signals were combined, meaning the two sensors together produced a stronger output than either could alone, much like the human brain, where “one sense can influence another and allow the person to better judge a situation,” Das said in a Penn State news release.
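A similarly hedged toy model can illustrate that super-additive effect: on their own, weak touch and light cues fall below a response threshold, but an interaction term lets their combination clear it. The threshold and coupling constant below are arbitrary illustrative choices, not measured device parameters.

```python
# Illustrative toy model of super-additive multisensory integration;
# THRESHOLD and coupling are assumptions, not measurements.

THRESHOLD = 0.5

def unisensory_response(stimulus):
    """A single sensor responds only to the part of a stimulus above threshold."""
    return max(0.0, stimulus - THRESHOLD)

def multisensory_response(touch, light, coupling=1.5):
    """Cues interact multiplicatively, so two weak inputs can together
    clear the threshold that neither clears alone."""
    combined = touch + light + coupling * touch * light
    return max(0.0, combined - THRESHOLD)

touch, light = 0.4, 0.4
print(unisensory_response(touch))           # 0.0 -- too weak alone
print(unisensory_response(light))           # 0.0 -- too weak alone
print(multisensory_response(touch, light))  # 0.54 -- together they respond
```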
Das explained that the artificial multisensory neuron system could make sensor technology more efficient, paving the way for more eco-friendly uses of AI. It could help robots, drones, and self-driving vehicles navigate their environments more effectively while using less energy.