“Neural dust” is a term coined by UC Berkeley’s EECS department to describe tiny implantable sensors. In a paper released this month, Berkeley researchers revealed that they have recorded the first in-vivo readings from implanted dust.

This research has been a long time coming. In 2013, the team published work detailing its use of ultrasound with CMOS circuitry. In 2015, it released another paper that focused further on theory, modeling, and scaling.

The prototype in this most recent announcement is a step toward sensors that can be safely implanted in the brain. It’s also a step toward a future where wearable technology could be implanted directly inside the body.
Sensors are at the heart of the Internet of Things (IoT) revolution, and most applications will deploy multiple sensors, including an image sensor. The more compelling home automation products tend to deploy cameras, which are commonly based around a CMOS image sensor; coupled with sophisticated computer vision algorithms, these look set to become the ‘brains’ of the smart home.