Making more sustainable smart devices
Opportunity
The artificial neural network (ANN) is a powerful type of computing system that enables voice recognition in smartphones and smart appliances. Loosely patterned after the human brain, an ANN is composed of a complex web of discrete nodes that, like neurons, can accept information, process it and pass it on as output. As with other machine learning methods, ANN algorithms improve with use and can be trained extensively to achieve high levels of performance. For example, ANNs have allowed Siri, Apple's ubiquitous virtual assistant, to learn to keep an ear out for the keyword "Hey, Siri" with impressive accuracy.
The number of devices carrying virtual assistants is expected to reach 8.4 billion by 2024, and by 2023 some 275 million voice-controlled appliances will be powering smart homes. Because these smart devices constantly listen to their surroundings to detect their triggering keyword, or "wake word", all nodes of the ANN are always on, meaning the hardware continuously draws power. As a result, sustaining the ANN's computing prowess requires devices across the world to consume excessive amounts of electricity.
With the market expected to grow further, there is a pressing need for a more sustainable, energy-efficient way to run smart devices without sacrificing their keyword-spotting accuracy.
Technology
This technology uses a deep, convolutional spiking neural network (SNN) for keyword detection. Unlike conventional ANNs, SNNs process information in an event-driven manner: whereas every neuron in an ANN fires in response to any kind of input, nodes in an SNN activate asynchronously, and only when incoming signals raise their membrane potential beyond a specific firing threshold.
This means that when smart devices are idle, only some neurons remain awake to listen out for the keyword while the others lie dormant. Given the relative infrequency of voice commands, devices running SNNs would achieve higher computational and energy efficiency than those running ANNs.
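The threshold-gated, event-driven behaviour described above can be illustrated with a minimal sketch of a leaky integrate-and-fire spiking neuron. The parameter names and values below (threshold, leak rate) are illustrative assumptions, not details of the actual technology:

```python
class SpikingNeuron:
    """Minimal leaky integrate-and-fire neuron sketch (illustrative only)."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # firing threshold
        self.leak = leak            # per-step decay of the membrane potential
        self.potential = 0.0        # membrane potential

    def step(self, input_current):
        # Integrate the incoming signal on top of the decayed potential.
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return 1                # emit a spike: an event for downstream nodes
        return 0                    # stay silent: no downstream computation

neuron = SpikingNeuron()
quiet = [neuron.step(0.05) for _ in range(10)]  # weak background input
loud = [neuron.step(0.6) for _ in range(10)]    # strong, keyword-like input

print(sum(quiet), sum(loud))  # → 0 5
```

Under weak input the potential never reaches the threshold, so the neuron emits no events at all, which is the source of the idle-time energy savings; only sufficiently strong input produces spikes.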
Moreover, in contrast to the synchronous approach of ANNs, the SNN can make early classification decisions as soon as it is first activated, with the quality of those decisions improving over time as evidence accrues.
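One common way such "anytime" decisions are read out of an SNN is by accumulating output-layer spike counts per class, so that a rough classification is available after the first few timesteps and sharpens as spikes accrue. The sketch below assumes this spike-count readout; the spike trains are synthetic stand-ins, not real model output:

```python
# Synthetic output-layer spike trains for two classes (illustrative only).
spike_trains = {
    "keyword": [0, 1, 0, 1, 1, 1, 0, 1],  # wake-word output neuron
    "other":   [1, 0, 0, 0, 1, 0, 0, 0],  # background-speech output neuron
}

counts = {label: 0 for label in spike_trains}
decisions = []
for t in range(len(spike_trains["keyword"])):
    # Accumulate the evidence (spikes) seen so far for each class.
    for label, train in spike_trains.items():
        counts[label] += train[t]
    # A provisional decision is available at every timestep t.
    decisions.append(max(counts, key=counts.get))

print(decisions)
```

Early timesteps can misclassify (here the very first decision is "other"), but the readout converges as spike counts accumulate, which is the behaviour the paragraph above describes.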
One key concern with using SNNs is that they may not be as accurate as ANNs, compromising the device's performance. But according to unpublished data, SNNs can match ANNs in spotting multiple keywords and are even better at detecting single wake words. The SNN approach also synergises well with emerging ultra-low-power neuromorphic hardware, with the two combining to maximise their energy-saving potential.
An illustration of the spiking neural network (SNN) model. Here, an output spike is generated by the spiking neuron only when incoming signals raise its membrane potential beyond the firing threshold.

