Botanicus Interacticus is a new interactive plant technology that requires no special instrumentation of the plant itself. A single electrode placed in the soil can pick up the wide range of frequency responses produced by the plant, turning it into a multi-touch, gesture-sensitive controller.
Touché is a project developed at Disney Research that senses the frequencies a plant returns when different events act on it, and uses them to recognise complex human physical interactions with the plant. In other words, it can tell what kind of touch event has occurred: caressing, pinching, holding, tickling, and so on. Traditional capacitive sensors work by generating an electrical signal at a single frequency and applying it to a conductive surface, such as a piece of metal. The capacitance changes when a hand comes close enough to the surface or makes contact with it, and the resulting change in the signal is processed by a microcontroller such as an Arduino.
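The single-frequency approach can be sketched as a simple threshold on the return-signal amplitude. This is a minimal illustrative model, not Touché's or Arduino's actual implementation; the function names and the 20% threshold are assumptions for the sketch.

```python
# Minimal sketch of traditional single-frequency capacitive sensing.
# Names and threshold values are illustrative assumptions.

def is_touched(amplitude, baseline, threshold=0.2):
    """A touch increases capacitance, which attenuates the return
    signal; flag a touch when the amplitude drops well below the
    untouched baseline."""
    return (baseline - amplitude) / baseline > threshold

# Example: untouched baseline amplitude 1.0; a hand in contact
# drops it to 0.6, a 40% drop, which exceeds the 20% threshold.
print(is_touched(0.6, 1.0))   # → True
print(is_touched(0.95, 1.0))  # → False
```

Note that this yields a single data point per reading: touched or not. That limitation is exactly what swept-frequency sensing addresses.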
Swept-frequency capacitive sensing instead uses multiple frequencies. In their CHI 2012 paper, the Touché developers state the reason: “Objects excited by an electrical signal respond differently at different frequencies, therefore, the changes in the return signal will also be frequency dependent.”
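A sweep can be sketched as sampling the return-signal amplitude at each frequency in turn, yielding one profile vector per sweep. The frequency range below and the toy resonance model are assumptions for illustration, not measured plant data; `measure(f)` stands in for the analogue front end.

```python
import numpy as np

def sweep_profile(measure, freqs):
    """Excite the object at each frequency and record the return-signal
    amplitude, producing one feature vector per sweep."""
    return np.array([measure(f) for f in freqs])

# Hypothetical sweep: 200 frequencies between 1 kHz and 3.5 MHz.
freqs = np.linspace(1e3, 3.5e6, 200)

# Toy measurement model: a touch shifts the object's resonance peak,
# so the profile changes shape, not just overall level.
def measure_no_touch(f):
    return 1.0 / (1.0 + ((f - 1.0e6) / 5e5) ** 2)

def measure_touch(f):
    return 1.0 / (1.0 + ((f - 1.4e6) / 5e5) ** 2)

profile_a = sweep_profile(measure_no_touch, freqs)
profile_b = sweep_profile(measure_touch, freqs)
# The two profiles differ across many frequencies; that frequency-
# dependent difference is the extra information the technique exploits.
```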
Rather than using the single data point produced by an electrical signal at one frequency, as in traditional capacitive sensing, Touché collects multiple data points across the generated frequencies. Its machine learning pipeline is built around a Support Vector Machine; specifically, it uses the SVM module from the Gesture Recognition Toolkit.
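The pipeline can be sketched as follows, with scikit-learn's `SVC` standing in for the Gesture Recognition Toolkit's C++ SVM module, and synthetic profiles standing in for real sensor data. The class labels and profile shapes are assumptions for the sketch.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical training data: each row is one swept-frequency profile
# (200 amplitude values), labelled with the gesture that produced it.
# Real data would come from the sensor, not random noise.
def fake_profiles(centre, n=30, dim=200):
    return centre + 0.05 * rng.standard_normal((n, dim))

no_touch   = fake_profiles(np.full(200, 1.0))
leaf_touch = fake_profiles(np.full(200, 0.7))
X = np.vstack([no_touch, leaf_touch])
y = [0] * 30 + [1] * 30  # 0 = no touch, 1 = touch single leaf

# Train on labelled profiles, then classify a new sweep.
clf = SVC(kernel="linear").fit(X, y)
print(clf.predict(fake_profiles(np.full(200, 0.7), n=1)))  # → [1]
```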
Nick Arner has replicated this setup and experimented with different gestures to evaluate it. He placed two plants on his desk: a fern and an air plant. The Arduino kit connected to the fern is wired up as shown in the image below.
Arner built his version following the description in the Botanicus Interacticus paper. After inserting a wire into the soil, he trained the ESP system to recognise different gestures. Two kinds of touch events were recorded: “Touch Single Leaf” and “Caressing The Plant”.
Arner also demonstrated that the plant produces distinct frequency responses for two other kinds of touch events:
The model uses the Support Vector Machine algorithm for both classification and regression tasks. In the classification task, the algorithm detects what type of event has occurred, that is, what kind of frequency response the plant is producing. In the regression task, the system maps a continuous quantity to a control parameter; for instance, the distance travelled by a hand between two points on a plant stem could be mapped to the value of a volume slider.
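The regression side can be sketched like this, with scikit-learn's `SVR` standing in for the toolkit's regression module. The synthetic data assumes the profile shifts smoothly with hand position along the stem; the function names and the volume mapping are illustrative, not part of the original system.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)

# Hypothetical regression data: map a swept-frequency profile to a
# continuous hand position along the stem (0 = bottom, 1 = top).
positions = rng.uniform(0.0, 1.0, 50)
profiles = np.array(
    [np.full(20, 1.0 - 0.4 * p) + 0.01 * rng.standard_normal(20)
     for p in positions]
)

reg = SVR(kernel="rbf").fit(profiles, positions)

def volume_from_profile(profile, max_volume=100):
    """Map the estimated hand position on the stem to a volume value."""
    pos = float(reg.predict(profile.reshape(1, -1))[0])
    return int(np.clip(pos, 0.0, 1.0) * max_volume)

# A hand near the top of the stem yields a higher volume than a hand
# near the bottom.
low = volume_from_profile(np.full(20, 1.0 - 0.4 * 0.1))
high = volume_from_profile(np.full(20, 1.0 - 0.4 * 0.9))
```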
While watching the video, one can notice some erroneous instances. For example, during the transition from “No Hand” to “Resting Hand”, a spurious “Tickling” gesture shows up. This can be mitigated with the ClassLabelFilter from the Gesture Recognition Toolkit.
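The idea behind such a filter can be sketched as a vote over a short window of recent predictions: a label is only emitted once it appears often enough, so a one-frame glitch at a transition never reaches the output. This is a minimal sketch of the concept, not the Gesture Recognition Toolkit's actual ClassLabelFilter API; the parameter names are assumptions.

```python
from collections import Counter, deque

class LabelFilter:
    """Sketch of a class-label filter: emit a label only when it
    occurs at least `min_count` times in the last `window` frames,
    otherwise emit None (no confident decision)."""
    def __init__(self, window=5, min_count=3):
        self.buffer = deque(maxlen=window)
        self.min_count = min_count

    def filter(self, label):
        self.buffer.append(label)
        winner, count = Counter(self.buffer).most_common(1)[0]
        return winner if count >= self.min_count else None

f = LabelFilter(window=5, min_count=3)
stream = ["none", "none", "none", "tickle", "rest", "rest", "rest"]
filtered = [f.filter(label) for label in stream]
# The single spurious "tickle" frame never wins the vote, so it is
# absent from the filtered output.
```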
Nick Arner’s experiment with the ESP data can be found here. One drawback encountered during the experiment was that the model could not distinguish between the “Touching” and “Rubbing” actions. This may still be possible, but dependent on the type of plant involved: a plant with thicker, more “solid” leaves may return a conductive pattern better suited to discriminating individual leaf touches than the thin, loose leaves of the fern.
This interesting invention, still at a nascent stage, could be applied to home-security devices, for example to detect intruders or to flag faulty devices.