Korean engineers have developed an algorithm that lets a smartphone recognize an object simply by being knocked against it. After training, the algorithm's accuracy reached 99.7 percent for some objects, according to the work, which will be presented at the CHI 2018 conference.
There are several ways for smartphones to interact with physical objects. For example, objects can be tagged with Bluetooth or NFC tags. There are also applications, such as Google Lens, that use computer vision to recognize objects through the smartphone camera. But for this the camera must be aimed directly at the object, and the room must be sufficiently well lit.
A group of engineers led by Sung-Ju Lee of the Korea Advanced Institute of Science and Technology (KAIST) chose a different approach: recognizing an object from the tap, or knock, of a smartphone against it. Any smartphone is equipped with several sensors, and the engineers chose three that record the effects of a collision: the microphone, the accelerometer, and the gyroscope. When the phone strikes an object, the microphone registers the resulting sound, which is unique to each material, while the accelerometer and gyroscope record the movement and rotation of the smartphone.
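The idea of condensing a knock event into a feature vector can be sketched as follows. This is an illustrative Python sketch, not the authors' code: the function name, the choice of features (loudness, spectral centroid, peak acceleration, peak rotation rate), and the synthetic data are all assumptions.

```python
import numpy as np

def knock_features(audio, accel, gyro, sr=44100):
    """Condense one knock event into a small feature vector.

    audio: 1-D array of microphone samples around the impact
    accel, gyro: (N, 3) arrays of accelerometer / gyroscope readings
    (Names and features are illustrative, not from the paper.)
    """
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sr)
    # Spectral centroid: the "brightness" of the knock sound,
    # which differs between materials such as metal and paper.
    centroid = float(np.sum(freqs * spectrum) / np.sum(spectrum))
    return np.array([
        float(np.sqrt(np.mean(audio ** 2))),            # loudness (RMS)
        centroid,                                       # spectral centroid
        float(np.max(np.linalg.norm(accel, axis=1))),   # peak impact acceleration
        float(np.max(np.linalg.norm(gyro, axis=1))),    # peak rotation rate
    ])

# Synthetic knock: a decaying 2 kHz ping plus a short jolt on the IMU.
t = np.linspace(0, 0.05, 2205)
audio = np.sin(2 * np.pi * 2000 * t) * np.exp(-t * 80)
accel = np.zeros((50, 3)); accel[5] = [0.0, 0.0, 12.0]
gyro = np.zeros((50, 3)); gyro[5] = [0.4, 0.1, 0.0]
print(knock_features(audio, accel, gyro).shape)  # (4,)
```

In practice the real system would fuse many more features from all three sensor streams; the point here is only that a single tap yields a compact, material-dependent signature.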
To let the application distinguish objects, the developers used a classifier from the Weka machine learning package. After training the algorithm, they tested its effectiveness on 15 volunteers and 14 objects, including an aluminum can and a book. To make sure the application would work outside the laboratory as well, they ran two tests with each volunteer, one of them in a noisy environment. Participants were asked to knock the smartphone against each object 50 times, after which the data was collected and analyzed.
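The paper used Weka, a Java toolkit; as a rough stand-in, the same training setup (a supervised classifier over per-knock feature vectors, 50 knocks per object) can be sketched in Python with scikit-learn. All the data below is synthetic and the three "objects" and their feature means are invented for illustration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical data: 3 objects, 50 knocks each, 4 features per knock
# (loudness, spectral centroid, peak accel, peak gyro). Each object
# gets its own characteristic feature mean, as a real material would.
means = np.array([[0.2, 1500.0, 10.0, 0.3],   # e.g. "aluminum can"
                  [0.6, 3000.0, 18.0, 0.5],   # e.g. "ceramic mug"
                  [0.4,  800.0,  6.0, 0.2]])  # e.g. "book"
X = np.vstack([m + rng.normal(scale=[0.05, 100.0, 1.0, 0.05], size=(50, 4))
               for m in means])
y = np.repeat([0, 1, 2], 50)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, random_state=0, stratify=y)

# Scale features (they have very different units), then fit an SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))
```

Weka would typically be driven through its Java API or GUI instead, but the workflow is the same: collect labeled knocks, extract features, train, and evaluate on held-out data.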
It turned out that the application works very accurately: for some objects the accuracy reached 99.7 percent, and the lowest value was 94 percent. Moreover, noise had little effect on recognition accuracy. The engineers also proposed everyday uses for the application. For example, a user could tap a product in a store with a smartphone, and it would automatically be added to the cart of an online store for later ordering. Many other examples can be seen in the video:
Last year, researchers at the University of Washington (USA) developed a mobile application that uses a smartphone camera to analyze the pupils' reaction to bright light and determine whether a patient has suffered a serious brain injury. To keep the lighting uniform, the engineers used a 3D-printed box that resembles a virtual reality (VR) headset in size. A description of the invention is published on the university's website.