The User Experience Research Team Presents Up-to-date Technologies at CHI 2015!

    The user experience research team (UX team) of the Intel-NTU Connected Context Computing Center attended the annual ACM CHI Conference on Human Factors in Computing Systems (CHI 2015). Three of the team’s research papers were published at the conference, including one Honorable Mention paper. The researchers not only focus on developing the latest technologies, but also conduct research to bring these innovations into users’ daily lives.
SoberDiary

    Alcohol dependence is a chronic disorder. To quit drinking, patients have to participate in complicated follow-up maintenance programs that help them stay sober. Is it really impossible to walk away from alcohol on one’s own? The SoberDiary team develops a new approach to this problem: patients can now stay sober with the help of their smartphones.
    With face recognition technology and a pluggable alcohol sensor, users only need to blow into the sensor, and the smartphone will tell whether they have been drinking. This mechanism makes recording and tracking users’ drinking behavior easier, and simplifies the process of keeping a drinking diary. Visualizing the process of giving up drinking is another key factor in the success of SoberDiary. A storyboard of progress toward the goal helps patients realize that they are on the way to the ultimate goal: staying sober. The visualized steps give patients a sense of achievement at each stage of the treatment. Isn’t it wonderful?
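For a sense of how such a breath test might be logged, here is a minimal sketch in Python. The threshold, class names, and flow are our illustrative assumptions, not SoberDiary’s actual implementation:

```python
# Illustrative sketch (not SoberDiary's code): logging one breath-test entry.
# BAC_THRESHOLD and DiaryEntry are hypothetical names for illustration.
from dataclasses import dataclass
from datetime import datetime

BAC_THRESHOLD = 0.02  # assumed sobriety cutoff, in % BAC

@dataclass
class DiaryEntry:
    timestamp: datetime
    bac: float
    identity_verified: bool  # result of the face-recognition check
    sober: bool

def record_breath_test(bac: float, identity_verified: bool,
                       diary: list) -> DiaryEntry:
    """Append one breath-test result to the drinking diary."""
    entry = DiaryEntry(
        timestamp=datetime.now(),
        bac=bac,
        # Only a verified user with a clean reading counts as sober.
        identity_verified=identity_verified,
        sober=identity_verified and bac < BAC_THRESHOLD,
    )
    diary.append(entry)
    return entry

diary = []
entry = record_breath_test(0.0, identity_verified=True, diary=diary)
print(entry.sober)
```

Coupling the reading to a face-recognition check, as the paragraph above describes, is what prevents a patient from having someone else blow into the sensor on their behalf.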
    The SoberDiary app turns a smartphone into the best assistant for alcohol withdrawal. Having successfully supported alcohol withdrawal in a long-term study, SoberDiary received an Honorable Mention award at CHI 2015, an honor reserved for the top 5% of papers at the conference. Dr. Chuang-wen You, the leader of the SoberDiary research, said, “Like what SoberDiary tells its users when the goal is finally reached: we make a living by what we get, but make a life by what we give. We hope SoberDiary is not just an academic research project, but an innovation that helps the people, and their families, who are suffering from alcohol addiction.”


Cyclops
    Imagine the new interactions made possible by having an extra eye. This is what the second paper presents: Cyclops, a single-piece wearable device that sees its user’s whole-body postures from an ego-centric view. The view is obtained via a fisheye lens placed at the center of the user’s body, which sees only the user’s limbs and can interpret body postures effectively.
    Unlike currently available body-gesture input systems that depend on external cameras or motion sensors distributed all over the user’s body, Cyclops is a single-piece wearable device worn as a pendant or a badge. The main idea proposed in the paper is the observation of the limbs from a central location on the body. Owing to this ego-centric view, Cyclops turns posture recognition into a highly controllable computer vision problem. The paper demonstrates a proof-of-concept device and an algorithm for recognizing static and moving body gestures based on past motion images and a random decision forest. With Cyclops, complicated tasks can be accomplished with a simple wearable camera, which provides a solution with higher efficiency and lower cost than traditional approaches.
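As a toy illustration of one ingredient of such a pipeline, the sketch below updates a motion history image, in which recently moving pixels hold a high value that decays frame by frame; the grid size and decay horizon are our assumptions, and this is not the authors’ code:

```python
# Toy motion history image (MHI) update, the kind of temporal feature a
# gesture recognizer can feed into a classifier. Not the Cyclops
# implementation; TAU and the 3x3 grid are assumptions for illustration.

TAU = 5  # assumed decay horizon, in frames

def update_mhi(mhi, motion_mask, tau=TAU):
    """Set moving pixels to tau; decay all other pixels by one per frame."""
    return [
        [tau if moving else max(0, value - 1)
         for value, moving in zip(row, mask_row)]
        for row, mask_row in zip(mhi, motion_mask)
    ]

# Two frames on a 3x3 grid: the center pixel moves, then stops and decays.
mhi = [[0] * 3 for _ in range(3)]
mhi = update_mhi(mhi, [[0, 0, 0], [0, 1, 0], [0, 0, 0]])
mhi = update_mhi(mhi, [[0, 0, 0], [0, 0, 0], [0, 0, 0]])
print(mhi[1][1])  # 4: recent motion leaves a fading trace
```

A classifier such as a random decision forest can then be trained on these fading traces, since they encode not just where the limbs are but how they have recently moved.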
    Dr. Liwei Chan, the leader of the Cyclops research, said, “Augmented human is a new research domain opened up by the invention of wearable devices. With wearable devices containing small sensors and actuators, human physical abilities can be greatly extended. The next step of the Cyclops research is to install cameras of various sizes onto different parts of the human body, in order to provide more possibilities for interaction and services.”

WonderLens

    Because it is more comfortable to read and play with paper than with electronic displays, paper still plays an important role today. However, since the content printed on paper is static, it constrains the interactivity of printed paper. Hence, WonderLens presents a system of optical lenses and mirrors that enables tangible interactions on printed paper.
    When users perform spatial operations on the optical lenses and mirrors, the lenses deform the visual content printed on the paper, providing immediate dynamic visual and haptic feedback. The magnetic unit embedded in each lens and mirror allows it to be identified and tracked by an analog Hall-sensor grid placed behind the paper, so the system can provide additional auditory and visual feedback through different levels of embodiment, and further allows printed paper to present dynamic information to users. This interactive system can be applied to board games, tangible learning applications, and playful interactions for children.
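To illustrate the tracking idea, the hypothetical sketch below locates a magnet on a grid of analog Hall-sensor readings by picking the strongest value above an assumed noise floor; the sensor values, threshold, and function name are all made up for illustration:

```python
# Hypothetical sketch of locating a magnetic lens over a Hall-sensor grid.
# Not WonderLens's actual tracking code; readings and NOISE_FLOOR are invented.

NOISE_FLOOR = 10  # assumed minimum |reading| that counts as a magnet present

def locate_magnet(readings):
    """Return (row, col) of the strongest Hall reading, or None if no magnet."""
    best, best_pos = NOISE_FLOOR, None
    for r, row in enumerate(readings):
        for c, value in enumerate(row):
            # Use magnitude so either magnetic polarity is detected.
            if abs(value) > best:
                best, best_pos = abs(value), (r, c)
    return best_pos

grid = [
    [2, 3, 1],
    [4, 95, 5],   # a lens with an embedded magnet sits over sensor (1, 1)
    [1, 2, 3],
]
print(locate_magnet(grid))  # (1, 1)
```

In a real system each lens could carry a distinguishable magnetic signature (for example, a different field strength or polarity pattern), which is how a single sensor grid can both identify and track multiple objects.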
    “No matter how much digital devices replace or extend old technologies in our daily lives,” said Dr. Rong-hao Liang, the leader of the WonderLens research and also the co-founder of GaussToys, “the interaction with paper is still unique and essential. We hope WonderLens takes a step further to extend the current imagination of paper, and highlights new possibilities for paper with ubiquitous computing techniques.”