SIG Interaction

SIG Chair: Yi-Ping Hung (Professor, Dept. of Computer Science and Information Engineering, National Taiwan University)
Four sub-projects pertain to SIG Interaction: 

The Aims and Objectives of SIG Interaction (as indicated by its sub-projects):
1. Natural and Seamless Interaction
We have achieved the first-year KPI: identifying basic vocabularies for human-IoT communication with smart things, building a Dial kit for exploring lighting patterns, and investigating users' understanding of a single LED. We found that lighting-pattern designs can serve as an effective communication channel that lets users know what is going on in a smart system, for example Active, Approve, Exchange info, or Show problems. However, with only a single light, users find it difficult to perceive some of the more complex behaviors the system is trying to communicate, such as "I see you seeing me." To address this issue, we plan to investigate richer expressions with a unit consisting of 9 LEDs. We expect that the animation and position of multiple lights can convey a much wider range of messages.
We will first develop a scenario involving interaction among multiple machines and implement the corresponding lighting behaviors. Second, we will invite participants to experience the machine-to-machine communication and express their understanding of the lighting patterns shown on the artifacts. Besides this qualitative exploration, we also plan to find suitable matches between the lighting patterns and the main vocabularies using quantitative measurement approaches.
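To make this design space concrete, the sketch below shows one way a vocabulary-to-pattern mapping could be prototyped. It is a minimal illustration under our own assumptions, not the Dial kit's actual implementation: we assume the 9-LED unit is a 3x3 grid driven by a device-specific write_leds callback, and the pattern designs for Active and Exchange info are hypothetical.

```python
import time

# One animation frame: 9 brightness values (0.0-1.0), row-major over the 3x3 grid.
PATTERNS = {
    # "Active": a slow, uniform breathing pulse (hypothetical design).
    "Active": [[b] * 9 for b in (0.2, 0.5, 0.8, 1.0, 0.8, 0.5)],
    # "Exchange info": light travels column by column, left to right (hypothetical).
    "Exchange info": [
        [1, 0, 0, 1, 0, 0, 1, 0, 0],
        [0, 1, 0, 0, 1, 0, 0, 1, 0],
        [0, 0, 1, 0, 0, 1, 0, 0, 1],
    ],
}

def play(pattern_name, write_leds, fps=8.0, loops=3):
    """Replay a named pattern; write_leds(frame) is a device-specific callback."""
    for _ in range(loops):
        for frame in PATTERNS[pattern_name]:
            write_leds(frame)
            time.sleep(1.0 / fps)
```

In a study setting, play("Exchange info", write_leds) could be triggered whenever two artifacts exchange data, letting participants associate the traveling light with that vocabulary item.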
2. Finger-Worn Interaction Device for Connecting the Unconnected Things
In the MidasTouch project, we have proposed three wearable systems for recognizing gestures on everyday objects and providing immediate semantic haptic feedback: a fingerstall-like device that uses an RFID reader to reliably recognize gestures performed on different tagged everyday objects; a nail-mounted tactor array that displays spatial cues and character information to users in an eyes-free manner; and a set of spatiotemporal vibration patterns and guidelines for delivering alphanumeric characters on wrist-worn vibrotactile displays. These results were published at the international conferences MobileHCI and UIST in 2016.
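For readers unfamiliar with spatiotemporal vibrotactile encoding, the sketch below illustrates the general idea only; it is not the published MidasTouch design. The four-tactor wristband layout, the codebook entries, and the pulse driver interface are all hypothetical.

```python
import time

# Hypothetical codebook for a 4-tactor band worn around the wrist: each
# character maps to an ordered sequence of (tactor index, duration) pulses.
CODEBOOK = {
    "A": [(0, 0.15), (1, 0.15)],              # short clockwise pair
    "1": [(3, 0.15), (2, 0.15), (1, 0.15)],   # counter-clockwise sweep
}

def render(char, pulse):
    """pulse(tactor_index, duration_s) drives one vibration motor."""
    for tactor, duration in CODEBOOK[char]:
        pulse(tactor, duration)
        time.sleep(0.05)                       # inter-pulse gap aids discrimination
```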
To explore more expressive tactile feedback, we started the FingerTactor project, which creates animated vibrotactile motions on the finger, using two vibrotactile rings worn on the user's finger, to help users make sense of and gain better awareness of invisible information flows. Furthermore, our previous work mainly focused on the tactile type of haptic feedback. To enable haptic output with motions, we plan to extend the scope toward haptic force feedback. Our current progress includes seeking to simulate remote physical operations by jointly stimulating the proprioceptive senses at the arm and the tactile senses at the fingers and arms.
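The description above does not spell out how the animated motion is rendered; one plausible method, sketched here under our own assumptions, is to cross-fade amplitude between the two ring actuators so the vibration appears to travel along the finger (the classic phantom-motion effect). The set_amplitude driver interface is hypothetical.

```python
import time

def sweep(set_amplitude, duration_s=0.5, steps=20):
    """Cross-fade between two ring actuators to suggest motion along the finger.

    set_amplitude(ring_index, level) drives one actuator; level is in [0, 1].
    Ring 0 is the proximal ring, ring 1 the distal ring.
    """
    for i in range(steps + 1):
        t = i / steps                  # 0.0 at the proximal ring, 1.0 at the distal
        set_amplitude(0, 1.0 - t)      # fade out the proximal ring
        set_amplitude(1, t)            # fade in the distal ring
        time.sleep(duration_s / steps)
    set_amplitude(0, 0.0)              # silence both actuators at the end
    set_amplitude(1, 0.0)
```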
Furthermore, we foresee that haptic-integrated applications will keep increasing. The main difference between haptic feedback and other feedback types (e.g., visual and audio) is that haptic feedback is usually non-observable and non-shareable (only the person experiencing it knows it happened). These distinct features make haptic designs exceptionally difficult to debug and evaluate, particularly in an IoT environment where multiple haptic outputs may appear everywhere. We plan to explore this problem by visualizing various kinds of haptic feedback, e.g., as colors, turning haptic output into visual cues that are visible to and shareable between designers and users. The results will lay a good foundation for effective development tools for haptic design and evaluation.
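As a concrete illustration of what such a visualization might look like, the sketch below maps a vibration's frequency to hue and its amplitude to brightness, so co-located haptic outputs can be seen and compared on screen. This particular mapping is our own assumption for illustration; the project has not fixed a color scheme.

```python
import colorsys

def haptic_to_rgb(freq_hz, amplitude):
    """Map a vibration to an RGB color (50-300 Hz is typical for tactors).

    Frequency controls hue (low = red, high = blue); amplitude controls brightness.
    """
    hue = min(max((freq_hz - 50.0) / 250.0, 0.0), 1.0) * 0.8
    value = min(max(amplitude, 0.0), 1.0)
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, value)
    return int(r * 255), int(g * 255), int(b * 255)

# Example: a strong 250 Hz buzz renders as a bright blue-ish swatch.
print(haptic_to_rgb(250.0, 0.9))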
In addition to haptic interaction, we plan to explore using personal multi-view videos, captured by multiple wearable cameras worn by the user (e.g., on glasses and on finger rings), to capture and recognize users' activities in an eye-hand-coordinated manner. These multi-perspective views contain visual information with different focuses and levels of detail from the wearer's perspective; together they offer good potential for delivering real-time guidance, which can later be enhanced with the expressive haptic feedback produced by our haptic project and is useful for education and learning. Also, we will conduct a user study to identify potential issues beyond haptic signal delivery, such as the limitations of human haptic sensation.
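The recognition method is not specified in the plan above; one simple baseline, sketched here under our own assumptions, is late fusion: classify each camera view independently and average the per-view class probabilities before picking the activity label.

```python
import numpy as np

def fuse_views(view_probs):
    """Late fusion: average per-view class probabilities, return the winning label.

    view_probs: one probability vector per camera view, all in the same label order.
    """
    mean_probs = np.mean(np.stack(view_probs), axis=0)
    return int(np.argmax(mean_probs))

# Example with hypothetical labels [pick up, pour, idle]: the glasses view
# favors "pick up" and the ring view agrees more strongly, so fusion picks it.
glasses = np.array([0.6, 0.3, 0.1])
ring = np.array([0.7, 0.2, 0.1])
assert fuse_views([glasses, ring]) == 0
```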
3. Supporting Grounding in Distributed Communication with Attention Monitoring and Management
We have identified concrete design and technology-development opportunities for using IoT technologies to augment people's abilities to externalize and internalize work knowledge in physical task domains for better knowledge transfer. We have completed an initial qualitative study to identify user needs in this context, built a working, functional prototype, and conducted a pilot study to examine the feasibility of IoT-based Knowledge Keeping and Knowledge Transfer. This proof of concept opens up a space of research and design for the next step.
We have discussed extensively with our collaborators at Intel as well as members of the NTU IoX Center, and have decided to strengthen our collaboration with another project of the Center, MidasTouch. Next, we will plan and initiate a new user study to help us identify the specific feedback types and designs that best leverage the distributed nature of work knowledge for the purposes of knowledge capture, storage, sharing, and transfer. We also plan to work closely with MidasTouch to draw on their experience and expertise in designing on-body interfaces for information input and display. (Updated in February 2017)
