Haptic User Interface Integration for 3D Game Engines

dc.contributor.author ŞENGÜL, Gökhan
dc.contributor.author ÇAĞILTAY, Nergiz
dc.contributor.author ÖZÇELİK, Erol
dc.contributor.author TÜNER, Emre
dc.contributor.author EROL, Batuhan
dc.date.accessioned 2022-08-02T08:51:19Z
dc.date.available 2022-08-02T08:51:19Z
dc.date.issued 2014-08-02
dc.description.abstract Touch and feel senses of human beings provide important information about the environment. When those senses are integrated with eyesight, we can obtain all the necessary information about the environment. In human-computer interaction, eyesight information is provided by visual displays, while the senses of touch and feel are provided by special devices called "haptic" devices. Haptic devices are used in many fields such as computer-aided design, remote surgery, medical simulation environments, and training simulators for both military and medical applications. Besides touch sensations, haptic devices also provide force feedback, which allows realistic environments to be designed in virtual reality applications. Haptic devices can be categorized into three classes: tactile devices, kinesthetic devices, and hybrid devices. Tactile devices stimulate the skin to create contact sensations, kinesthetic devices apply forces to guide or inhibit body movement, and hybrid devices attempt to combine tactile and kinesthetic feedback. Among these, kinesthetic devices exert controlled forces on the human body, which makes them the most suitable type for applications such as surgical simulation.

In educational settings that require skill-based improvement, the senses of touch and feel are very important. In some cases, providing such an educational environment is very expensive and risky, and may also involve ethical issues. Surgical education is one of these fields: traditionally it takes place in the operating room on real patients, which is very expensive, requires long periods of time, does not allow trial-and-error learning, is stressful for both educators and learners, and raises several ethical concerns. Simulation environments supported by haptic user interfaces offer a safer educational alternative, and several studies provide evidence of the educational benefits of this type of training (Tsuda et al., 2009; Sutherland et al., 2006). Similarly, this technology can also be successfully integrated into the physical rehabilitation process for conditions that require motor-skill improvement (Kampiopiotis & Theodorakou, 2003). Hence, simulation environments today offer several opportunities for creating low-cost and more effective training and educational environments.

Combining three-dimensional (3D) simulation environments with haptic interfaces is an important step in advancing current human-computer interaction. Haptic devices alone, however, do not provide a full simulation environment for the interaction, so the environment needs to be enhanced in software. Game engines provide high flexibility for creating 3D simulation environments; Unity3D is one such tool, offering both a game engine and a physics engine for building better 3D simulations. The literature contains many studies that combine these two technologies to create educational and training environments, but few that show how the two can be integrated to build a simulation environment with haptic interfaces. Several issues need to be handled in such an integration. First, the haptic device control libraries need to be integrated into the game engine. Second, the game engine's simulation representation and real-time interaction features need to be coordinated with the haptic device's degrees of freedom and its force-feedback rate and capabilities. In this study, an integration architecture combining the Unity3D game engine and the PHANToM haptic device to create a surgical education simulation environment is presented. The methods used to build this integration and to handle the synchronization problems are described, along with the algorithms developed for better synchronization and user feedback, such as providing a smooth feeling and force feedback during haptic interaction. We believe this study will be helpful for those creating simulation environments with Unity3D technology and PHANToM haptic interfaces.
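The synchronization issue raised in the abstract, a haptic servo loop that must run much faster than the game-engine frame rate, is commonly handled by decoupling the two loops and filtering the force command. The paper's own algorithms are not reproduced here; what follows is only a minimal C++ sketch of that general idea, in which readStylusPosition and sendForceToDevice are hypothetical stand-ins for the device vendor's API and the smoothing factor is an assumed value.

#include <atomic>
#include <chrono>
#include <mutex>
#include <thread>

struct Vec3 { double x = 0, y = 0, z = 0; };

// Shared between the game-engine update (~60 Hz) and the haptic loop (~1 kHz).
struct SharedState {
    std::mutex m;
    Vec3 targetForce;    // force computed by the physics/collision step
    Vec3 stylusPosition; // latest device position, read by the game loop
};

// Hypothetical stand-ins for the vendor haptic API (e.g. the PHANToM driver).
Vec3 readStylusPosition() { return Vec3{}; }
void sendForceToDevice(const Vec3&) {}

void hapticLoop(SharedState& s, std::atomic<bool>& running) {
    Vec3 smoothed;            // force actually sent to the device
    const double alpha = 0.1; // smoothing factor (assumed value)
    while (running) {
        Vec3 pos = readStylusPosition();
        Vec3 target;
        {
            std::lock_guard<std::mutex> lock(s.m);
            s.stylusPosition = pos; // publish position for the game loop
            target = s.targetForce; // pick up the newest commanded force
        }
        // Exponential low-pass filter: the game loop refreshes targetForce only
        // every ~16 ms, so filtering avoids abrupt force steps at the stylus.
        smoothed.x += alpha * (target.x - smoothed.x);
        smoothed.y += alpha * (target.y - smoothed.y);
        smoothed.z += alpha * (target.z - smoothed.z);
        sendForceToDevice(smoothed);
        std::this_thread::sleep_for(std::chrono::milliseconds(1)); // ~1 kHz servo rate
    }
}

int main() {
    SharedState state;
    std::atomic<bool> running{true};
    std::thread servo(hapticLoop, std::ref(state), std::ref(running));

    // Simulated game-engine loop: recompute the contact force each frame.
    for (int frame = 0; frame < 300; ++frame) {
        {
            std::lock_guard<std::mutex> lock(state.m);
            state.targetForce = Vec3{0.0, 1.0, 0.0}; // e.g. result of a collision query
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // ~60 Hz
    }
    running = false;
    servo.join();
}

In a real integration the stubs would be replaced by calls into the haptic device's native library (bridged into Unity3D as a plugin), and the loop rates and smoothing factor tuned to the device; the architecture and algorithms actually used are the ones described in the paper itself.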
dc.identifier.uri http://hdl.handle.net/20.500.11905/1462
dc.language.iso en
dc.publisher International Conference on Human-Computer Interaction
dc.subject computer engineering
dc.subject.other software engineering
dc.title Haptic User Interface Integration for 3D Game Engines
dc.type Article
dspace.entity.type

Files

Original bundle
Name: B4.Sengul_2014.pdf
Size: 37.9 MB
Format: Adobe Portable Document Format