S V G Cobb and P M Sharkey, University of Nottingham/University of Reading, UK
The International Conference on Disability, Virtual Reality and Associated Technologies (ICDVRAT) this year holds its sixth biennial conference and celebrates ten years of research and development in this field. A total of 180 papers were presented at the first five conferences, addressing the consideration, development, exploration and examination of how these technologies can be applied in disabilities research and practice. The research community is broad and multidisciplinary, comprising a variety of scientific and medical researchers, rehabilitation therapists, educators and practitioners. Likewise, the technologies, their applications and target user populations are broad, ranging from inexpensive mobility aids through to fully immersive interactive simulated environments. A common factor is the desire to identify what the technologies have to offer and how they can add value to existing methods of assessment, rehabilitation and support for individuals with disabilities. We review this first decade of research and development in the ICDVRAT community and ask how far we have progressed: are we still discussing potential and promise, or has our technology found its way into practical implementation?
Session Chair: Tony Brooks
Exploration of social rule violation in patients with focal prefrontal neurosurgical lesions, R G Morris, E Pullen, S Kerr, P R Bullock and R P Selway, Institute of Psychiatry, London/University of Nottingham/King’s College Hospital, London, UK
Social rule violation was explored in 22 patients with prefrontal neurosurgical lesions and 22 normal controls. The patients were split into those with neurosurgical lesions impinging on either the orbitofrontal (OF), dorsolateral (DL) or mesial (M) region of the prefrontal cortex. The study used a virtual reality ‘bar’ in which participants walked from the entrance to the bar counter, ordered drinks and returned to the entrance, with the choice of moving between other people (socially inappropriate) or around them (socially appropriate). There was a significant increase in socially inappropriate behaviour in the patients whose lesions were in prefrontal regions other than the dorsolateral prefrontal cortex.
P Lopes-dos-Santos, L M Teixeira, S Silva, M Azeredo and M Barbosa, University of Porto/Portuguese Catholic University, PORTUGAL
This paper focuses on interpersonal dynamics between the child with disabilities and the adult monitoring his/her performance in Aesthetic Resonance Environments. Drawing upon a social constructivist approach, a framework for human interactivity was checked against empirical data obtained from the exploratory implementation of an environment intending to stimulate body awareness and enhance movement in a group of six children with severe neuromotor disabilities. Results showed that the adult assumed the role of a facilitator, mediating interactions between children and the technological system. The provided social mediation increased quality of movement and improved levels of engagement in the observed group of participants.
U Andersson, P Josefsson and L Pareto, University West, SWEDEN
The purpose of this study is to explore the particular challenges faced when designing virtual environments that train social skills in children with autism. Our findings are based on studying autistic behaviour over three years (primary and secondary sources), analysis of related systems and other computer-mediated assistive technology, as well as general game design. From these studies we have identified eight critical design parameters that need to be adjustable in a system suitable for autistic persons. The parameters’ importance, their variation ranges, and the need for their independent adjustment were estimated and verified by experienced expert pedagogues.
C Y Trepagnier, M M Sebrechts, A Finkelmeyer, J Woodford and W Stewart Jr, The Catholic University of America, Washington, DC, USA
Preliminary results are presented of a feasibility study, still in progress, of a virtual social environment designed to stimulate the social attention of pre-school-aged children with Autism Spectrum Disorder (ASD). The system uses eye-tracking and provides gaze-contingent rewards of clips from preferred videos. Of six children reported on here, most find the experimental setting appealing, and the rewards compelling; they voluntarily engage with it across numerous sessions, and demonstrate learning, with large inter-individual differences in rate of progress. Implications are discussed for the pilot study to follow.
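The gaze-contingent reward mechanism described above can be sketched as a dwell-time check: when the child's gaze stays inside a target region long enough, a preferred video clip is triggered as a reward. The function name, parameters and threshold logic below are hypothetical illustrations, not the study's actual implementation.

```python
def gaze_contingent_reward(gaze_samples, target, dwell_ms=500, sample_ms=16):
    """Return the sample indices at which a reward would be triggered.

    gaze_samples: list of (x, y) gaze points, one per sample interval.
    target: (x_min, y_min, x_max, y_max) region of interest.
    Illustrative values: dwell_ms is the required dwell time, sample_ms
    the eye-tracker sampling period (both are assumptions).
    """
    x0, y0, x1, y1 = target
    needed = dwell_ms // sample_ms          # consecutive samples required
    run, rewards = 0, []
    for i, (x, y) in enumerate(gaze_samples):
        if x0 <= x <= x1 and y0 <= y <= y1:
            run += 1
            if run == needed:               # dwell threshold reached
                rewards.append(i)
                run = 0                     # each reward needs a fresh dwell
        else:
            run = 0                         # gaze left the target: reset
    return rewards
```

With a 48 ms dwell threshold and 16 ms samples, seven consecutive in-target samples yield rewards at indices 2 and 5.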
Session Chair: Cecília Sik Lányi
R Shikata, T Kuroda, Y Tabata, Y Manabe and K Chihara, Kyoto University Hospital/Kyoto College of Medical Technology/Nara Institute of Science and Technology, JAPAN
This paper describes a new meeting support system that helps hearing-impaired people to understand the contents of a meeting. The proposed system distinguishes the mainstream of the discussion from other chattering based on the utterances. The situation of the meeting is acquired as a picture using an omni-directional vision sensor, and the system analyzes the speakers’ relations from the captured image using the participants’ face directions. The system shows the mainstream and the chattering of a meeting using the analyzed result and speech recognition.
D Padbury, R J McCrindle and H Wei, University of Reading, UK
This paper describes an intuitive approach for interacting with a computer or computer-driven applications. Interaction is achieved by observing, through a stereo camera set-up, the motion of a user’s hands. This motion is then translated into 3-dimensional (3-D) coordinates to enable interaction with either a traditional 2-dimensional (2-D) desktop or a novel 3-D user interface. The aim of this work is to provide an intuitive method of interacting with computer-based applications for individuals whose condition might restrict their ability to use a standard keyboard/mouse.
A Rathinavelu, H Thiagarajan and S R Savithri, Dr Mahalingam College of Engineering and Technology/National Institute of Technology/All India Institute of Speech and Hearing, INDIA
Lip synchronization is one aspect of computer facial animation. To create a realistic lip sync model, a facial animation system needs extremely smooth lip motion, with the deformation of the lips synchronized with the audio portion of speech. A deformable, parametric model of the lips was developed to achieve the desired effect. Articulatory modeling of the lips alone is insufficient for realistic speech animation, however; the other major articulators, such as the tongue and jaw, must also be considered. The lip sync model was initially developed using a polygonal model and blended key shape techniques, and then parameterized using 36 control points. The data for the model were collected from video images and magnetic resonance imaging (MRI). The articulatory movements of the lip sync model were presented along with virtual reality (VR) objects in an interactive multimedia (IMM) interface, which was used to teach a small vocabulary to hearing-impaired (HI) children; the VR objects were used to reinforce the cognitive process in the children. Bilabial speech sounds were differentiated using appropriate visual cues, and a control panel was developed to present articulatory movements at different speeds. For this study, six hearing-impaired children between the ages of 4 and 7 were selected and trained for 10 hours across 2 weeks on 18 meaningful words. The children’s intelligibility was then tested to assess their performance in articulation and memory retention. The results indicated that 65-75% of the given words were articulated well and 75-85% of the words were identified by all children.
T Kuroda, K Okamoto, T Takemura, K Nagase and H Yoshihara, Kyoto University Hospital/ Kyoto University, JAPAN
This paper proposes a new coordinate system suitable for denoting sign language motion. As the proposed coordinate system consists of polar coordinate systems whose origins are certain points of the human body, postures expressed in the system are proportional for avatars of any possible shape and fit with existing subjective sign notation systems. Coordinate origins were extracted from the Japanese-Japanese Sign Language Dictionary via morphological analysis. The selected 85 points were successfully mapped onto the H-ANIM standard humanoid avatar.
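The idea of body-anchored polar coordinates can be illustrated as follows: a joint position is expressed as a radius and two angles about a body-landmark origin (say, a shoulder), and normalising the radius by limb length makes the same posture description fit avatars of any size. The function name and the limb-length normalisation are assumptions for illustration, not the paper's exact formulation.

```python
import math

def to_body_polar(point, origin, limb_length):
    """Express `point` in spherical coordinates about a body-landmark `origin`.

    The radius is normalised by `limb_length` so that the posture is
    proportional across avatars of different sizes (illustrative choice).
    Returns (r_norm, azimuth, elevation) with angles in radians.
    """
    dx, dy, dz = (p - o for p, o in zip(point, origin))
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    azimuth = math.atan2(dy, dx)                  # angle in the horizontal plane
    elevation = math.asin(dz / r) if r else 0.0   # angle above that plane
    return (r / limb_length, azimuth, elevation)
```

For example, a hand one unit straight ahead of a shoulder, with arm length 2, yields a normalised radius of 0.5 and zero azimuth and elevation.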
Session Chair: Noomi Katz
C A P G van der Mast, Delft University of Technology, THE NETHERLANDS
In this paper the architecture and use of the Delft VRET system are described. In this generic VRET system, special emphasis is given to usability engineering of the user interface for the therapist. Results of controlled experiments with patients are summarized. The system has been in regular use in a few clinics since 2005. New technological and functional challenges of VRET are presented; these challenges will lead to improvements of the system in the future. Agent support for the therapist and tele-VRET are the most promising challenges.
B K Wiederhold and M D Wiederhold, Interactive Media Institute/Virtual Reality Medical Center, San Diego, CA, USA
War veterans with neuromusculoskeletal injuries often require significant treatment and rehabilitation, straining health care resources. In a study funded by the Office of Naval Research (ONR), the Virtual Reality Medical Center (VRMC) is applying virtual reality therapy to injured military personnel at the Naval Medical Center San Diego (NMCSD). The goal of this study is to investigate whether augmenting traditional rehabilitation with VR (in this case, off-the-shelf interactive video games) will enable a more rapid and complete rehabilitation. Because VR is interactive, and encourages patients to use their entire body to reach goals in the game, it is conceivable that it will make rehabilitation sessions more comfortable and entertaining. Participants consist of 20 veterans with upper arm and shoulder injuries (rotator cuff tear, shoulder impingement, bursitis) or amputation. The participants are divided into two groups (n=10): an experimental group, which receives traditional rehabilitation augmented by virtual reality therapy, and a control group, which undergoes traditional rehabilitation. Participants will complete ten treatment sessions in their respective condition. Though the study has not been completed, preliminary results based on subjective questionnaires and functional capacity indicate that the experimental condition may elicit increased heart rate and respiration. Participants in this group also seem to enjoy the music and interaction made possible through VR. These results suggest that VR may enhance the rehabilitation process, creating a more effective form of treatment. Long-term benefits of this form of treatment may include improved treatment time and reduced dropout rates, thereby reducing the costs of rehabilitation.
Z Geiszt, M E Kamson, C Sik Lányi and J A Stark, University of Pannonia, HUNGARY/Austrian Academy of Sciences, AUSTRIA
Virtual reality (VR) offers a wide range of applications in the field of cognitive neuropsychology, both in diagnosing cognitive deficits and in treating them. An optimal diagnostic method is the on-field test, which provides an opportunity to apply VR-based simulations. VR is also a useful tool for skill-building and training by setting up a virtual setting which imitates the real environment, including the attributes to be trained. Moreover, it provides a graded approach to problem-solving and a feeling of safety, and it excludes the negative elements which are detrimental to the learning process. To further extend the effectiveness of VR applications it is necessary to refine VR environments and adapt them to the specific needs of selected target groups and to the real-time control of virtual events. Following the principle of flexibility, we prepared two virtual environments: 1) an Adjustable Virtual Classroom (AVC) for the treatment of fear of public speaking in a primary school task-solving setting, and 2) a Virtual Therapy Room (VTR) designed for use with aphasic clients. Due to their flexible nature, a large number of elements can be customised in both of these settings, including spatial organisation, textures, audio materials and also the tasks to be solved. The real-time control over the virtual avatars by the supervisor, i.e. the therapist, to guide the social interactions in the virtual world also allows him/her to follow up on the user’s reactions and therapy performance. By focusing on the details of the therapy room, we would like to demonstrate the relevance of the flexibility of the software in the development of innovative therapy solutions for aphasic clients.
H Grillon, F Riquier, B Herbelin and D Thalmann, EPFL, Lausanne, SWITZERLAND/Aalborg University Esbjerg, DENMARK
We present a study whose aim is to evaluate the efficiency and flexibility of virtual reality as a therapeutic tool within a social phobia behavioural therapy program. Our research protocol, accepted by the ethical commission of the cantonal hospices’ psychiatry service, is identical in content and structure for each patient. This study’s second goal is to use virtual exposure to objectively evaluate a specific parameter present in social phobia, namely eye contact avoidance, by using an eye-tracking system. Analysis of our results shows a tendency toward improvement in both the questionnaires and eye contact avoidance.
A A Rizzo, K Graap, J Pair, G Reger, A Treskunov and T Parsons, University of Southern California/ Virtually Better, Inc., Decatur, Georgia/U.S. Army, Fort Lewis, Tacoma, Washington, USA
Post Traumatic Stress Disorder (PTSD) is reported to be caused by traumatic events that are outside the range of usual human experience, including (but not limited to) military combat, violent personal assault, being kidnapped or taken hostage, and terrorist attacks. Initial data suggest that at least 1 in 6 Iraq War veterans exhibit symptoms of depression, anxiety and PTSD. Virtual Reality (VR) delivered exposure therapy for PTSD has been used with reports of positive outcomes. The aim of the current paper is to present the rationale, technical specifications, application features and user-centered design process for the development of a Virtual Iraq PTSD VR therapy application. The VR treatment environment is being created via the recycling of virtual graphic assets initially built for the U.S. Army-funded combat tactical simulation scenario and commercially successful X-Box game, Full Spectrum Warrior, in addition to other available and newly created assets. Thus far we have created a series of customizable virtual scenarios designed to represent relevant contexts for exposure therapy to be conducted in VR, including a city and a desert road convoy environment. User-centered design feedback needed to iteratively evolve the system was gathered from returning Iraq War veterans in the USA and from a system in Iraq tested by an Army Combat Stress Control Team. Clinical trials are currently underway at Camp Pendleton and at the San Diego Naval Medical Center. Other sites are preparing to use the application for a variety of PTSD and VR research purposes.
Session Chair: Bruno Herbelin
P J Standen, D J Brown, N Anderton and S Battersby, University of Nottingham/Nottingham Trent University, UK
As part of the process to design a device that would enable users with intellectual disabilities to navigate through virtual environments, an earlier study had collected baseline data against which to evaluate prototype design solutions. This study describes the evaluation of three design solutions: two modifications to a standard games joystick and a two-handed device. Evaluation data were collected while 22 people with intellectual disabilities worked through four VEs designed using a games format. None of the prototypes gave significantly improved performance over the standard joystick, and some actually led to the user receiving more help from the tutor to use the device; this difference was significant for the two-handed device in all four games. However, there was considerable variation in results, such that with some devices there was a reduction in the variability of scores between individuals. Future research needs to focus on the design of environments and on how best to match the user with the device.
Y Mizukami and H Sawada, Kagawa University, JAPAN
This paper introduces the development of a tactile device using a shape-memory alloy, and describes information transmission through higher psychological percepts such as the phantom sensation and apparent movement in the tactile modality. The authors exploited the characteristic of a shape-memory alloy formed into a thread, which changes its length according to its body temperature, and developed a vibration-generating actuator, electrically driven by periodic signals from current control circuits, for tactile information transmission. The actuator is quite compact and consumes only 20 mW. By coupling the actuators in pairs, an information transmission system was constructed for presenting the apparent movement of tactility, transmitting a quite novel sensation to the user. Based on a preliminary experiment, the parameters for tactile information transmission were examined. Information transmission by the device was then tested with 10 subjects and evaluated by questionnaires. The apparent movement was especially well perceived by users, according to the well-determined signals given to the actuators, as the sensation of a small object running on the skin surface or of being tapped by something. Several users reported perceiving a novel rubbing sensation given by the apparent movement, and we further investigated the presentation of this sensation in detail for use in a sensory-aid tactile display for disabled and elderly people.
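Apparent tactile movement of this kind is classically produced by driving successive actuators with a fixed stimulus onset asynchrony (SOA), so that overlapping pulses feel like a single stimulus travelling across the skin. A minimal sketch of such an onset schedule follows; the parameter values are illustrative, not the ones used in the paper.

```python
def apparent_movement_schedule(n_actuators, pulse_ms, soa_ms):
    """Onset/offset times (in ms) for a row of actuators.

    Each actuator i starts soa_ms after its predecessor and vibrates
    for pulse_ms; when soa_ms < pulse_ms the pulses overlap, which is
    the condition under which apparent movement is typically felt.
    """
    return [(i * soa_ms, i * soa_ms + pulse_ms) for i in range(n_actuators)]
```

For two actuators with 100 ms pulses and a 60 ms SOA, the schedule is [(0, 100), (60, 160)]: the second pulse begins while the first is still running.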
P Langdon, S Godsill and P J Clarkson, University of Cambridge, UK
We report the application of new statistical state-space filtering techniques to cursor movement data collected from motion-impaired computer users performing a standard Fitts’s-law-style selection task. Developed as an alternative to expensive haptic feedback assistance, the aim was to assess the feasibility of the basic techniques in resolving the user’s intended trajectory from the extremely variable and wavering data that result from the effects of muscular spasm, weakness and tremor. The results, using a choice of basic parameters for the filters, show that the state-space filtering techniques are well suited to estimating the intended trajectory of the cursor, even under conditions of extreme deviation from the direct track, and that these filters effectively act as an extreme cursor-smoothing system. We conclude that further development of the approach may lead to more effective adaptive systems capable of providing smoothed feedback to the user and estimates of the intended destination. A similar approach might further be applied to situationally induced movement perturbations.
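As a rough illustration of state-space filtering applied to wavering cursor data, the sketch below runs a scalar constant-position Kalman filter over one axis of cursor measurements. This is a simplified stand-in for the paper's filters, and the parameter values are assumptions: q models how fast the intended cursor position really changes, r the tremor/spasm measurement noise, and a large r relative to q yields heavy smoothing.

```python
def kalman_smooth(measurements, q=1e-3, r=1.0):
    """Scalar constant-position Kalman filter, applied to one cursor axis.

    measurements: noisy cursor positions sampled over time.
    Returns the filtered position estimates, one per measurement.
    """
    x, p = measurements[0], 1.0          # state estimate and its variance
    out = [x]
    for z in measurements[1:]:
        p += q                           # predict: uncertainty grows over time
        k = p / (p + r)                  # Kalman gain (trust in the measurement)
        x += k * (z - x)                 # update the estimate toward z
        p *= (1 - k)                     # update shrinks the uncertainty
        out.append(x)
    return out
```

A steady cursor passes through unchanged, while an alternating (tremor-like) input is pulled toward a middle track rather than following each swing.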
K Kuzume and T Morimoto, Yuge National College of Technology, JAPAN
This paper presents the realization of a hands-free man-machine interface using tooth-touch sound. The proposed device has several advantages, including low price, ease of handling, and reliability. It may be used as an Environmental Control System (ECS) and communication aid for disabled persons. We analyzed the characteristics of the tooth-touch sound obtained using a bone conduction microphone. We then designed the device using VHDL (a hardware description language) and simulated the FPGA (Field Programmable Gate Array) implementation in practice. We applied our device to the ECS to demonstrate its usefulness and evaluate its performance. The results confirmed that the proposed device has features superior to those of comparable devices for severely disabled persons, such as those utilizing voice control or eye blinks, chin-operated control sticks, mouth sticks, or a brain-computer interface (BCI).
L N S Andreasen Struijk, Aalborg University, DENMARK
This work describes a new inductive tongue-computer interface for environmental control by disabled people. The new method demands little effort from the user, provides the basis for an invisible man-machine interface, and has the potential to facilitate a large number of commands. The inductive tongue-computer interface, implemented with 9 sensors, was tested in three healthy subjects, and the results show typing rates of 30 to 57 characters per minute after 3 hours of training.
Session Chair: Charles van der Mast
E A Lallart, S C Machefaux and R Jouvent, Hôpital de la Salpêtrière, Paris, FRANCE
New interactive technologies offer the opportunity to involve the user's body in a virtual environment while seeing herself/himself performing the actions. Interactive exercises with a video-capture reinforce the perception-action loop, which is the pillar of agency (i.e. the ability to attribute the intention of an action to its proper author). We present a new paradigm as a possible treatment of agency disturbances in schizophrenia.
C D Murray, E Patchick, S Pettifer, T Howard and F Caillette, University of Manchester, UK
This paper describes a pilot clinical study to evaluate the efficacy of using immersive virtual reality (IVR) as a rehabilitative technology for phantom limb pain experienced by amputees. This work builds upon prior research which has used simple devices such as the mirror box (where the amputee sees a mirror image of their remaining anatomical limb in the phenomenal space of their amputated limb) to induce vivid sensations of movement originating from the muscles and joints of their phantom limb and to relieve pain. The IVR system transposes movements of amputees’ anatomical limbs into movements of a virtual limb which is presented in the phenomenal space of their phantom limb. The primary focus here is on a qualitative analysis of interview data with each participant throughout the study. We argue that the findings of this work make a case for proof of principle for this approach for phantom pain treatment.
J Lloyd, T E Powell, J Smith and N V Persaud, Birmingham University, UK
Route learning difficulties are a common consequence of acquired brain injury, and virtual environments provide a novel tool for researching this area. A pilot study demonstrated the ecological validity of a non-immersive virtual town, showing performance therein to correlate well with real-world route learning performance. The first patient study found that a rehabilitation strategy known as ‘errorless learning’ is more effective than traditional ‘trial-and-error’ methods for route learning tasks. The second patient study, currently in progress, will assess whether naturalistic route learning strategies of map and landmark use can be combined effectively with errorless techniques. A final study will investigate the relationships between route learning performance and scores on a select battery of neuropsychometric tests.
J H Sánchez and M A Sáenz, University of Chile, Santiago, CHILE
We introduce AudioMetro, application software for blind users that represents a subway system on a desktop computer to assist mobility and orientation in a subway network. A user can organize and prepare a trip using the software before riding the subway. Conclusions of the usability study revealed the critical importance of key interface elements such as an audio-based hierarchical menu, travel simulation, and information about the subway network, stations and their surroundings. Cognitive study results show an advance in the development of the mobility skills needed for using the subway system, which represents a contribution to a more integral development of blind users and a step towards social integration and inclusion.
Session Chair: Pat Langdon
A Al-khalifah, R J McCrindle, P M Sharkey and V A Alexandrov, University of Reading, UK
In this paper we present a number of the immersive VR applications that we have developed during the past 18 months as a means of practically demonstrating the modelling approaches previously reported. The paper discusses the usefulness of the different approaches in assisting medical practitioners to diagnose and track conditions which might lead to impairment or disability, and how they can be used to train medical students to recognise such conditions or to undertake associated medical procedures. Initial findings of a survey undertaken with medical practitioners on the effectiveness of VR, and in particular immersive models, as diagnostic and training aids are also presented.
P E Waddingham, S V Cobb, R M Eastgate and R M Gregson, University of Nottingham, UK
Amblyopia, or ‘lazy eye’, is currently treated by wearing an adhesive patch over the non-amblyopic eye for several hours per day, over a period of many months. Non-compliance with patch wearing is a significant problem. Our multidisciplinary team of clinicians and technologists investigated the application of VR technology in a novel way. We devised a binocular treatment system in which children watch a video clip of a cartoon on a virtual TV screen, followed by playing an interactive computer game, to improve their vision. So far the system has been used to treat 39 children, of whom 87% have shown some improvement in vision. Vision improvement tended to occur within the first 3-4 treatment sessions. This paper describes the research development of the I-BiT™ system. We present a summary of results from clinical case studies conducted to date and discuss the implications of these findings for the future treatment of amblyopia.
R Kizony, P L Weiss, M Shahar and D Rand, University of Haifa, ISRAEL
The limitations of existing virtual reality (VR) systems in terms of their use for home-based VR therapy led us to develop ‘TheraGame’, a novel video capture VR system. TheraGame operates on a standard PC with a simple webcam. The software is programmed using a Java-based visual interaction system, which enables quick and easy definition of virtual objects and their behavior. The user sits in front of the monitor, sees himself, and uses his movements to interact with the virtual objects. The objective of this paper is to present the system, a number of the current applications, and some initial pilot usage results. Results from a study of 12 healthy elderly subjects showed moderate to high levels of enjoyment and usability. These scores were also high as reported by 4 participants with neurological deficits. Some limitations in system functionality were reported by one person with stroke who used TheraGame at home for a period of 2.5 weeks. Overall, TheraGame appears to have considerable potential for home-based rehabilitation.
H Zheng, R Davies, H Zhou, J Hammerton, S J Mawson, P M Ware, N D Black, C Eccleston, H Hu, T Stone, G A Mountain and N D Harris, University of Ulster/University of Essex/Sheffield Hallam University/University of Bath, UK
The SMART project, entitled ‘SMART rehabilitation: technological applications for use in the home with stroke patients’, is funded under the EQUAL (extend quality of life) initiative of the UK Engineering and Physical Sciences Research Council (EPSRC). The project aims to examine the scope, effectiveness and appropriateness of systems to support home-based rehabilitation for older people and their carers. In this paper, we describe the design and development of a low-cost home-based rehabilitation system. Through the project we have involved end users in the design process and this model can be applied to the design of other healthcare related systems.
Session Chair: Ali Al-khalifah
S A Wall and S A Brewster, University of Glasgow, UK
Access to simple visualisations such as bar charts, line graphs and pie charts is currently very limited for the visually impaired and blind community. Tangible representations such as heat-raised paper, and inserting pins in a cork-board, are common methods of allowing visually impaired pupils to browse and construct visualisations at school, but these representations can become impractical for access to complex, dynamic data, and often require a sighted person’s assistance to format the representation, leading to a lack of privacy and independence. A system is described that uses an actuated pin-array to provide continuous tactile feedback, allowing a visually impaired person to explore bar charts using a graphics tablet and stylus. A study was conducted to investigate the relative contribution of multimodal feedback (tactile, speech, non-speech audio) during typical graph browsing tasks. Qualitative feedback showed that the participants found it difficult to attend to multiple sources of information and often neglected the tactile feedback, while the speech feedback was the most popular and could be employed as a continuous feedback mechanism to support graph browsing.
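One step a pin-array display of this sort must perform is quantising continuous bar-chart values into the array's limited set of discrete pin heights. The mapping below is a hypothetical sketch of that discretisation, not the actual system's rendering pipeline.

```python
def bars_to_pin_heights(values, n_levels=8, v_max=None):
    """Quantise bar-chart values into discrete pin heights 0..n_levels-1.

    values: the bar heights in data units.
    n_levels: how many distinct heights the pin array can raise
              (8 is an illustrative assumption).
    v_max: full-scale value; defaults to the tallest bar.
    """
    v_max = v_max if v_max is not None else max(values)
    # Scale each bar to the pin range and round to the nearest level.
    return [round(v / v_max * (n_levels - 1)) for v in values]
```

For bars of 0, 50 and 100 on an 8-level array, the pins would rise to levels 0, 4 and 7 respectively.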
C C Tan, W Yu and G McAllister, Queen’s University Belfast, UK
This paper gives an overview of the current status of Internet accessibility and offers a brief review of the existing technologies that address accessibility problems faced by visually impaired people. It then describes an adaptive architecture which is able to integrate diverse assistive technologies so as to allow visually impaired people to access various types of graphical web content. This system is also capable of adapting to the user’s profile and preferences in order to provide the most appropriate interface to the user.
J Kildal and S A Brewster, University of Glasgow, UK
TableVis was developed to support computer users who are blind or visually impaired in tasks that involve obtaining quick overviews of tabular data sets. Previous work has covered the evaluation of this interface, its associated techniques of interactive data sonification, and its support for exploratory processes. This paper examines the exploratory strategies and procedures employed by the users. A three-stage process for completing the exploratory task is described, and a discussion of the strategies and procedures that were observed is offered. Possible best practices and the most common issues are identified, which form the basis for the next steps in this line of research.
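Interactive data sonification of the kind TableVis relies on can be illustrated by a simple value-to-pitch mapping: a table cell's value is scaled onto a frequency range so that larger values sound higher. The linear two-octave mapping below is an assumption for illustration, not the actual TableVis design.

```python
def sonify_value(v, v_min, v_max, f_low=220.0, f_high=880.0):
    """Map a table-cell value onto a tone frequency in Hz.

    v_min/v_max give the data range; f_low/f_high (here two octaves,
    A3 to A5 -- an illustrative choice) give the pitch range.
    """
    t = (v - v_min) / (v_max - v_min)      # normalise the value to [0, 1]
    return f_low + t * (f_high - f_low)    # linear interpolation in frequency
```

So for data ranging 0-10, the extremes map to 220 Hz and 880 Hz, and the midpoint value 5 maps to 550 Hz.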
L Pareto and U Lundh Snis, University West, Uddevalla, SWEDEN
We present ongoing work on the design of an information system for users with reading disabilities and users with reduced vision. The design target is a portable, auditory, location-aware information system to complement visually displayed information in exhibitions. Applying a user-centred approach, we identify these non-typical user groups’ specific requirements, which are turned into a design. The first design iteration, which includes a formative evaluation using a mock-up prototype with dyslexic and visually impaired participants, is complete. The evaluation indicates that the user-group-specific aspects we have identified are relevant when designing for these groups.
M Simonnet, J-Y Guinard and J Tisseau, European Center for Virtual Reality, École Nationale D’Ingénieurs de Brest, FRANCE
This study aims at the design of haptic and vocal navigation software that permits blind sailors to create and simulate ship itineraries. This raises questions about the haptic strategies used by blind people to build their spatial representations when using maps. According to current theories, people without vision are able to construct cognitive maps of their environment, but the lack of sight tends to lead them to build egocentric and sequential mental pictures of space. Nevertheless, exocentric and unified representations are more efficient (Piaget et al, 1948). Can blind people be helped to construct more effective spatial pictures? Previous work has shown that strategies are the most important factors in spatial performance in large-scale space (Tellevik, 1992; Hill et al, 1993; Thinus-Blanc et al, 1997). In order to encode space in an efficient way, we had our subject use the cardinal-points reference in small-scale space. During our case study, a compass establishes a frame of external cues. In this respect, we support the assumption that training based on a systematic exocentric reference helps blind subjects to build a unified space. At the same time, this training led the blind sailor to change his haptic strategies when exploring tactile maps, and to perform better; this seems to modify his processing of spatial representation. Eventually, we would like to study the transfer between map representation and mobility in the environment. Our final point concerns using a strategy based on cardinal points, together with haptic virtual reality technologies, to help blind people improve their spatial cognition.
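The exocentric (cardinal-point) encoding that a compass affords can be illustrated as a frame change: an egocentric forward/right offset is rotated by the compass heading into north/east coordinates, anchoring the mental map to external cues instead of the body. The function below is a hypothetical sketch of that transformation, not part of the study's software.

```python
import math

def ego_to_exo(forward, right, heading_deg):
    """Convert an egocentric offset into exocentric (north, east) coordinates.

    heading_deg: compass heading of the user's forward axis
                 (0 = facing north, 90 = facing east).
    """
    h = math.radians(heading_deg)
    # Standard 2-D rotation of the body frame into the north/east frame.
    north = forward * math.cos(h) - right * math.sin(h)
    east = forward * math.sin(h) + right * math.cos(h)
    return north, east
```

Facing east (heading 90°), one step forward is one step east; facing north, a step to the right is also a step east.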
Session Chair: Eva Petersson
A Bar-Haim Erez, R Kizony, M Shahar and N Katz, Hebrew University & Hadassah, Jerusalem/University of Haifa, ISRAEL
The aim of this paper is twofold: 1) to introduce a computerized platform, the Visual Spatial Search Task (VISSTA), its current package and its potential for a variety of additional programs, and 2) to present results of the basic package with stroke patients and healthy controls. Method: participants included 39 healthy individuals; 25 patients post right hemisphere damage (RHD) with unilateral spatial neglect (USN); 27 patients post RHD without USN; and 20 patients post left hemisphere damage (LHD). All participants were tested on the computerized VISSTA and on paper-and-pencil cancellation tests. The stroke patients were also tested on the ADL checklist and the FIM. Results: findings indicate that the VISSTA is a valid visual search assessment that significantly differentiated between patients following stroke and healthy controls, and between different stroke patient groups. USN patients showed impairment in both visual search conditions and a clear laterality bias when the target was presented on the left side of the screen; this held for both success rate and reaction time. RHD patients without USN performed better than those with USN, but still showed impairment, relative to healthy individuals, in the attentional properties of visual search and in the detection of targets on both left and right. Conclusions: the VISSTA tool was found to be sensitive to levels of visual spatial attention in terms of accuracy and reaction time. The results suggest that conventional paper-and-pencil tests and behavioural measures should be supplemented with tools that provide both accuracy and RT parameters in a randomized and more complex fashion. The VISSTA is also suitable for treatment, as it provides a flexible platform.
C Sik Lányi, R Mátrai and I Tarjányi, University of Pannonia, HUNGARY
In today’s information society, computer users frequently need to search for information on home pages and to select among software functions. A well-designed interface is essential if both the average user and users with special needs are to find everything they require. Our project set out to discover where, and with how much contrast, objects should be placed on the screen so that they can be found easily. We examine what characteristic search routes can be observed, and whether there are differences between average users and users with intellectual disabilities in navigation and everyday search exercises.
S Jacoby, N Josman, D Jacoby, M Koike, Y Itoh, N Kawai, Y Kitamura, E Sharlin and P L Weiss, University of Haifa, ISRAEL/Osaka University, JAPAN/University of Calgary, CANADA
Tangible User Interfaces (TUIs) are a subset of human-computer interfaces that try to capture more of the user's innate ability to handle physical objects in the real world. The TUI known as ActiveCube is a set of graspable plastic cubes which allow the user to physically attach or detach cubes by connecting or disconnecting their faces. Each cube is essentially a small computer which powers up and communicates with its neighbours upon connection to a neighbouring cube. When users assemble a physical shape with the system they also create a network topology, which allows ActiveCube to digitize and track the exact 3D geometry of the physical structure formed. From the user’s perspective, ActiveCube is a very powerful tool: the 3D shape being built physically is tracked in the virtual domain in real time. ActiveCube’s use as a concrete, ecologically valid tool for understanding the dynamic functional processes underlying constructional ability, in either typically developed children or children with neurological pathology, has not yet been explored. The objective of this paper is to describe the ActiveCube interface designed for assessing and treating children with Developmental Coordination Disorder (DCD). In our pilot study, six male children aged 6 to 7 years were tested, three with DCD and three typically developed. The children’s task was to successively use the ActiveCubes to construct 3D structures in a “matching” strategy known as “Perspective Matching”. The usability results showed that all the participating children enjoyed the tasks, were motivated and maintained a high level of alertness while using the ActiveCubes. More than 80% of them found the tasks to be easy or moderate. “Similarity” data from single subjects have been used to show differences in constructional ability between children with DCD and those who are typically developed. This automated ActiveCube three-dimensional (3D) constructional paradigm shows promise for the assessment and treatment of children with DCD.
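Since ActiveCube digitizes the exact 3D geometry of the assembled structure, a "similarity" score between a child's construction and a target structure could in principle be computed as an overlap of occupied grid cells. The following is a hypothetical sketch; the paper does not specify its actual similarity metric:

```python
def structure_similarity(built, target):
    """Jaccard overlap between two cube structures, each given as a
    collection of (x, y, z) grid cells occupied by a cube.
    Returns 1.0 for identical structures, 0.0 for disjoint ones."""
    built, target = set(built), set(target)
    if not built and not target:
        return 1.0  # two empty structures count as identical
    return len(built & target) / len(built | target)
```

A score like this rewards cubes placed in the correct cells and penalizes both missing and extra cubes, which is the kind of accuracy measure a constructional-ability assessment would need.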
P J Standen, R Karsandas, N Anderton, S Battersby and D J Brown, University of Nottingham/Nottingham Trent University, UK
The inability of people with intellectual disabilities to make choices may result from their lack of opportunities to practise this skill. Interactive software may provide these opportunities, and software that requires a timed response may reduce choice reaction time. To test this, 16 people with severe intellectual disabilities were randomly allocated to either an intervention or a control group. The intervention group spent eight sessions playing a switch-controlled computer game that required a timed response, while the control group spent the same amount of time playing a computer-based matching game that did not require a timed response. Both groups repeated a test of choice reaction time (CRT) that they had completed prior to the intervention. The intervention group made more accurate switch presses with repeated sessions while receiving less help from the tutor who sat alongside them. The intervention group also showed a significant reduction in their CRT from baseline, while the control group did not.
Session Chair: Sue Cobb
E R Miranda, University of Plymouth, UK
This paper introduces a brain-computer interface (BCI) system that uses electroencephalogram (EEG) information to steer generative rules in order to compose and perform music. It begins by surveying previous attempts at the design of BCI systems, including systems for music, and then gives a short technical introduction to EEG sensing and analysis. Next, it introduces the generative music component of the system, which employs an Artificial Intelligence technique for the computer replication of musical styles. The system constantly monitors the subject’s EEG and activates generative rules associated with the activity in different frequency bands of the EEG spectrum. It also measures the complexity of the EEG signal in order to modulate the tempo (beat) and dynamics (loudness) of the performance.
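The core mapping described in the abstract, in which activity in different EEG frequency bands activates different generative rules, might be sketched as follows. The band boundaries, rule names and the FFT-based power estimate are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def band_power(eeg, fs, lo, hi):
    """Mean spectral power of a 1-D EEG sample array in the [lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].mean()

def choose_rule(eeg, fs):
    """Activate the generative rule tied to the currently dominant band."""
    bands = {"theta_rule": (4, 8),    # hypothetical band/rule pairings
             "alpha_rule": (8, 12),
             "beta_rule": (12, 30)}
    powers = {name: band_power(eeg, fs, lo, hi)
              for name, (lo, hi) in bands.items()}
    return max(powers, key=powers.get)
```

In a running system a loop of this kind would be applied to each successive EEG window, with the returned rule name selecting which style-replication rule generates the next musical material.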
H Sugarman, E Dayan, A Lauden, A Weisel-Eichler and J Tiran, Ono Academic College/Hadassah College Jerusalem/Ben-Gurion University of the Negev, ISRAEL
We are developing a low-cost robotic system, the Jerusalem Telerehabilitation System, using a force feedback joystick and a standard broadband internet connection. In this study, the system was found to be user-friendly by therapists, by older and younger normal subjects, and by a post-stroke subject. Kinematic analysis of the joystick movements showed differences between the older and younger normal subjects, and between the post-stroke subject and the older normal subjects. These preliminary data indicate that our low-cost and straightforward system has the potential to provide useful kinematic information to the therapist in the clinic, thereby improving patient care.
E Lövquist and U Dreifaldt, University of Limerick, IRELAND
In this paper we present an application based on an immersive workbench and a haptic device, designed to motivate stroke patients in the rehabilitation of their arm. The work presented here is the result of a six-month project based on evaluation with stroke patients and on informal interviews with medical doctors, physiotherapists and occupational therapists. Our application, “The Labyrinth”, has been used in studies with stroke patients, and we have seen that arm rehabilitation using virtual environments and haptics can be very encouraging and motivating; these factors are crucial to improving the rehabilitation process.
I Sander, D J Roberts, C Smith, O Otto and R Wolff, University of Salford, UK
Immersive virtual reality is gaining acceptance as a tool for rehabilitation intervention, as it places a person within a safe and easily configurable synthetic environment and allows them to explore and interact with it through natural movement. Although these characteristics are ideal for the rehabilitation of motor disorders, acceptance is hampered by insufficient knowledge of the effect of the method of immersion on the naturalness of human movement. The purpose of this study was to explore the usefulness of different types of virtual environment in the rehabilitation of upper limb function and balance in stroke patients. It begins to address the question above by comparing the impact of two typical methods, the Head Mounted Display (HMD) and immersive projection technology (IPT), on the naturalness of reach and balance activities. The former places the simulated image in front of the eyes, whereas the latter projects it around the user, who perceives a holographic effect when wearing stereo glasses. The novel approach of placing the HMD inside the IPT allowed subjects perceiving the environment through either display to be observed moving within the IPT holograph. Combined with sharing the same tracking and camera systems, this provided a direct comparison of tracking measurements, interaction behaviour, video and other observational data. The experiment studied subjects moving objects around a simulated living-room setting, initially on a level surface and then while varying the height and shape of the walking surface through raised planks. Performance in the synthetic environment, using both display types, was compared to that in a physical mock-up of the living room. The experimental results demonstrate similar balance and reach movements in the physical mock-up and the IPT, whereas a striking reduction in naturalness in both activities was observed for HMD users. This suggests that an inappropriate choice of method has the potential to teach unnatural motor skills if used in rehabilitation. Reasons for the difference are discussed, along with possible remedies and considerations for practical applications within a clinical setting.
ICDVRAT Archive | Email | © 2006 Copyright ICDVRAT |