3.0 Introduction
VR is being applied to a wide range of medical areas, including remote and local surgery,
surgery planning, medical education and training, treatment of phobias and other causes of
psychological distress, skill training, and pain reduction. It is also used for the visualisation of
large-scale medical records, and in the architectural planning of medical facilities, although these
last two applications are not covered by this survey.
The survey focuses on three main application areas: surgery in general, neurosurgery, and mental
and physical health and rehabilitation. See also Section 4: Sources and Resources in Medical VR.
3.1 VR for Surgery
Surgery is predominantly a visual and manual activity. VR for surgery applies interactive,
immersive computer technologies to help perform, plan and simulate surgical procedures. During
performance, VR guides the surgeon, sometimes with a robot executing the procedure under the
surgeon's control (to remove hand tremor and scale down manipulations for keyhole surgery, for
example); in essence, VR gives the surgeon interactive 3D views of areas within the patient.
Planning is carried out preoperatively, to find the approach to surgery that involves minimum
damage. Simulation is used mostly in training, with patient data often registered with anatomical
information from an atlas. It may be used for routine training, or to focus on particularly
difficult cases and new surgical techniques.
VR is being applied in all three major areas of surgery: open surgery, endoscopic surgery and
radiosurgery. The surgery may be remote (through the use of robotics) or local.
In open surgery, the surgeon opens the body and uses hands and instruments to operate. This is
the most invasive form of surgery, with long recovery times. There is a strong movement away
from open surgery and towards improved techniques of minimally-invasive surgery.
Open surgery
Endoscopy is minimally invasive surgery through natural body openings or small artificial
incisions ('keyhole surgery'): laparoscopy, thoracoscopy, arthroscopy, and so on. A small
endoscopic camera is used in combination with several long, thin, rigid instruments. The trend is
to carry out as much surgery as is feasible by this means, to minimise the risk to patients.
Endoscopic surgery: the current situation without VR
Advantages for the patient include less pain, less strain on the body, smaller wounds, faster
recovery, and an economic gain arising from a shorter period of illness.
However, for the surgeon, there are several disadvantages, including restricted vision and
mobility, awkward handling of the instruments, difficult hand-eye coordination, and no tactile
perception beyond the force cues transmitted through the instruments.
Endoscopic surgery is becoming increasingly popular, because of its significant advantages. It is
also the most popular surgical application of VR, partly because it expands on what is already an
"unnatural" view of the locus of operation. Another reason is that endoscopic surgery is
relatively easy to simulate because of the limited access, restricted feedback (especially tactile)
and limited freedom of movement of instruments. Endoscopic simulators are being produced by
all the main medical VR companies, usually with a focus on training.
Another recent trend is towards so-called Virtual Endoscopy. This is a technique whereby data
from non-invasive sources - such as scans - are combined into a virtual data model that can be
explored by the surgeon as if an endoscope were inserted in the patient. VR is increasingly being
used to provide surgeons with a meaningful and interactive 3D view of areas and structures they
would otherwise be unable or unwilling to deal with directly.
In radiosurgery, X-ray beams from a linear accelerator are finely collimated and accurately
aimed at a lesion. Popular products include the Radionics X-knife and Elekta's Gammaknife.
Planning radiosurgery is suitable for VR, since it involves detailed understanding of 3D
structure.
Elekta's Gammaknife (left) and the X-knife from Radionics (right)
VR in surgery differs from most other VR in its focus on contact with objects, which are often
deformable and interdependent. The focus is on looking into objects rather than out into open
space - there is less room available. The data is essentially volumetric, and finger and hand
interaction must be extremely precise.
These characteristics bring with them certain technical requirements, such as real-time
response to the user's actions, which implies fast graphics and low-latency input devices. The
images must be of high resolution and faithful to the actual patient data, since life-critical
decisions are based on the presentation of that data. For simulators, the physical procedures
must match those used in the actual operation.
Other requirements of VR for surgery include registration of patient data with atlases and the
ability to coregister multimodal data. For use over extended periods, which is often needed in
surgery, the style of user interaction should be natural, comfortable, and easy to use.
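The registration requirement can be made concrete with a small sketch. The following Python/NumPy fragment is purely illustrative and is not taken from any system described in this survey: it recovers the rigid transform that best maps fiducial points marked in a preoperative scan onto the same points digitised on the patient, using the standard SVD-based least-squares solution. Real systems add distortion correction, surface- or intensity-based matching, multimodal coregistration and extensive validation.

    import numpy as np

    def rigid_registration(scan_pts, patient_pts):
        """Least-squares rigid transform (R, t) mapping scan coordinates onto
        patient coordinates, from paired fiducial points (N x 3 arrays)."""
        scan_c, pat_c = scan_pts.mean(axis=0), patient_pts.mean(axis=0)
        H = (scan_pts - scan_c).T @ (patient_pts - pat_c)   # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
        R = Vt.T @ D @ U.T
        t = pat_c - R @ scan_c
        return R, t

    # Four fiducials as seen in the scan, and the same points digitised on the
    # patient (here generated from a known rotation and translation)
    scan = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], dtype=float)
    true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
    patient = scan @ true_R.T + np.array([5.0, -2.0, 30.0])

    R, t = rigid_registration(scan, patient)
    fre = np.linalg.norm(scan @ R.T + t - patient, axis=1).mean()
    print("mean fiducial registration error (mm): %.6f" % fre)

The residual (the mean fiducial registration error) is the kind of figure a navigation system would typically report to the surgeon as an accuracy estimate.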
Areas where VR is being applied:
Image-guided surgery: guiding surgeons to targets during actual operations.
Training simulators: practising difficult procedures.
Preoperative planning: studying patient data before surgery.
Telesurgery: operating remotely, or assisting other surgeons remotely.
3.2 Image-guided surgery
VR can in principle be applied to enhance reality in image-guided surgery. Used in this way, the
images obviously need to be available intra-operatively, and accurate registration of the real
patient with the data becomes a crucial issue.
Currently, VR is used much more for preoperative planning (see 3.4 below) than to guide actual
surgery (due to the understandable conservatism of medical practitioners). When VR is used
intra-operatively, it tends to be implemented as some form of Augmented Reality (see the
University of North Carolina system, below, and 2.4 above). Image-guided surgery is also a
prerequisite of remote telemedicine and collaboration (see 3.5 below).
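To illustrate why registration matters intra-operatively, here is a deliberately simplified sketch of the coordinate chain behind an augmented-reality overlay: a target defined in the preoperative scan is carried through the scan-to-patient registration and the tracked camera pose, then projected through a pinhole camera model to find the pixel at which to draw the overlay. All matrices, names and numbers below are invented for illustration; a real system would also need camera calibration, lens-distortion correction and latency compensation.

    import numpy as np

    def project_target(target_scan, R_reg, t_reg, R_cam, t_cam, K):
        """Carry a preoperative target through the coordinate chain
        scan -> patient -> camera, then project it to pixel coordinates."""
        p_patient = R_reg @ target_scan + t_reg      # scan-to-patient registration
        p_camera = R_cam @ p_patient + t_cam         # tracked patient-to-camera pose
        u, v, w = K @ p_camera                       # pinhole projection
        return np.array([u / w, v / w])

    # Invented numbers: identity registration, camera half a metre from the
    # patient, 1000-pixel focal length, image centre at (320, 256)
    K = np.array([[1000.0, 0.0, 320.0],
                  [0.0, 1000.0, 256.0],
                  [0.0, 0.0, 1.0]])
    pixel = project_target(np.array([0.01, 0.02, 0.0]),          # target in scan frame (m)
                           np.eye(3), np.zeros(3),                # scan -> patient
                           np.eye(3), np.array([0.0, 0.0, 0.5]),  # patient -> camera
                           K)
    print("draw the overlay marker at pixel", pixel)

Any error in the registration or tracking steps shifts the overlay on the live view, which is why registration accuracy dominates discussion of intra-operative VR.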
Image-guided surgery, implemented as Augmented Reality, at the University of N. Carolina
Brain tumour surgery guidance images
ARTMA (University of Vienna) system for Image-guided Ear, Nose & Throat Surgery
The ARTMA team at the University of Vienna were pioneers in this field. They refer to their
approach as Interventional Video Tomography (see abstract below). It is also applied to
telemedicine (see 3.5 below).
"Interventional Video Tomography"
SPIE Proceedings of Lasers in Surgery, 4-6 February 1995, San Jose,
CA
Paper #: 2395-34, pp.150-152
Author(s): Michael J. Truppe, Ferenc Pongracz, Artma
Medizintechnik GmbH, Wien, Austria; Oliver Ploder, Arne Wagner,
Rolf Ewers, Universität Wien, Vienna, Austria.
Abstract
Interventional Video Tomography (IVT) is a new imaging modality for
Image Directed Surgery to visualize in real-time intraoperatively the
spatial position of surgical instruments relative to the patient's anatomy.
The video imaging detector is based on a special camera equipped with an
optical viewing and lighting system and electronic 3D sensors. When
combined with an endoscope it is used for examining the inside of cavities
or hollow organs of the body from many different angles.
The surface topography of objects is reconstructed from a sequence of
monocular video or endoscopic images. To increase accuracy and speed of
the reconstruction the relative movement between objects and endoscope
is continuously tracked by electronic sensors. The IVT image sequence
represents a 4D data set in stereotactic space and contains image, surface
topography and motion data. In ENT surgery an IVT image sequence of
the planned and so far accessible surgical path is acquired prior to surgery.
To simulate the surgical procedure the cross sectional imaging data is
superimposed with the digitally stored IVT image sequence. During
surgery the video sequence component of the IVT simulation is substituted
by the live video source.
The IVT technology makes obsolete the use of 3D digitizing probes for
the patient image coordinate transformation. The image fusion of medical
imaging data with live video sources is the first practical use of augmented
reality in medicine. During surgery a head-up display is used to overlay
real-time reformatted cross sectional imaging data with the live video
image.
A main tool for traditional image-guided surgery is the microscope.
Microscopes are now being integrated with robotic transport systems.
Microscopes can also, in principle, serve as the vehicles for VR, as they
will increasingly allow 3D views and are already "in place" in the
operating theatre. Surgeons readily accept microscope-based views from
which they can easily look away, whereas they are less comfortable with
overlays placed on their primary direct view of the patient - the traditional
augmented reality approach.
The Zeiss MKM Microscope transport. It is a 6 degree of freedom robot with a surgical
stereomicroscope attached.
3.3 Education and Training
VR provides a unique resource for education about anatomical structure. One of the main
problems for medical education in general is to provide a realistic sense of the inter-relation of
anatomical structures in 3D space. With VR, the learner can repeatedly explore the structures of
interest, take them apart, put them together, view them from almost any perspective. This is
obviously impossible with a live patient, and is economically infeasible with cadavers (which, in
any case, have already lost many of the important characteristics of live tissue).
Another advantage of VR for medical education is that demonstrations and exercises or
explorations can easily be combined. For example, a "canned" tour of a particular structure,
perhaps with voice annotations from an expert, can be used to provide an overview. The learner
may then explore the structure freely and, perhaps later, be assigned the task of locating
particular aspects of this structure. It is also possible to preserve particularly instructive cases,
which would be impossible by other means.
There is something of a crisis in current surgical training. As techniques become more
complicated, more surgeons require longer training, yet fewer opportunities for such training
exist.
Training in the operating theatre itself brings increased risk to the patient and longer operations.
New surgical procedures require training by other doctors, who are usually busy with their own
clinical work. It is difficult to train physicians in rural areas in new procedures. Training
opportunities for surgeons are on a case-by-case basis. Animal experiments are expensive, and of
course the anatomy is different.
The solution to these problems is seen to be the development of VR training simulators. These
allow the surgeon to practice difficult procedures under computer control. The usual analogy
made is with flight simulators, where trainee pilots gain many hours of experience before
moving on to practice in a real cockpit.
Boston Dynamics open surgery anastomosis trainer
The advantages of training simulators are obvious. Training can be done anytime and anywhere
the equipment is available. They reduce the operative risks associated with the use of new
techniques, and hence surgical morbidity and mortality.
However, the big challenge is to simulate with sufficient fidelity for skills to transfer from
performing with the simulation to performing surgery on patients. Faithfulness is hard to achieve,
and much more evaluation of different approaches to training simulation is needed. Many
experienced surgeons predict that, in time, experience with training simulators will constitute a
component of medical certification, but this will require new regulations and legislation.
Hot topics in the area include the use of force feedback (see 2.3.4 above), increased accuracy of
modelling of soft tissue, and the role of auditory feedback.
For simple operations like suturing and biopsy needle placement, VR is effective, but perhaps
overkill for training skills that can easily and cheaply be acquired in other ways.
The most useful and tractable areas for the development of training simulators are the various
techniques of endoscopic surgery in widespread use today. It is relatively easy to reproduce in
VR the restricted field of view and limited tactile feedback of endoscopic surgery. It is much
more problematic to reproduce open surgery techniques realistically. For complex anatomical
structures, this is definitely not yet possible.
Karlsruhe Endoscopic Surgery Trainer
The pictures above illustrate both the value of simulators for training procedures and their
current weaknesses in terms of realism. To simulate an operation realistically, the method of
interaction should be the same as in the real case (as with flight simulators). When this is not the
case, the VR can serve as an anatomy educational system rather than as a training simulator.
One way of increasing the reality of interaction is to combine VR with physical models, as
illustrated in the Gatech simulators for endoscopy and eye surgery, and the Penn State University
bronchoscopy simulator (see below). These systems focus on training the surgeon in the use of
particular medical devices, rather than on training a better awareness of general or specific
patient anatomy.
Gatech: Endoscopic Surgical Simulator
Gatech: Eye Surgery Simulator
Bronchoscope Simulator from Penn State University Hospital at Hershey
An example of an anatomy educational system is the EVL eye (shown below) from the
University of Illinois. Since the VR is immersive and based around the CAVE, it cannot be said
to duplicate the interaction methods of real eye surgery (since surgeons cannot get physically
inside eyes) and so is not a training simulator, unlike the Georgia Tech system above.
The EVL eye, from the Electronic Visualisation Lab, University of Illinois at Chicago
The EVL eye used by a group in the CAVE
More realistically in terms of interaction, the Responsive Workbench is another candidate for
anatomy teaching (see below). As with CAVE-based applications, a shared VR enhances the
potential for collaborative learning.
The Responsive Workbench from GMD in Germany
The most technologically challenging area of simulator training is for highly specialised aspects
of life-critical operations such as brain surgery. The Johns Hopkins/KRDL skull-base surgery
simulator for training aneurysm clipping (see below) is one example. The interaction is entirely
with the VR itself.
JHU/KRDL Skull-base Surgery Simulator
Researchers at University of California San Diego Applied Technology Lab have developed an
interesting Anatomy Lesson Prototype [http://cybermed.ucsd.edu/AT/AT-anat.html]. They point
out that the main challenges they identified from talking to medical faculty and students included
visualising potential spaces; studying relatively inaccessible areas; tracing layers and linings;
establishing external landmarks for deep structures; and cogently presenting embryological
origins. Correlating gross anatomy with various diagnostic imaging modalities, and portraying
complex physiological processes using virtual representation were also considered highly
valuable goals.
Relevant Web Sites:
VR and Education
VR in Surgical Education
3.4 Preoperative planning
Simulators such as the JHU/KRDL Skull-base Surgery Simulator blur into systems for pre-
operative planning. Planning systems also sometimes blend with augmented reality, since the
planning is for an actual, particular patient, so that physical reality (the patient) and the VR
naturally come together. The aim of such planning is to study patient data before
surgery and so plan the best way to carry out that surgery.
Preoperative planning must:
Use actual patient data to be operated upon
Not use an idealised model, or atlas, or Visible Human dataset
Be fast
Be accurate
Be multimodal (different data sources): to show blood vessels, soft tissue, bone, etc.
Convey as much information as possible
Radionics' Stereoplan - a 'pure' planning system
The aim of Stereoplan is to allow surgeons to examine patient data as fully as possible, and
evaluate possible routes for intervention. Further, the system then provides the coordinates for
the stereotactic frames that are routinely used to guide the route for brain surgery. Like
Stereoplan, the KRDL Brainbench, built around the Virtual Workbench, aims to support the
planning of stereotactic frame-based functional neurosurgery (see below).
KRDL Brainbench for stereotactic frame-based neurosurgery planning
Combined neurosurgery planning and augmented reality from Harvard Medical School
In pre-operative planning the interaction method need not be realistic and generally is not. The
main focus is on exploring the patient data as fully as possible, and evaluating possible
intervention procedures against that data, not in reproducing the actual operation. The University
of Virginia "Props" interface illustrates this (below). A doll's head is used in the interaction with
the dataset, without any suggestion that the surgeon will ever interact with a patient's head in
quite this way.
University of Virginia "Props" interface used in pre-operative planning
KRDL VIVIAN: the Virtual Workbench used for stereotactic tumor neurosurgery planning
Of course, the simulation must be accurate. Given this, techniques developed for planning can
sometimes be applied to the prediction of outcomes of interventions, as in bone replacements or
reconstructive plastic surgery. Such simulations can also help in training, and in communications
between doctors and patients (and their families).
An important aspect of such systems for use by medical staff is the design of the tools and how
this affects usability. See "Interaction Techniques for a Virtual Workspace".
3.5 Telemedicine and Collaboration
Telemedicine is surprisingly little used today in actual medical practice. According to a recent
article (The Economist, February 28th 1998) less than 1 in 1000 radiographs are viewed by a
distant, rather than a local, specialist. This is despite the proven ability of telemedicine to save
doctors' time and, hence, money (for example, from a recent study in Tromso on teleradiology).
Similarly, home visits can be successfully replaced with remote consultations, saving money and
increasing aged patients' satisfaction (because they can get more frequent consultations without
troublesome travel), but currently only 1 in 2000 home visits are conducted remotely through
information technology. Telemedicine is successfully used in military settings, where normal
legal and economic considerations do not apply.
One promising area where VR could make a contribution is in remote diagnostics, where two
surgeons can confer on a particular case, each experiencing the same 3D visualisation, although
located in different places.
The other main applications, often discussed, are remote operations, either through robotic
surgery or through assistance to another surgeon at a distance. The big problem here is network
delay, since almost immediate interactivity is required. Even the delay introduced by the use of
satellite communication is unacceptable in remote surgery. Talk of remote operations carried out
on space crews in deep space, or even merely on Mars, is pure science fiction (it would require
faster-than-light communication).
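The scale of the delay problem is easy to check with a back-of-the-envelope calculation (assumed figures: a geostationary relay at about 35,786 km altitude, and Earth-Mars distances ranging from roughly 55 million to 400 million km):

    C_KM_PER_S = 299_792.458   # speed of light

    def round_trip_seconds(one_way_km):
        """Round-trip signal delay over the given one-way path length."""
        return 2 * one_way_km / C_KM_PER_S

    geo_hop_km = 35_786                      # ground-to-geostationary distance
    print("geostationary relay: %.2f s" % round_trip_seconds(2 * geo_hop_km))
    print("Mars, closest approach: %.0f s" % round_trip_seconds(55e6))
    print("Mars, near maximum distance: %.0f s" % round_trip_seconds(400e6))

Roughly half a second via a geostationary satellite is already disruptive for closed-loop manipulation; the six-minute-plus round trip to Mars at closest approach rules out interactive control entirely.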
Robots are used more routinely non-remotely, for precision in carrying out certain procedures,
such as hip replacement. The types of operation to which robots are applied in this way are
usually high volume, repeated procedures. As well as improved accuracy, major cost savings can
be produced.
A relatively new development is to use surgeon-controlled robots to carry out, by key-hole
methods, operations which previously required open surgery. VR becomes important here in
providing a detailed 3D view to guide the surgeon in carrying out the operation via extremely
small robotic instruments. Major operations, such as coronary bypass, can be carried out in this
way with significantly reduced trauma and recovery time for the patient.
The technical possibility already exists for unsupervised robots to carry out surgery, but much
ethical and legal debate and legislation will be needed before this could be put into practice. This
survey does not focus primarily on telerobotics, which is itself a large field.
Remote surgeon workstation at the University of Virginia
The two upper images are live video provided via an ATM link. They show a view through the
surgical microscope and a room view. The remote surgeon may pan/tilt/zoom the room camera
and may move the microscope view with 6 degrees of freedom. The bottom windows
respectively show presurgical imaging with functional overlays, volume rendering of the surgical
plan, and a snapshot archive taken during surgery.
The Artma Virtual Patient System is an established technology for telesurgery.
The SRI Telepresence project is representative of current work in this area (see below).
SRI Telepresence: telerobotics and stereo video interface, surgeon interaction
SRI Telepresence: telerobotics and stereo video interface, patient interaction
3.6 Snapshot of the State of the Art: Conference Report on
Scientific and Clinical Tools for Minimally Invasive
Therapies, Medicine Meets VR 1998
Much of the research in this area, especially in the USA, is funded by the government for
military applications such as remote surgery on the battlefield. The main agencies are DARPA
and Yale-NASA (mostly the same projects that previously got funding from DARPA). Progress
reports on these projects, many of which have been running for several years, were presented at
the conference and included:
a "smart" T-Shirt that senses the path of the bullet that hit its wearer, monitors his
condition and location, etc., so that rescue teams can decide if he's worth rescuing, and be
prepared, and combat units can knock out the location from which he was attacked;
various personal monitoring devices, including a wearable system for astronauts;
a Limb Trauma simulator using the PHANToM (by MusculoGraphics, in Boston);
a stretcher with monitoring systems; and
an enhanced dexterity robot called ParaDex, among others.
HT Medical also gets funding from these agencies, as do SRI International, Boston
Dynamics, the HIT Lab, and others.
During the conference as a whole, a broad selection of current work, both academic and
commercial, was presented. There were many endoscopic simulators, for the knee, shoulder,
colon, abdomen... All had some force feedback that wasn't convincing as real tissue (from
what doctors said) but apparently helped in training (from what the engineers said).
Tactile tissue simulation was one of the key phrases. Everybody is trying to figure out how to do
it, but I didn't see (feel) any convincing implementation. Force feedback is the latest craze, but
the sensitivity to model subtle gradations just isn't there yet. An interesting alternative is to use
sound as feedback.
Also, many atlases of the whole human body (and one of a frog) were presented. Most used the
Visible Human, but others (notably the Japanese) had their own data sets.
One interesting point raised by the team at SRI is that the key problem in training
surgeons is not how to convey the motor skills needed to manipulate an endoscope or how
to cut with a scalpel, but how to understand patient anatomy. Training the hands to use an
endoscope takes a week or so, but learning how to interpret a patient's anatomy takes years. I
agree with this assessment, and I think that is where rich interaction capabilities combined with
real-time volumetric rendering of multimodal data are crucial.
SRI International have tested their telepresence system with live animals using a 200 metre link.
Their results are published in the Journal of Vascular Surgery. Dr. John Hill of SRI presented
their first attempts to move towards computer-generated graphics training simulators using their
telepresence system. They use a set-up similar to the ISS Virtual Workbench, but with their own
interaction devices. They are working on simulating suture of tissue and vessels using an Onyx
and 2D texture maps.
Dr. Ramin Shahidi, of Stanford University Medical Center, is working on SGI-based volume
rendering for neurosurgery and craniofacial applications. His graphics didn't include more than
one volume at a time. His presentation was an overview of the use of volume rendering vs.
surface rendering.
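For readers unfamiliar with the distinction, volume rendering works directly on the voxel data, whereas surface rendering first extracts polygon meshes and draws those. The toy NumPy fragment below (my own illustration, unrelated to Shahidi's software) performs the simplest possible volume rendering, a maximum-intensity projection, on a synthetic scan containing a bright lesion; practical systems instead ray-cast with opacity transfer functions, shading and interactive frame rates.

    import numpy as np

    # Synthetic 64^3 "scan": dim background tissue plus a bright spherical lesion
    n = 64
    z, y, x = np.mgrid[0:n, 0:n, 0:n]
    volume = 20.0 + 5.0 * np.random.rand(n, n, n)
    volume[(x - 40) ** 2 + (y - 24) ** 2 + (z - 32) ** 2 < 8 ** 2] = 200.0

    # Maximum-intensity projection along the viewing (z) axis: every voxel along
    # each ray contributes, with no surface-extraction step at all
    mip = volume.max(axis=0)
    print("projected image:", mip.shape, "brightest pixel:", mip.max())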
NASA-Ames and Stanford University have created the National Biocomputation Center: Dr.
Muriel Ross announced this centre as a resource for collaboration with academics and
industry, to promote medical VR. NASA-Ames have an Immersion Workbench (aka Responsive
Workbench, aka Immersadesk) and their own visualisation software, and are working on
craniofacial "virtual" surgery. It appears that they use polygon meshes for their visualisation.
Dr. Henry Fuchs presented work in progress at UNC that uses depth range finders to reconstruct
a surface map of the intestines to then guide an endoscope for colonoscopy. All this was added to
their well-known augmented reality system, and constitutes an interestingly novel approach.
HT Medical presented their VR Simulation of Abdominal Trauma Surgery. They use the
PHANToM and some "wet" graphics to remove a kidney. They simulate the "steps" taken by the
surgeon. First the surgeon cuts the skin, which then opens, revealing the intestines. A wet
graphics effect is used, but this looks more like "cling film" wrapped over everything. The
intestines moved quite unconvincingly, in an animation only loosely under the control of the
user (it did not appear that inverse kinematics attached the end-point of the intestines to the
user's tool). The kidney was removed by simply "reaching into it" and moving it out. The
practical value of such a demonstration was not clear, however.
An impressive paper from Wegner and Karron of Computer Aided Surgery Inc., described the
use of auditory feedback to guide blind biopsy needle placement. Their audio feedback system
generates an error signal in 3D space with respect to a planned needle trajectory. This error
signal and the preoperative plan drive a position sonification algorithm which
generates appropriate sounds to guide the operator in needle placement. To put it simply,
harmonics versus dissonances are used to convey position information accurately along 6-8
dimensions. A nice example of a synaesthetic medium - using one modality (sound) where one
would normally expect another (touch and/or vision). Their approach has wide applicability.
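As a rough sketch of how such a mapping might sound in practice (my own toy illustration, not Wegner and Karron's actual algorithm), one dimension of the error - the distance from the planned trajectory - can be mapped to the mistuning of a probe tone against a reference tone, so the interval is consonant only when the needle is on the plan:

    def sonify_error(error_mm, max_error_mm=20.0, base_hz=440.0):
        """Map distance from the planned needle trajectory to a pair of tones:
        a consonant perfect fifth (3:2) on target, drifting towards a harsh
        semitone (16:15) as the error grows."""
        e = min(abs(error_mm), max_error_mm) / max_error_mm   # 0 on target .. 1
        ratio = 1.5 * (1 - e) + (16 / 15) * e
        return base_hz, base_hz * ratio

    for err in (0.0, 5.0, 20.0):
        ref, probe = sonify_error(err)
        print("error %4.1f mm -> play %3.0f Hz and %5.1f Hz" % (err, ref, probe))

Additional error dimensions could modulate other independent sound parameters (timbre, pulse rate, spatial position), which is presumably how 6-8 dimensions become feasible.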
Myron Krueger is President of Artificial Reality Corporation and a claimant to the title of
inventor of VR. The system he described was a training system for dealing with emergencies,
where smells of, for example, petrol or the contents of the lower intestine, can provide valuable
information in a hazardous situation. It has also been claimed that smell can significantly
enhance the effectiveness of medical training systems.
Many manufacturers presented their latest demonstrations and products at the conference
exhibition. HT demonstrated CathSim, a simulator that trains nursing students to perform
vascular catheterisations. They built a special force feedback device and some simple graphics
to provide visual feedback. It guided the needle quite well, but gave little or no feedback
once the needle was inside the skin. This seemed like "technological overkill", since the
procedure is easily learned without VR and is not exactly hazardous.
They also demonstrated a Flexible Bronchoscopy simulator developed with a partnership of
pulmonologists and pharmacology experts at Merck & Co. (based on the Visible Human
Project). They have a way to track the flexible tip of the endoscope ("a secret", I was told when I
asked) and they generate nice 2D texture-mapped graphics of the interior of the throat using an
SGI Impact.
Fraunhofer had two demonstrations from their Providence office:
TeleInVivo, demonstrating a PC software volume renderer (a few seconds per rendering
for small window areas) attached to an ultrasound probe.
Interventional Ultrasound: a guidance system for biopsy needle insertion using
ultrasound tracking (not much of an implementation at the moment) - essentially the old
idea of using ultrasound to guide a biopsy needle. They overlay the ultrasound view with
the biopsy needle path, something that UNC demonstrated at SIGGRAPH, but without
the expensive head gear.
Matthias Wapler, of the IPA branch in Stuttgart, also described a robot for precise endoscopy
and neurosurgical navigation. They have not yet developed planning software for their system.
Loral were at the Immersion booth, presenting a training system that uses Immersion Corp.'s
force feedback device. The application lets the surgeon guide an endoscope through the nose of a
patient. The simulation was "helpful" to surgeons, although it is rather crude and doesn't feel like
the real thing.
Prosolvia (the main Swedish VR company) demonstrated a system for Virtual Arthroscopy of the
shoulder, developed with the University Hospital of Linköping. They used the Immersion Corp.
force-feedback system, and their own Oxygen software base. They are interested in collaborating
on VR medical training systems.
Four demonstrations were shown at the SensAble booth:
The Ophthalmic surgical simulator. This project combines N-Vision's US$25,000 stereo
display (binoculars with 1280x1024 resolution; there is a cheaper version for VGA
graphics at US$15,000) with the PHANToM, and a nice simulation of the feel of an eye.
The computer platform is Intergraph. Since the PHANToM doesn't provide torque
feedback, I didn't really appreciate the usefulness of the feedback system while cutting
around the cornea. However, prodding the eye produced convincing force feedback.
MusculoGraphics surgery simulation solutions. Their Limb Trauma simulator DIDN'T
have force feedback, so the PHANToM was used as a 3D pointer. The simulation
consisted of picking up a bullet and stopping the bleeding of a blood vessel. I thought the
system was unrealistic and of limited usefulness.
Their IV catheter insertion system HAD force feedback, and was quite convincing.
Spine Biopsy Simulator, by the Georgetown University Medical School, for educational
use. The aim is to mimic an actual spine biopsy procedure and improve overall learning
by students. Unfortunately, their demo wasn't working.
Virtual Presence presented two useful tools:
VolWin, a volume rendering package (US$700) on the PC that is based on the Voxar
API. The performance was really good, running on a plain PC. A 256x256x256 volume
was rendered at some 5-6 fps, with some aliasing effects, but basically usable quality.
A package that tests the surgeon's performance using the Immersion Corp. laparoscopy
device. No fancy graphics, the idea being to measure performance in hitting targets. An
excellent simple idea for laparoscopy training.
Gold Standard Multimedia have produced a CD-ROM with a segmentation of the Visible Male.
The package volume renders the views and structures chosen, on a PC platform.
Sense8's medical customers are the National Centre for Biocomputation (NASA, Stanford
University), Rutgers University, Center for Neuro-Science, and Iowa School of Dentistry. A
knee simulator was presented. Unfortunately, it broke early in the conference.
Vista Medical Technologies had a good head mounted display to substitute for the microscope. It
is not head tracked, but it allows the surgeon to look both through the microscope and outside it.
It also allows Picture-in-Picture, so that an endoscope can be used to supplement the microscope.
There was a nice demonstration of 3D sonification from Lake Acoustics of Australia, who were
also involved in the 3D sound feedback for biopsy needle placement described briefly above (the
paper by Wegner and Karron). Using their kit, it is very simple to place sounds in a three-
dimensional landscape surrounding the body to the front (as with normal stereo) and to the back
(as with cinema surround sound) but using only headphones. They were giving away diskettes
containing an impressive demonstration of this system.
3.7 Physical and Mental Health and Rehabilitation
It is clear that this is one of the medical areas where VR can most immediately and successfully
be applied today. This is partly because the technical demands, particularly in terms of detailed
visualisation and interactivity, are actually less stringent than in some other areas, such as
surgery. Often these systems simulate the physical environment - a world of rooms, doors,
buildings, and so on - much of which consists of simple shapes far easier to model than the
irregular, contoured surfaces of internal organs. Such objects also tend to be rigid, so the physics
that must be understood and modelled is much simpler, and the complexity of interacting with
them is much lower.
Main Application areas:
Mental health therapy: fear of heights, fear of flying, various other phobias. Eating
disorders. Stress control, Irritable Bowel Syndrome. Autism.
Patient rehabilitation: treadmills, wheelchairs, people with disabilities (CACM Aug
1997).
Parkinson's disease, stroke therapy - with physiological feedback.
Exploration and communication of unusual mental/body states is also a potential application
area.
Examples:
A Treatment of Akinesia Using Virtual Images
1998 "Technology and Persons with Disabilities" Conference
Acrophobia Virtual Environment -- Final Report
Autism and VR
The Use of Virtual Reality in the Treatment of Phobias
VR and Disabilities
VR Exposure Therapy
VR in Eating Disorders
VR in Stroke Disorders
3.7.1 Snapshot of the State of the Art: Conference Report on Mental Health session,
Medicine Meets VR 1998
Topics covered at this year's conference included treatment of phobias, psychological
assessment, and cognitive rehabilitation.
The session also provided an opportunity for the launch of the new CyberPsychology and
Behavior journal, the first issue of which includes a useful summary of the use of VR as a
therapeutic tool.
Brenda Wiederhold presented a good paper on using VR to go beyond the standard "imaginal"
training of phobic patients. The advantages of VR are, first, that fear can be effectively activated
(which is necessary to bring about change) but can be controlled (too much fear reinforces the
phobia) and, second, physiological measures can be used to control the display. One simple
measure of anxiety, first used by Jung, is a drop in skin resistance.
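The control loop implied here is straightforward in outline. The sketch below is hypothetical (my own, not Wiederhold's protocol): estimate arousal from skin conductance (the inverse of the skin-resistance measure just mentioned), increase the intensity of the virtual exposure while the patient remains calm, and back off when arousal crosses a threshold, keeping fear activated but under control.

    def update_exposure(level, conductance_uS, baseline_uS,
                        arousal_threshold=1.5, step=0.05):
        """One control step: raise the exposure intensity (0..1) while the
        patient stays calm, lower it quickly when arousal is too high."""
        arousal = conductance_uS / baseline_uS    # rises as skin resistance drops
        if arousal > arousal_threshold:
            level -= 2 * step                     # too anxious: back off
        else:
            level += step                         # calm: push a little further
        return max(0.0, min(1.0, level)), arousal

    # Simulated session: skin-conductance readings in microsiemens, baseline 4.0
    level = 0.2
    for reading in [4.1, 4.3, 5.0, 6.8, 7.2, 5.5, 4.6]:
        level, arousal = update_exposure(level, reading, 4.0)
        print("arousal %.2f -> exposure level %.2f" % (arousal, level))

In a phobia application the "exposure level" might be the height of a virtual ledge or the stage of a virtual flight; the essential point is that the physiological signal, not the therapist's guess, closes the loop.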
Similar work on claustrophobia and fear of heights was described by Bulligen of the University
of Basle. Another paper on acrophobia (fear of heights) by Huang et al. of the University of
Michigan described comparisons of real and virtual environments for emotional desensitisation,
and questioned the need for a high level of realism. Using the CAVE environment, they
compared the same views in VR and in reality. See their Web page for views.
A rather pleasant system from Japan, the "Bedside Wellness" system by Ohsuga et al, allows
bedridden patients to take a virtual forest walk while lying on their backs in bed. An array of
three video screens presents the unfolding view of the forest as the patient gently steps on two
foot pedals. There is also 3D sound of birds, streams and wind in the trees. A slot below the
central screen delivers a gentle breeze scented with pine to the "walking" patient.
Rizzo, of the University of Southern California, is using VR to give increased ecological validity
to standard tests applied to Alzheimer's Disease patients, such as the mental rotation task (where
the patient has to decide if a second figure is a rotated version of an earlier figure, or is different
in shape). This Immersadesk application seemed like technological overkill to me. However, a
fuller paper by Rizzo et al in the CyberPsychology and Behavior journal lists several advantages
of VR for cognitive and functional assessment and rehabilitation applications:
1. ecologically valid and dynamic testing and training scenarios, difficult to present by
other means
2. total control and consistency of administration
3. hierarchical and repetitive stimulus challenges that can be readily varied in complexity,
depending on level of performance
4. provision of cueing stimuli or visualisation tactics to help successful performance in an
errorless learning paradigm
5. immediate feedback of performance
6. ability to pause for discussion or instruction
7. option of self-guided exploration and independent testing and training
8. modification of sensory presentations and response requirements based on user's
impairments
9. complete performance recording
10. more naturalistic and intuitive performance record for review and analysis by the user
11. safe environment, although realistic
12. ability to introduce game-like aspects to enhance motivation for learning
13. low-cost functional training environments
Also on the topic of psychological assessment, Laura Medozzi et al, from Milan, described what
seemed to be high quality work to compare traditional tests with VR-based testing. The case of a
patient suffering frontal lobe dysfunction several years after a stroke was used to make the point
that traditional tests often fail to reveal deficits that can be identified with VR. This is thought to
be due to the nonverbal and immersive realism of VR, as opposed to traditional testing, where the
presence of a human examiner inadvertently provides surrogate control over higher-order
faculties, largely through verbal exchanges. The same group, in collaboration with workers
under David Rose at the University of East London, described the use of VR to aid cognitive
rehabilitation.
Joan McComas of the University of Ottawa described a VR system for developing spatial skills
in children. She had carried out a four-condition study where choice of location to move to was
either passive or active, as was navigation to that location. The four conditions were: passenger
(passive choice/passive movement), navigator (active choice/passive movement), driver (active
choice/active movement) and navigated driver (passive choice/active movement). The task was
to find things hidden at locations, without visiting the same location twice. The measures were
the percentage of correct choices and the visit on which the first error occurred. It occurred to me
that we could use this sort of
approach in studies of exploration in 3D information landscapes. A paper by Weniger also struck
a chord by comparing spatial learning (maze navigation) with exercise of the executive function
(the maze with pictograms) and with the use of orientation skills (navigation of landscapes).
Giuseppe Riva, from the Applied Technology for Psychology Lab at the Istituto Auxologico
Italiano in Verbania, also discussed the use of VR for psychological assessment - particularly the
development of the Body Image Virtual Reality Scale. Patients choose which virtual body they
think matches their own, and which they would prefer to have instead. The difference gives a
measure of body image distortion.
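In computational terms the measure is simply the gap between two selections from an ordered series of virtual bodies. A trivial sketch under that assumption (the published scoring rules of the scale are not given here, so the function below is only illustrative):

    def body_image_distortion(perceived_index, preferred_index):
        """Difference between the virtual body the patient picks as their own and
        the body they would prefer, on a series ordered from thinnest (0) to
        largest (N); a large positive score suggests the patient feels far
        larger than the body they want to have."""
        return perceived_index - preferred_index

    # A patient who picks body 7 of a 9-body series as "mine" and body 2 as ideal
    print("distortion score:", body_image_distortion(perceived_index=7,
                                                     preferred_index=2))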
Greene and Heeter, of the Michigan State University Communication Technology Lab, described
CD-ROMs that contain VR-like stories of cancer sufferers, particularly in relation to coping with
pain. Details can be found at [http://www.commtechlab.msu.edu/products/]. An interesting paper
by Hunter reported the finding that VR can be very effective in helping burn-recovery patients
cope with the pain of treatment. Patients in the VR condition reported significant pain reduction
and less time spent thinking about pain.
Pope described the use of a VR system called "Viscereal" to provide physiological feedback.
Users could control the flow of blood to their hands, and hence could warm or cool them at will.
It has also been found to be effective in permitting conscious control of bowel activity, easing
clinically harmless but distressing conditions such as Irritable Bowel Syndrome.
The Woodburys, a husband and wife team from the Puerto Rican Institute of Psychiatry, mused
on modern cosmology and the origins of our three dimensionality. They gave the conference a
useful reminder that the 3D world is in our heads, not in the world "out there". Pathological
psychological states - especially various psychoses - and altered states of consciousness
produced by certain hallucinogenic drugs, make this clear as the world around the experiencer,
and his sense of his body and its place in that world, falls apart in typical psychotic panic states.
Following Pribram, the Woodburys view the 3D world we know so well as a holographic
projection, formed in the brain according to principles established through evolution as aiding
survival. While recognising that this world is an illusion, psychiatrists work to restore it in
patients whose world has literally collapsed.
Although not among the presenters, one member of the audience, Rita Addison, talked about the
use of VR to communicate the reality of mental deficits to others. Rita has visited the
VRLab in Umeå and is well known for her "Detour: Brain Deconstruction Ahead", which
reproduces for others the visual problems she has experienced since a car accident a few years ago.

Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptxContemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
Contemporary philippine arts from the regions_PPT_Module_12 [Autosaved] (1).pptx
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application )
 
mini mental status format.docx
mini    mental       status     format.docxmini    mental       status     format.docx
mini mental status format.docx
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory Inspection
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"Mattingly "AI & Prompt Design: The Basics of Prompt Design"
Mattingly "AI & Prompt Design: The Basics of Prompt Design"
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
Industrial Policy - 1948, 1956, 1973, 1977, 1980, 1991
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communication
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 

VR Applications in Medical Surgery, Training and Rehabilitation

  • 3. "unnatural" view of the locus of operation. Another reason is that endoscopic surgery is relatively easy to simulate because of the limited access, restricted feedback (especially tactile) and limited freedom of movement of instruments. Endoscopic simulators are being produced by all the main medical VR companies, usually with a focus on training. Another recent trend is towards so-called Virtual Endoscopy. This is a technique whereby data from non-intrusive sources - such as scans - are combined into a virtual data model that can be explored by the surgeon as if an endoscope were inserted in the patient. VR is increasingly being used to provide surgeons with a meaningful and interactive 3D view of areas and structures they would otherwise be unable or unwilling to deal with directly. In radiosurgery, X-ray beams from a Linear Accelerator are finely collimated and accurately aimed at a lesion. Popular products include Radionics X-knife, and Elekta`s Gammaknife. Planning radiosurgery is suitable for VR, since it involves detailed understanding of 3D structure. Elekta's Gammaknife(left ) and the X-knife from Radionics (right) VR in surgery differs from most other VR in its focus on contact with objects, which must often be deformable objects and interdependent. The focus is on looking into objects rather than looking into space - there is less room available. The data is essentially volumetric and finger and hand interaction must be extremely precise. The above characteristics bring with them certain technical requirements, such as real-time response to user`s action - which implies fast graphics, low latency input devices. The images must be of high resolution and faithful to the actual patient data, since life-critical decisions are based on the presentation of patient data. For simulators, the physical procedures must match those used in the actual operation.
Other requirements of VR for surgery include registration of patient data with atlases and the ability to coregister multimodal data. For use over extended periods, which is often needed in surgery, the style of user interaction should be natural, comfortable, and easy to use.

Areas where VR is being applied:
Image-guided surgery - guiding surgeons to targets during actual operations
Training simulators - practicing difficult procedures
Preoperative planning - studying patient data before surgery
Telesurgery - operating remotely, or assisting other surgeons remotely

3.2 Image-guided surgery

VR can in principle be applied to enhance reality for image-guided surgery. When applied to image-guided surgery in this way, the images obviously need to be available intra-operatively, and accurate registration of the real patient with the data becomes a crucial issue. Currently, VR is used much more for preoperative planning (see 3.4 below) than to guide actual surgery (due to the understandable conservatism of medical practitioners). When VR is used intra-operatively, it tends to be implemented as some form of Augmented Reality (see the University of North Carolina system, below, and 2.4 above). Image-guided surgery is also a prerequisite of remote telemedicine and collaboration (see 3.5 below).
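Registration is worth making concrete. The fragment below is a minimal sketch, assuming corresponding anatomical landmarks have already been identified on the patient (in tracker coordinates) and in the preoperative scan; it recovers the rigid rotation and translation between the two spaces with the standard SVD least-squares solution. The function and landmark values are illustrative, not taken from any of the systems described here. Once R and t are known, any tracked instrument tip can be displayed within the patient's image data, which is the basis of the image-guided systems below.

```python
import numpy as np

def rigid_registration(patient_pts, image_pts):
    """Estimate R, t such that image_pts ~= R @ patient_pts + t.

    patient_pts, image_pts: (N, 3) arrays of corresponding landmarks,
    e.g. skin fiducials located both on the patient (tracker space)
    and in the preoperative scan (image space).
    """
    p_mean = patient_pts.mean(axis=0)
    q_mean = image_pts.mean(axis=0)
    P = patient_pts - p_mean
    Q = image_pts - q_mean
    H = P.T @ Q                      # 3x3 covariance of the paired points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:         # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = q_mean - R @ p_mean
    return R, t

# Illustrative use: map a tracked probe tip into scan coordinates.
patient_landmarks = np.array([[0.0, 0.0, 0.0], [80.0, 0.0, 0.0],
                              [0.0, 60.0, 0.0], [0.0, 0.0, 40.0]])
image_landmarks = patient_landmarks @ np.array(
    [[0, -1, 0], [1, 0, 0], [0, 0, 1]]).T + np.array([12.0, -5.0, 30.0])
R, t = rigid_registration(patient_landmarks, image_landmarks)
probe_tip_patient = np.array([10.0, 20.0, 5.0])
probe_tip_image = R @ probe_tip_patient + t
```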
Image-guided surgery, implemented as Augmented Reality, at the University of North Carolina

Brain tumour surgery guidance images
ARTMA (University of Vienna) system for Image-guided Ear, Nose & Throat Surgery

The ARTMA team at the University of Vienna were pioneers in this field. They refer to their approach as Interventional Video Tomography (see the abstract below). It is also applied to telemedicine (see 3.5 below).

"Interventional Video Tomography"
SPIE Proceedings of Lasers in Surgery, 4-6 February 1995, San Jose, CA. Paper 2395-34, pp. 150-152.
Authors: Michael J. Truppe, Ferenc Pongracz, Artma Medizintechnik GmbH, Wien, Austria; Oliver Ploder, Arne Wagner, Rolf Ewers, Universität Wien, Vienna, Austria.

Abstract: Interventional Video Tomography (IVT) is a new imaging modality for Image Directed Surgery to visualize in real-time intraoperatively the spatial position of surgical instruments relative to the patient's anatomy. The video imaging detector is based on a special camera equipped with an optical viewing and lighting system and electronic 3D sensors. When combined with an endoscope it is used for examining the inside of cavities or hollow organs of the body from many different angles. The surface topography of objects is reconstructed from a sequence of monocular video or endoscopic images. To increase accuracy and speed of the reconstruction, the relative movement between objects and endoscope is continuously tracked by electronic sensors.
The IVT image sequence represents a 4D data set in stereotactic space and contains image, surface topography and motion data. In ENT surgery an IVT image sequence of the planned and so far accessible surgical path is acquired prior to surgery. To simulate the surgical procedure, the cross-sectional imaging data is superimposed on the digitally stored IVT image sequence. During surgery the video sequence component of the IVT simulation is substituted by the live video source. The IVT technology makes obsolete the use of 3D digitizing probes for the patient-image coordinate transformation. The image fusion of medical imaging data with live video sources is the first practical use of augmented reality in medicine. During surgery a head-up display is used to overlay real-time reformatted cross-sectional imaging data on the live video image.
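The overlay step the abstract describes, superimposing reformatted imaging data on live video, ultimately reduces to projecting points from patient space into the camera image using the tracked pose. Below is a minimal pinhole-projection sketch; the pose, intrinsics and target values are illustrative and are not ARTMA's actual calibration or algorithm.

```python
import numpy as np

def project_to_image(point_world, R_cam, t_cam, fx, fy, cx, cy):
    """Project a 3D point (patient/world space, mm) into pixel coordinates.

    R_cam, t_cam: pose of the tracked camera (world -> camera).
    fx, fy, cx, cy: camera intrinsics obtained from calibration.
    Returns (u, v) pixel coordinates, or None if the point is behind the camera.
    """
    p_cam = R_cam @ point_world + t_cam
    if p_cam[2] <= 0:
        return None
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return u, v

# Illustrative values: an identity-orientation camera 300 mm from the target.
target = np.array([10.0, -5.0, 0.0])          # planned target in patient space
R_cam = np.eye(3)
t_cam = np.array([0.0, 0.0, 300.0])
print(project_to_image(target, R_cam, t_cam, fx=800, fy=800, cx=320, cy=240))
```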
A main tool for traditional image-guided surgery is the microscope. Microscopes are now being integrated with robotic transport systems. Microscopes can also, in principle, serve as vehicles for VR, as they will increasingly allow 3D views and are already "in place" in the operating theatre. Surgeons readily accept microscope-based views from which they can easily look away, whereas they are less comfortable with overlays placed on their primary direct view of the patient - the traditional augmented reality approach.

The Zeiss MKM microscope transport: a 6-degree-of-freedom robot with a surgical stereomicroscope attached

3.3 Education and Training

VR provides a unique resource for education about anatomical structure. One of the main problems for medical education in general is to provide a realistic sense of the inter-relation of anatomical structures in 3D space. With VR, the learner can repeatedly explore the structures of interest, take them apart, put them together, and view them from almost any perspective. This is obviously impossible with a live patient, and is economically infeasible with cadavers (which, in any case, have already lost many of the important characteristics of live tissue).

Another advantage of VR for medical education is that demonstrations and exercises or explorations can easily be combined. For example, a "canned" tour of a particular structure, perhaps with voice annotations from an expert, can be used to provide an overview. The learner may then explore the structure freely and, perhaps later, be assigned the task of locating particular aspects of this structure. It is also possible to preserve particularly instructive cases, which would be impossible by other means.

There is something of a crisis in current surgical training. As the techniques become more complicated, and more surgeons require longer training, fewer opportunities for such training exist. Training in the operating theatre itself brings increased risk to the patient and longer operations. New surgical procedures require training by other doctors, who are usually busy with their own clinical work. It is difficult to train physicians in rural areas in new procedures. Training opportunities for surgeons are on a case-by-case basis. Animal experiments are expensive, and of course the anatomy is different.

The solution to these problems is seen to be the development of VR training simulators. These allow the surgeon to practice difficult procedures under computer control. The usual analogy made is with flight simulators, where trainee pilots gain many hours of experience before moving on to practice in a real cockpit.
Boston Dynamics open surgery anastomosis trainer

The advantages of training simulators are obvious. Training can be done anytime and anywhere the equipment is available. They make it possible to reduce the operative risks associated with the use of new techniques, reducing surgical morbidity and mortality. However, the big challenge is to simulate with sufficient fidelity for skills to be transferred from performing with the simulation to performing surgery on patients. Faithfulness is hard to achieve, and much more evaluation of different approaches to training simulation is needed.

Many experienced surgeons predict that, in time, experience with training simulators will constitute a component of medical certification. But this will require new regulations and legislation.

Hot topics in the area include the use of force feedback (see 2.3.4 above), increased accuracy of modelling of soft tissue, and the role of auditory feedback.
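Soft-tissue behaviour in such trainers is often approximated with mass-spring meshes rather than full continuum mechanics. The following is a minimal sketch of one explicit integration step for a toy mesh, under assumed stiffness, damping and time-step values; a production simulator would need far denser meshes, careful tuning and usually implicit integration.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])

def mass_spring_step(x, v, springs, masses, dt=1e-3, damping=0.98):
    """Advance node positions x and velocities v by one explicit Euler step.

    x, v: (N, 3) arrays of node positions and velocities (metres, m/s).
    springs: list of (i, j, rest_length, stiffness) tuples.
    masses: (N,) array of node masses; np.inf pins a node in place.
    """
    forces = np.zeros_like(x)
    for i, j, rest, k in springs:
        d = x[j] - x[i]
        length = np.linalg.norm(d)
        if length < 1e-9:
            continue
        f = k * (length - rest) * (d / length)   # Hooke's law along the spring
        forces[i] += f
        forces[j] -= f
    movable = np.isfinite(masses)
    accel = forces[movable] / masses[movable, None] + GRAVITY
    v[movable] = damping * (v[movable] + dt * accel)
    x[movable] += dt * v[movable]
    return x, v

# A small chain of tissue nodes pinned at both ends, sagging under gravity.
x = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.2, 0.0, 0.0], [0.3, 0.0, 0.0]])
v = np.zeros_like(x)
springs = [(0, 1, 0.1, 200.0), (1, 2, 0.1, 200.0), (2, 3, 0.1, 200.0)]
masses = np.array([np.inf, 0.05, 0.05, np.inf])
for _ in range(1000):
    x, v = mass_spring_step(x, v, springs, masses)
```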
For simple operations like suturing and biopsy needle placement, VR is effective, but perhaps overkill for training skills that can easily and cheaply be acquired in other ways. The most useful and tractable areas for the development of training simulators are the various techniques of endoscopic surgery in widespread use today. It is relatively easy to reproduce in VR the restricted field of view and limited tactile feedback of endoscopic surgery. It is much more problematic to reproduce open surgery techniques realistically. For complex anatomical structures, this is definitely not yet possible.

Karlsruhe Endoscopic Surgery Trainer

The pictures above illustrate both the value of simulators for training procedures and their current weaknesses in terms of realism. To realistically simulate an operation, the method of interaction should be the same as in the real case (as with flight simulators). When this is not the case, the VR can serve as an anatomy educational system rather than a training simulation.

One way of increasing the reality of interaction is to combine VR with physical models, as illustrated in the Gatech simulators for endoscopy and eye surgery, and the Penn State University bronchoscopy simulator (see below). These systems focus on training the surgeon in the use of particular medical devices, rather than on training a better awareness of general or specific patient anatomy.

Gatech: Endoscopic Surgical Simulator

Gatech: Eye Surgery Simulator

Bronchoscope Simulator from Penn State University Hospital at Hershey

An example of an anatomy educational system is the EVL eye (shown below) from the University of Illinois.
Since the VR is immersive and based around the CAVE, it cannot be said to duplicate the interaction methods of real eye surgery (surgeons cannot get physically inside eyes), and so it is not a training simulator, unlike the Georgia Tech system above.

The EVL eye, from the Electronic Visualisation Lab, University of Illinois at Chicago

The EVL eye used by a group in the CAVE

More realistic in terms of interaction, the Responsive Workbench is another candidate for anatomy teaching (see below). As with CAVE-based applications, a shared VR enhances the potential for collaborative learning.
The Responsive Workbench from GMD in Germany

The most technologically challenging area of simulator training is for highly specialised aspects of life-critical operations such as brain surgery. The Johns Hopkins/KRDL skull-base surgery simulator for training aneurysm clipping (see below) is one example. The interaction is entirely with the VR itself.

JHU/KRDL Skull-base Surgery Simulator

Researchers at the University of California San Diego Applied Technology Lab have developed an interesting Anatomy Lesson Prototype [http://cybermed.ucsd.edu/AT/AT-anat.html]. They point out that the main challenges they identified from talking to medical faculty and students included visualising potential spaces; studying relatively inaccessible areas; tracing layers and linings; establishing external landmarks for deep structures; and cogently presenting embryological origins.
Correlating gross anatomy with various diagnostic imaging modalities, and portraying complex physiological processes using virtual representation, were also considered highly valuable goals.

Relevant Web sites: VR and Education; VR in Surgical Education

3.4 Preoperative planning

Simulators such as the JHU/KRDL Skull-base Surgery Simulator blur into systems for preoperative planning. Planning systems also sometimes blend with augmented reality, since the planning is on an actual, particular patient, so that physical reality (the patient) and the VR naturally come together in planning. The aim in such planning is to study patient data before surgery and so plan the best way to carry out that surgery.

Preoperative planning must:
Use the actual data of the patient to be operated upon, not an idealised model, atlas, or Visible Human dataset
Be fast
Be accurate
Be multimodal (combining different data sources) to show blood vessels, soft tissue, bone, etc. (a simple blend of coregistered volumes is sketched below)
Convey as much information as possible
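Assuming the different modalities have already been coregistered and resampled onto a common grid, conveying them together can be as simple as a per-voxel blend. The sketch below mixes a CT-derived bone channel with MR-derived soft tissue into one display volume; the array names, threshold and weighting are illustrative only.

```python
import numpy as np

def blend_volumes(ct, mr, bone_threshold=300, bone_weight=0.7):
    """Blend coregistered CT and MR volumes (same shape) into one display volume.

    ct: CT volume in Hounsfield-like units; values above bone_threshold are
        treated as bone and emphasised.
    mr: MR volume, already rescaled to [0, 1], carrying the soft-tissue detail.
    Returns a float volume in [0, 1] suitable for rendering.
    """
    ct_norm = np.clip((ct - ct.min()) / (np.ptp(ct) + 1e-9), 0.0, 1.0)
    blended = np.where(ct > bone_threshold,
                       bone_weight * ct_norm + (1 - bone_weight) * mr,
                       mr)
    return blended

# Illustrative use with small random volumes standing in for real scans.
ct = np.random.uniform(-1000, 1500, size=(64, 64, 64))
mr = np.random.uniform(0.0, 1.0, size=(64, 64, 64))
display = blend_volumes(ct, mr)
```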
Radionics' Stereoplan - a 'pure' planning system

The aim of Stereoplan is to allow surgeons to examine patient data as fully as possible and evaluate possible routes for intervention. The system then provides the coordinates for the stereotactic frames that are standardly used to guide the route for brain surgery.

Similarly, the KRDL Brainbench, built around the Virtual Workbench, aims to help in the planning of stereotactic frame-based functional neurosurgery (see below).

KRDL Brainbench for stereotactic frame-based neurosurgery planning
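The end product of both planners is a target expressed in the coordinate system of the stereotactic frame itself. A minimal sketch of that last conversion step is given below, assuming the image-to-frame transform has already been fitted from the frame's localiser fiducials; the matrix and target values are illustrative.

```python
import numpy as np

def voxel_to_frame(voxel_index, voxel_spacing, image_to_frame):
    """Convert a planned target voxel index into stereotactic frame coordinates (mm).

    voxel_index: (i, j, k) index of the planned target in the scan.
    voxel_spacing: (dx, dy, dz) voxel size in mm.
    image_to_frame: 4x4 homogeneous transform from scanner mm coordinates to
        the frame coordinate system, e.g. fitted from the localiser fiducials.
    """
    p_image = np.append(np.asarray(voxel_index) * np.asarray(voxel_spacing), 1.0)
    p_frame = image_to_frame @ p_image
    return p_frame[:3]

# Illustrative transform: a small rotation about z plus a translation.
theta = np.deg2rad(3.0)
image_to_frame = np.array([
    [np.cos(theta), -np.sin(theta), 0.0, 100.0],
    [np.sin(theta),  np.cos(theta), 0.0, 100.0],
    [0.0,            0.0,           1.0,  60.0],
    [0.0,            0.0,           0.0,   1.0],
])
target_frame_mm = voxel_to_frame((128, 96, 40), (0.9, 0.9, 1.5), image_to_frame)
```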
Combined neurosurgery planning and augmented reality from Harvard Medical School

In preoperative planning the interaction method need not be realistic, and generally is not. The main focus is on exploring the patient data as fully as possible and evaluating possible intervention procedures against that data, not on reproducing the actual operation. The University of Virginia "Props" interface illustrates this (below). A doll's head is used in the interaction with the dataset, without any suggestion that the surgeon will ever interact with a patient's head in quite this way.

University of Virginia "Props" interface used in preoperative planning

KRDL VIVIAN: the Virtual Workbench used for stereotactic tumor neurosurgery planning

Of course, the simulation must be accurate. Given this, techniques developed for planning can sometimes be applied to the prediction of outcomes of interventions, as in bone replacements or reconstructive plastic surgery. Such simulations can also help in training, and in communication between doctors and patients (and their families). An important aspect of such systems for use by medical staff is the design of the tools and how this affects usability. See "Interaction Techniques for a Virtual Workspace".

3.5 Telemedicine and Collaboration
Telemedicine is surprisingly little used today in actual medical practice. According to a recent article (The Economist, February 28th 1998), less than 1 in 1000 radiographs is viewed by a distant, rather than a local, specialist. This is despite the proven ability of telemedicine to save doctors' time and, hence, money (for example, from a recent study in Tromso on teleradiology). Similarly, home visits can be successfully replaced with remote consultations, saving money and increasing aged patients' satisfaction (because they can get more frequent consultations without troublesome travel), but currently only 1 in 2000 home visits is conducted remotely through information technology. Telemedicine is successfully used in military settings, where normal legal and economic considerations do not apply.

One promising area where VR could make a contribution is remote diagnostics, where two surgeons can confer on a particular case, each experiencing the same 3D visualisation although located in different places. The other, often discussed, main applications are for remote operations, either through robotic surgery or through assistance to another remote surgeon. The big problem here is network delay, since almost immediate interactivity is required. Even the small delay introduced by the use of satellite communication is unacceptable in remote surgery. Talk of remote operations carried out on space crew in deep space, or even merely on Mars, is pure science fiction: the signal round trip alone takes minutes, so genuinely interactive control would require communication faster than light.
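The arithmetic behind this point is easy to check: at the speed of light, a geostationary satellite relay adds roughly half a second of round-trip delay before any network processing, and the Mars round trip is measured in tens of minutes.

```python
C_KM_PER_S = 299_792.458          # speed of light in vacuum

def round_trip_seconds(one_way_km):
    return 2 * one_way_km / C_KM_PER_S

# Geostationary satellite relay: roughly 35,786 km up and 35,786 km back down
# in each direction, so about 0.48 s round trip before any other delays.
print(round_trip_seconds(2 * 35_786))

# Mars: about 54.6 million km at closest approach, ~401 million km at farthest,
# giving round-trip signal times of roughly 6 to 45 minutes.
print(round_trip_seconds(54.6e6) / 60, round_trip_seconds(401e6) / 60)
```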
Robots are used more routinely non-remotely, for precision in carrying out certain procedures, such as hip replacement. The types of operation to which robots are applied in this way are usually high-volume, repeated procedures. As well as improved accuracy, major cost savings can be produced. A relatively new development is to use surgeon-controlled robots to carry out, by key-hole methods, operations which previously required open surgery. VR becomes important here in providing a detailed 3D view to guide the surgeon in carrying out the operation via extremely small robotic instruments. Major operations, such as coronary bypass, can be carried out in this way with significantly reduced trauma and recovery time for the patient. The technical possibility already exists for unsupervised robots to carry out surgery, but much ethical and legal debate and legislation will be needed before this could be put into practice. This survey does not focus primarily on telerobotics, which is itself a large field.

Remote surgeon workstation at the University of Virginia

The two upper images are live video provided via an ATM link. They show a view through the surgical microscope and a room view. The remote surgeon may pan, tilt and zoom the room camera and may move the microscope view with 6 degrees of freedom. The bottom windows respectively show presurgical imaging with functional overlays, volume rendering of the surgical plan, and a snapshot archive taken during surgery.

The Artma Virtual Patient System is an established technology for telesurgery. The SRI Telepresence project is representative of current work in this area (see below).
SRI Telepresence: telerobotics and stereo video interface, surgeon interaction

SRI Telepresence: telerobotics and stereo video interface, patient interaction

3.6 Snapshot of the State of the Art: Conference Report on Scientific and Clinical Tools for Minimally Invasive Therapies, Medicine Meets VR 1998
Much of the research in this area, especially in the USA, is funded by the government for military applications such as remote surgery on the battlefield. The main agencies are DARPA and, lately, NASA (mostly the same projects that previously got funding from DARPA). Progress reports on these projects, many of which have been running for several years, were presented at the conference and included: a "smart" T-shirt that senses the path of the bullet that hit its wearer and monitors his condition and location, so that rescue teams can decide if he is worth rescuing and be prepared, and combat units can knock out the location from which he was attacked; various personal monitoring devices, including a wearable system for astronauts; a Limb Trauma simulator using the PHANToM (by MusculoGraphics, in Boston); a stretcher with monitoring systems; and an enhanced dexterity robot called ParaDex, among others. HT Medical also gets funding from this committee, as do SRI International, Boston Dynamics, the HIT Lab, and others.

During the conference as a whole, a broad selection of current work, both academic and commercial, was presented. There were many endoscopic simulators, for the knee, shoulder, colon, abdomen and so on. All had some force feedback that was not convincing as real tissue (from what doctors said) but apparently helped in training (from what the engineers said). Tactile tissue simulation was one of the key phrases: everybody is trying to figure out how to do it, but I didn't see (or feel) any convincing implementation. Force feedback is the latest craze, but the sensitivity to model subtle gradations just isn't there yet. An interesting alternative is to use sound as feedback. Many atlases of the whole human body (and one of a frog) were also presented. Most used the Visible Human, but others (the Japanese) had their own data sets.

One interesting point raised by the team at SRI is that the key problem in training surgeons is not how to convey the locomotive skills needed to manipulate an endoscope or cut with a scalpel, but how to understand patient anatomy. Training the hands to use an endoscope takes a week or so, but learning to interpret a patient's anatomy takes years. I agree with this assessment, and I think that is where rich interaction capabilities combined with real-time volumetric rendering of multimodal data are crucial.

SRI, of Stanford, have tested their telepresence system with live animals using a 200 metre link. Their results are published in the Journal of Vascular Surgery. Dr. John Hill of SRI presented their first attempts to move towards computer-generated graphics training simulators using their telepresence system. They use a set-up similar to the ISS Virtual Workbench, but with their own interaction devices. They are working on simulating suture of tissue and vessels using an Onyx and 2D texture maps.

Dr. Ramin Shahidi, of Stanford University Medical Center, is working on SGI-based volume rendering for neurosurgery and craniofacial applications. The graphics did not include more than one volume at a time. His presentation was an overview of the use of volume rendering versus surface rendering.
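Volume rendering in this sense composites samples along each viewing ray directly from the voxel data, rather than first extracting polygonal surfaces as in surface rendering. The fragment below is a minimal single-ray sketch of front-to-back emission/absorption compositing; the transfer function and the random volume are purely illustrative.

```python
import numpy as np

def composite_ray(volume, origin, direction, step=0.5, n_steps=400):
    """Front-to-back emission/absorption compositing along one ray.

    volume: 3D scalar array (e.g. normalised CT intensities in [0, 1]).
    origin, direction: ray start and unit direction in voxel coordinates.
    Returns the accumulated grey level and opacity for this ray/pixel.
    """
    color, transmittance = 0.0, 1.0
    pos = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    for _ in range(n_steps):
        idx = np.round(pos).astype(int)
        if np.any(idx < 0) or np.any(idx >= volume.shape):
            break
        s = volume[tuple(idx)]
        alpha = min(1.0, s * 0.05)        # toy transfer function: opacity from intensity
        color += transmittance * alpha * s
        transmittance *= (1.0 - alpha)
        if transmittance < 0.01:          # early ray termination
            break
        pos += step * direction
    return color, 1.0 - transmittance

# Illustrative use: one ray through a random 64^3 volume.
vol = np.random.rand(64, 64, 64)
print(composite_ray(vol, origin=(32, 32, 0), direction=(0, 0, 1)))
```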
NASA-Ames and Stanford University have created the National Biocomputation Center: Dr. Muriel Ross announced this centre as a resource for collaboration with academics and industry, to promote medical VR. NASA-Ames have an Immersion Workbench (aka Responsive Workbench, aka Immersadesk) and their own visualisation software, and are working on craniofacial "virtual" surgery. It appears that they use polygon meshes for their visualisation.

Dr. Henry Fuchs presented work in progress at UNC that uses depth range finders to reconstruct a surface map of the intestines, to then guide an endoscope for colonoscopy. All this was added to their well-known augmented reality system, and comprises an interestingly novel approach.

HT Medical presented their VR Simulation of Abdominal Trauma Surgery. They use the PHANToM and some "wet" graphics to remove a kidney. They simulate the "steps" taken by the surgeon. First the surgeon cuts the skin, which then opens, revealing the intestines. A wet graphics effect is used, but this looks more like "cling film" wrapped over everything. The intestines moved quite unconvincingly, in an animation that was only slightly under the control of the user (it did not appear that inverse kinematics were attaching the end-point of the intestines to the user's tool). The kidney was removed by simply "reaching into it" and moving it out. The practical value of such a demonstration was not clear, however.

An impressive paper from Wegner and Karron of Computer Aided Surgery Inc. described the use of auditory feedback to guide blind biopsy needle placement. Their audio feedback system generates an error signal in 3D space with respect to a planned needle trajectory. This error signal and the preoperative plan are used to drive a position sonification algorithm which generates appropriate sounds to guide the operator in needle placement. To put it simply, harmonics versus dissonances are used to convey position information accurately along 6-8 dimensions. It is a nice example of a synaesthetic medium - using one modality (sound) where one would normally expect another (touch and/or vision). Their approach has wide applicability.
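The heart of such a system is the error signal itself: the offset of the needle tip from the planned entry-to-target line, which is then mapped onto sound. The sketch below computes that error and a simple consonance/dissonance mapping; the constants are illustrative and are not the published sonification algorithm.

```python
import numpy as np

def trajectory_error(tip, entry, target):
    """Error of the needle tip relative to the planned entry->target line.

    Returns (lateral_mm, along_mm): perpendicular distance from the planned
    path, and progress along it towards the target.
    """
    axis = np.asarray(target, float) - np.asarray(entry, float)
    axis /= np.linalg.norm(axis)
    rel = np.asarray(tip, float) - np.asarray(entry, float)
    along = float(rel @ axis)
    lateral = float(np.linalg.norm(rel - along * axis))
    return lateral, along

def sonify(lateral_mm, base_hz=440.0, max_detune=3.0):
    """Map lateral error to a detuned second tone: on-axis sounds consonant,
    off-axis sounds increasingly dissonant (mapping constants are illustrative)."""
    detune = min(1.0, lateral_mm / 10.0) * max_detune
    return base_hz, base_hz * (1.5 + detune / 12.0)   # a perfect fifth when on axis

lateral, along = trajectory_error(tip=(12.0, 3.0, 40.0),
                                  entry=(10.0, 0.0, 0.0),
                                  target=(10.0, 0.0, 80.0))
print(lateral, along, sonify(lateral))
```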
Myron Krueger is President of Artificial Reality Corporation and a claimant to the title of inventor of VR. The system he described was a training system for dealing with emergencies, where smells of, for example, petrol or the contents of the lower intestine can provide valuable information in a hazardous situation. It has also been claimed that smell can significantly enhance the effectiveness of medical training systems.

Many manufacturers presented their latest demonstrations and products at the conference exhibition.

HT demonstrated CathSim, a simulator that trains nursing students to perform vascular catheterisations. They built a special force feedback device and some simple graphics to provide visual feedback. It was quite good for guiding the needle, but had little or no feedback once inside the skin. This seemed like "technological overkill", since the procedure is easily learned without VR and is not exactly hazardous. They also demonstrated a Flexible Bronchoscopy simulator developed in partnership with pulmonologists and pharmacology experts at Merck & Co. (based on the Visible Human Project). They have a way to track the flexible tip of the endoscope ("a secret", I was told when I asked) and they generate nice 2D texture-mapped graphics of the interior of the throat using an SGI Impact.

Fraunhofer had two demonstrations from their Providence office. TeleInVivo demonstrated a PC software volume renderer (a few seconds per rendering for small window areas) attached to an ultrasound probe. Interventional Ultrasound is a guiding system for biopsy needle insertion using an ultrasound tracking system (not much of an implementation at the moment), so it is the old idea of using ultrasound to guide a biopsy needle. They overlay the ultrasound view with the biopsy needle path, something that UNC demonstrated at SIGGRAPH, but without the expensive head gear. Matthias Wapler, of the IPA branch in Stuttgart, also described a robot for precise endoscopy and neurosurgical navigation. They have not yet developed planning software for their system.

Loral were at the Immersion booth, presenting a training system using the Immersion Corp. force feedback device. The application lets the surgeon guide an endoscope through the nose of a patient. The simulation was "helpful" to surgeons, although it is rather crude and doesn't feel like the real thing.

Prosolvia (the main Swedish VR company) demonstrated a system for Virtual Arthroscopy of the shoulder, developed with the University Hospital of Linköping. They used the Immersion Corp. force-feedback system and their own Oxygen software base. They are interested in collaborating on VR medical training systems.

Four demonstrations were shown at the SensAble booth:
The Ophthalmic Surgical Simulator. This project combines an N-Vision US$25,000 stereo display (binoculars with 1280x1024 resolution; there is a cheaper version for VGA graphics at US$15,000) with the PHANToM, and a nice simulation of the feel of an eye; the computer platform is Intergraph. Since the PHANToM doesn't provide torque feedback, I didn't really appreciate the usefulness of the feedback system while cutting around the cornea. However, prodding the eye produced convincing force feedback.
MusculoGraphics surgery simulation solutions. Their Limb Trauma simulator did not have force feedback, so the PHANToM was used as a 3D pointer. The simulation consisted in picking up a bullet and stopping the bleeding of a blood vessel. I thought the system was unrealistic and of limited usefulness. Their IV catheter insertion system did have force feedback, and was quite convincing.
Spine Biopsy Simulator, by the Georgetown University Medical School, for educational use. The aim is to mimic an actual spine biopsy procedure and improve overall learning by students. Unfortunately, their demo wasn't working.

Virtual Presence presented two useful tools:
VolWin, a volume rendering package (US$700) for the PC, based on the Voxar API. The performance was really good, running on a plain PC: a 256x256x256 volume was rendered at some 5-6 fps, with some aliasing effects but basically usable quality.
A package that tests the surgeon's performance using the Immersion Corp. laparoscopy device. No fancy graphics, the idea being to measure performance in hitting targets. An excellent simple idea for laparoscopy training.

Gold Standard Multimedia have produced a CD-ROM with a segmentation of the Visible Male. The package volume renders the views and structures chosen, on a PC platform.

Sense8's medical customers are the National Centre for Biocomputation (NASA, Stanford University), Rutgers University, the Center for Neuro-Science, and the Iowa School of Dentistry. A knee simulator was presented; unfortunately, it broke early in the conference.

Vista Medical Technologies had a good head-mounted display to substitute for the microscope. It is not head-tracked, but it allows the surgeon to look through the microscope and outside it. It also allows picture-in-picture, so that an endoscope can be used to supplement the microscope.

There was a nice demonstration of 3D sonification from Lake Acoustics of Australia, who were also involved in the 3D sound feedback for biopsy needle placement described briefly above (the paper by Wegner and Karron). Using their kit, it is very simple to place sounds in a three-dimensional landscape surrounding the body, to the front (as with normal stereo) and to the back (as with cinema surround sound), but using only headphones. They were giving away diskettes containing an impressive demonstration of this system.

3.7 Physical and Mental Health and Rehabilitation

It is clear that this is one of the medical areas where VR can most immediately and successfully be applied today. This is partly because the technical demands, particularly in terms of detailed visualisation and interactivity, are actually less stringent than in some other areas, such as surgery. Often these systems simulate the physical environment - a world of rooms, doors, buildings and so on - much of which consists of simple shapes that are far easier to model than the irregular and contoured surfaces of internal organs. Such objects also tend to be solid, so the physics to be understood and modelled is much simpler, and the complexity of interacting with them is much less.

Main application areas:
Mental health therapy: fear of heights, fear of flying, various other phobias; eating disorders; stress control; Irritable Bowel Syndrome; autism.
Patient rehabilitation: treadmills, wheelchairs, people with disabilities (CACM, August 1997), Parkinson's disease, and stroke therapy with physiological feedback.
Exploration and communication of unusual mental and body states is also a potential application area.

Examples:
A Treatment of Akinesia Using Virtual Images
1998 "Technology and Persons with Disabilities" Conference
Acrophobia Virtual Environment - Final Report
Autism and VR
The Use of Virtual Reality in the Treatment of Phobias
VR and Disabilities
VR Exposure Therapy
VR in Eating Disorders
VR in Stroke Disorders

3.7.1 Snapshot of the State of the Art: Conference Report on Mental Health session, Medicine Meets VR 1998

Topics covered at this year's conference included treatment of phobias, psychological assessment, and cognitive rehabilitation. The session also provided an opportunity for the launch of the new CyberPsychology and Behavior journal, the first number of which includes a useful summary of the use of VR as a therapeutic tool.

Brenda Wiederhold presented a good paper on using VR to go beyond the standard "imaginal" training of phobic patients. The advantages of VR are, first, that fear can be effectively activated (which is necessary to bring about change) but can be controlled (too much fear reinforces the phobia) and, second, that physiological measures can be used to control the display. One simple measure of anxiety, first used by Jung, is a drop in skin resistance. Similar work on claustrophobia and fear of heights was described by Bulligen of the University of Basle.
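Closing the loop between a physiological signal and the display can be sketched very simply: a skin-conductance reading (conductance rises as skin resistance falls) gates how far the exposure is allowed to progress. The thresholds, step sizes and the sensor stub below are all illustrative, not any published protocol.

```python
import random

def read_skin_conductance():
    """Placeholder for a galvanic skin response reading, in microsiemens.
    (Conductance rises with arousal; skin resistance, as in the text, falls.)"""
    return random.uniform(2.0, 12.0)

def update_exposure(level, conductance, calm=4.0, aroused=9.0, step=0.05):
    """Raise the exposure level (e.g. virtual height, or how close the feared
    object comes) while the patient stays calm; back off if arousal climbs."""
    if conductance < calm:
        level += step
    elif conductance > aroused:
        level -= 2 * step            # retreat faster than we advance
    return min(1.0, max(0.0, level))

level = 0.0
for _ in range(60):                  # e.g. one sensor reading per second
    level = update_exposure(level, read_skin_conductance())
```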
Another paper on acrophobia (fear of heights), by Huang et al. of the University of Michigan, described comparisons of real and virtual environments for emotional desensitisation, and questioned the need for a high level of realism. Using the CAVE environment, they compared the same views in VR and in reality. See their Web page for views.

A rather pleasant system from Japan, the "Bedside Wellness" system by Ohsuga et al., allows bedridden patients to take a virtual forest walk while lying on their backs in bed. An array of three video screens presents the unfolding view of the forest as the patient gently steps on two foot pedals. There is also 3D sound of birds, streams and wind in the trees. A slot below the central screen delivers a gentle breeze scented with pine to the "walking" patient.

Rizzo, of the University of Southern California, is using VR to give increased ecological validity to standard tests applied to Alzheimer's Disease patients, such as the mental rotation task (where the patient has to decide whether a second figure is a rotated version of an earlier figure, or is different in shape). This Immersadesk application seemed like technological overkill to me. However, a fuller paper by Rizzo et al. in the CyberPsychology and Behavior journal lists several advantages of VR for cognitive and functional assessment and rehabilitation applications:

1. ecologically valid and dynamic testing and training scenarios, difficult to present by other means
2. total control and consistency of administration
3. hierarchical and repetitive stimulus challenges that can be readily varied in complexity, depending on level of performance
4. provision of cueing stimuli or visualisation tactics to help successful performance in an errorless learning paradigm
5. immediate feedback on performance
6. ability to pause for discussion or instruction
7. option of self-guided exploration and independent testing and training
8. modification of sensory presentations and response requirements based on the user's impairments
9. complete performance recording
10. more naturalistic and intuitive performance records for review and analysis by the user
11. a safe, although realistic, environment
12. ability to introduce game-like aspects to enhance motivation for learning
13. low-cost functional training environments

Also on the topic of psychological assessment, Laura Medozzi et al., from Milan, described what seemed to be high-quality work comparing traditional tests with VR-based testing. The case of a patient suffering frontal lobe dysfunction several years after a stroke was used to make the point that traditional tests often fail to reveal deficits that can be identified with VR. This is thought to be due to the nonverbal and immersive realism of VR, compared with the presence of a human examiner in traditional testing, who inadvertently provides surrogate control over higher-order faculties, largely through verbal exchanges.
The same group, in collaboration with workers under David Rose at the University of East London, described the use of VR to aid cognitive rehabilitation.

Joan McComas of the University of Ottawa described a VR system for developing spatial skills in children. She had carried out a four-condition study where the choice of location to move to was either passive or active, as was navigation to that location. The four conditions were: passenger (passive choice/passive movement), navigator (active choice/passive movement), driver (active choice/active movement) and navigated driver (passive choice/active movement). The task was to find things hidden at locations, but without going to the same location twice. Measures were the percentage of correct choices and the visit of first error. It occurred to me that we could use this sort of approach in studies of exploration in 3D information landscapes. A paper by Weniger also struck a chord by comparing spatial learning (maze navigation) with exercise of the executive function (the maze with pictograms) and with the use of orientation skills (navigation of landscapes).

Giuseppe Riva, from the Applied Technology for Psychology Lab at the Istituto Auxologico Italiano in Verbania, also discussed the use of VR for psychological assessment - particularly the development of the Body Image Virtual Reality Scale. Patients choose which virtual body they think matches their own, and which they would prefer to have instead. The difference gives a measure of body image distortion.

Greene and Heeter, of the Michigan State University Communication Technology Lab, described CD-ROMs that contain VR-like stories of cancer sufferers, particularly in relation to coping with pain. Details can be found at [http://www.commtechlab.msu.edu/products/].

An interesting paper by Hunter reported the finding that VR can be very effective in helping burn-recovery patients cope with the pain of treatment. Patients in the VR condition reported significant pain reduction and less time spent thinking about pain.

Pope described the use of a VR system called "Viscereal" to provide physiological feedback. Users could control the flow of blood to their hands, and hence could warm or cool them at will. It has also been found to be effective in permitting conscious control of bowel activity, easing clinically harmless but distressing conditions such as Irritable Bowel Syndrome.

The Woodburys, a husband and wife team from the Puerto Rican Institute of Psychiatry, mused on modern cosmology and the origins of our three-dimensionality. They gave the conference a useful reminder that the 3D world is in our heads, not in the world "out there". Pathological psychological states - especially various psychoses - and altered states of consciousness produced by certain hallucinogenic drugs make this clear, as the world around the experiencer, and his sense of his body and its place in that world, falls apart in typical psychotic panic states. Following Pribram, the Woodburys view the 3D world we know so well as a holographic projection, formed in the brain according to principles established through evolution as aiding survival. While recognising that this world is an illusion, psychiatrists work to restore it in patients whose world has literally collapsed.

Although not mentioned by presenters, one of the audience, Rita Addison, talked about the use of VR to communicate the reality of mental deficits to other, normal people.
Rita has visited the VRLab in Umeå and is well known for her "Detour: Brain Deconstruction Ahead", which reproduces for others the visual problems she has had since a car accident a few years ago.