Contents
Section 1: Introduction
1.1. The three I's of Virtual Reality
1.2. A short history of early Virtual Reality
1.3. Early commercial VR technology
1.4. The components of a VR system
Section 2: Input Devices
2.1. 3D Positional Trackers
2.1.1. Tracker performance parameters
2.1.2. Different types of trackers
2.2. Navigation and Manipulation Interfaces
2.3. Gesture interfaces
Section 3: Output Devices
3.1. Graphics Display
3.2. Haptic Feedback
3.2.1. Tactile feedback interfaces
3.2.2. Force feedback interfaces
Section 4: Computer Architecture for VR
4.1. The Rendering Pipeline
4.2. Haptic Rendering Pipeline
Section 5: Modelling
5.1. Geometric Modelling
5.1.1. Virtual object shape
5.1.2. Object visual appearance
5.2. Kinematic Modelling
5.3. Physical Modelling
5.3.1. Collision detection
5.3.2. Surface deformation
5.3.3. Force computation
5.4. Behaviour Modelling
5.5. Model Management
5.5.1. Level-of-detail management
5.5.2. Cell segmentation
Section 6: VR Programming
6.1. Toolkits and Scene Graphs
6.2. Java 3D
6.2.1. Model geometry and appearance
6.2.2. Java 3D scene graphs
Conclusion
Section 1: Introduction
The scientific community has been working in the field of virtual reality (VR) for
decades, having recognized it as a very powerful human-computer interface. A large number of
publications, TV shows, and conferences have described virtual reality in various and
(sometimes) inconsistent ways. This has led to confusion in the technical literature.
Then what is virtual reality? Let us first describe it in terms of functionality. It is a
simulation in which computer graphics is used to create a realistic-looking world. Moreover,
the synthetic world is not static, but responds to the user’s input (gesture, verbal commands,
etc.). This defines a key feature of virtual reality, which is real-time interactivity. Here real time
means that the computer is able to detect a user’s input and modify the virtual world
instantaneously. People like to see things change on the screen in response to their commands
and become captivated by the simulation.
Definition: Virtual reality is a high-end user-computer interface that involves real-time
simulation and interactions through multiple sensorial channels. These sensorial modalities are
visual, auditory, tactile, smell, and taste.
1.1 The three I’s of Virtual Reality
Interactivity: Anybody who doubts the spellbinding power of interactive
graphics has only to look at children playing video games. It was reported that two
youngsters in the United Kingdom continued playing Nintendo even though their house was
on fire!
Immersion: Interactivity and its captivating power contribute to the user's feeling of
immersion, of being part of the action in the virtual world. Virtual reality pushes this
even further by engaging all human sensorial channels. Indeed, users not only see and
manipulate graphics objects in the virtual world, they can also touch and feel them
[Burdea, 1996].
Imagination: The extent to which a virtual reality application is able to solve a
particular problem, that is, the extent to which a simulation performs well, depends
very much on the human imagination. The imagination part of VR refers to the mind's
capacity to perceive nonexistent things.
Fig 1.1 The three I’s of Virtual reality
1.2 A short history of early Virtual Reality
Virtual reality is not a new invention; it dates back more than 40 years. In 1962, U.S.
Patent #3,030,870 was issued to Morton Heilig for his invention entitled Sensorama
Simulator, which was the first virtual reality video arcade. As shown in Figure 1.2, this
early VR workstation had 3D video feedback (obtained by using a pair of side-by-side 35mm
cameras), motion, colour, stereo, aroma, wind effects, and a seat that vibrated. It was thus
possible to simulate a motorcycle ride through New York City.
In 1981, NASA created an LCD-based HMD by reverse engineering a Sony Watchman TV and
adding special optics to focus the image near the eyes. Even today, HMDs use the same
basic principle used by NASA in 1981.
1.3 Early Commercial VR Technology
The first company to sell VR products was VPL Inc., which produced the DataGlove
(fig 1.3) [VPL, 1987]. Its fibre-optic sensors allowed computers to measure finger and
thumb bending, and thus interaction was possible through gestures.
Soon after this, the game company Nintendo introduced the much cheaper PowerGlove for
its gaming console. It used ultrasonic sensors to measure wrist position relative to the
screen and conductive ink flex sensors to measure finger bending.
The first commercial head-mounted displays, called EyePhones, were introduced by VPL
in the 1980s. These HMDs used LCD displays to produce a stereo image, but at extremely
low resolution (360×240 pixels). In early 1991, Division Ltd. introduced the first
integrated commercial VR workstation. It had a stereo display on an HMD, 3D sound, hand
tracking, and gesture recognition.
Fig 1.2 Sensorama Fig 1.3 VPL DataGlove
1.4 The Components of a VR System
A VR system brings together the elements detailed in the sections that follow: input
devices that capture the user's commands, output devices that return visual, auditory,
and haptic feedback, the VR engine that performs the real-time computations, and the
software and databases that model the virtual world.
Section 2: Input Devices
One of the three I's defining virtual reality stands for interactivity. In order to allow
human-computer interaction, it is necessary to use special interfaces designed to input the
user's commands into the computer and to provide feedback from the simulation to the user.
Today's VR interfaces are varied in functionality and purpose, as they address several
human sensorial channels. For example, body motion is measured with 3D position trackers or
sensing suits, hand gestures are digitized by sensing gloves, visual feedback is sent
through stereo HMDs and large-volume displays, virtual sound is computed by 3D sound
generators, etc. Some of these technologies are still under research. The aim of researchers
is to allow faster and more natural ways of interacting with the computer and thus to
overcome the communication bottleneck presented by the keyboard and mouse.
2.1 3D Positional Trackers
Many computer application domains, such as navigation, ballistic missile tracking,
ubiquitous computing, robotics, biomechanics, architecture, computer-aided design (CAD),
education, and VR require knowledge of the real-time position and orientation of moving
objects within some frame of reference [Hightower and Borriello, 2001].
Objects moving in 3D space have 6 degrees of freedom, three for translations and three
for rotations. If a Cartesian coordinate system is attached to the moving object (as illustrated in
Fig 2.1), then its translations are along the X, Y, and Z axes. Object rotations about these axes
are called yaw, pitch, and roll, respectively.
Fig 2.1 System of coordinates of moving 3D object
Definition: The special-purpose hardware used in VR to
measure the real-time change in a 3D object's position and
orientation is called a tracker.
VR applications typically measure the motion of the
user’s head, hands or limbs for the purpose of view control,
locomotion, and object manipulation. In the case of the head-mounted display illustrated in
figure 2.2, the tracker receiver is placed on the user's head, so when the posture of the head
changes, so does the position of the receiver. The user's head motion is sampled by an
electronic unit and sent to a host computer (in this case a graphics workstation). The computer
calculates the new viewing direction of the virtual scene and renders an updated image.
Another VR sensorial modality that uses tracker information is 3D sound, which in figure
2.2 is presented through headphones. Tracker data allow the computer to collocate sound
sources with virtual objects the user sees in the simulation. This helps increase the simulation
realism and the user’s feeling of immersion in the synthetic world.
2.1.1 Tracker performance parameters
All 3D trackers, regardless of the technology they use, share a number of very important
performance parameters, such as accuracy, jitter, drift, and latency. These are illustrated in
figure 2.3.
Definition: Tracker accuracy represents the difference between the object's actual position and
that reported by tracker measurements.
Fig 2.3 Tracker performance parameters: a) accuracy b) jitter c) drift d) latency
The more accurate a tracker, the smaller this difference is and the better the simulation
follows the real user's actions. Accuracy is given separately for tracking translation
(millimetres) and rotation (degrees). Accuracy is typically not constant and degrades with
distance from the origin of reference of the system of coordinates. The distance at which
accuracy is acceptable defines the tracker operating range, or working envelope. Accuracy
should not be confused with resolution, which is the granularity, or minimum change in the
tracked object's 3D position, that the sensor can detect. The sphere of repeatability is the
envelope that encloses repeated measurements of a stationary real object's position.
Repeatability depends on tracker jitter.
Definition: Tracker jitter represents the change in tracker output when the tracked object is
stationary.
A noisy tracker makes accurate measurements difficult. Just like accuracy, jitter is not
constant over the tracker work envelope, and is influenced by environmental conditions in the
tracker’s vicinity.
Definition: Tracker drift is the steady increase in tracker error with time.
The output of a tracker with drift measuring the position of a stationary object is shown
in figure 2.3c. As time passes, the tracker's inaccuracy grows, which makes its data useless.
Drift needs to be controlled by periodically zeroing it using a secondary tracker that does not
have drift.
Definition: Latency is the time delay between action and result. In the case of the 3D tracker,
latency is the time between the changes in object position/orientation and the time the sensor
detects this change.
Minimal latency is desired, since large latencies have a serious negative effect on the
simulation and can induce "simulation sickness". Latency can be minimized by:
1. Synchronizing the measurement, communication, rendering, and display loops; this method
is called generation lock, or 'genlock'.
2. Using faster communication lines.
3. Using a high update (sampling) rate.
Definition: Tracker update rate represents the number of measurements that the tracker
reports every second.
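To make these parameters concrete, the following plain-Java sketch (an illustrative example,
not from the original text; the Sample record and method names are assumptions) estimates
jitter and update rate from a log of reports taken while the tracked object is stationary:

import java.util.List;

public class TrackerStats {

    /* One tracker report: position in millimetres, timestamp in milliseconds. */
    public record Sample(double x, double y, double z, long timeMs) {}

    /* Jitter: RMS deviation of the reports from their mean position,
       computed over samples of a stationary object. */
    public static double jitterMm(List<Sample> s) {
        double mx = s.stream().mapToDouble(Sample::x).average().orElse(0);
        double my = s.stream().mapToDouble(Sample::y).average().orElse(0);
        double mz = s.stream().mapToDouble(Sample::z).average().orElse(0);
        double sumSq = s.stream()
            .mapToDouble(p -> sq(p.x() - mx) + sq(p.y() - my) + sq(p.z() - mz))
            .sum();
        return Math.sqrt(sumSq / s.size());
    }

    /* Update rate: number of reports per second over the logging interval. */
    public static double updateRateHz(List<Sample> s) {
        long spanMs = s.get(s.size() - 1).timeMs() - s.get(0).timeMs();
        return 1000.0 * (s.size() - 1) / spanMs;
    }

    private static double sq(double v) { return v * v; }
}

Accuracy would be computed the same way, but against the object's known true position rather
than the sample mean.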
2.1.2 Different types of trackers
1. Mechanical tracker: A mechanical tracker consists of a serial or parallel
kinematic structure composed of links interconnected by sensorized joints
(see Figure 2.4).
2. Magnetic tracker: A magnetic tracker is a noncontact position measurement
device that uses magnetic fields produced by a stationary transmitter to
determine the real-time position of a moving receiver element. There are two
types: AC and DC.
3. Ultrasonic tracker: An ultrasonic tracker is a noncontact position measurement
device that uses ultrasonic signals produced by a stationary transmitter to
determine the real-time position of a moving receiver element.
4. Optical tracker: An optical tracker is a noncontact position measurement device
that uses optical sensing to determine the real-time position of objects.
5. Hybrid inertial tracker: A hybrid tracker is a system that utilizes two or more
position measurement technologies to track objects better than any single
technology would allow.
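To illustrate the hybrid idea, here is a minimal one-dimensional complementary filter, a
common way (assumed here, not tied to any particular product) to combine a fast but drifting
inertial estimate with slower drift-free fixes from, say, an ultrasonic or optical tracker:

public class HybridTracker1D {
    private double position;    // fused position estimate
    private final double gain;  // 0..1, weight given to each absolute fix

    public HybridTracker1D(double initialPosition, double gain) {
        this.position = initialPosition;
        this.gain = gain;
    }

    /* Called at the high inertial rate: integrate velocity over dt seconds.
       This is fast but accumulates drift. */
    public void predict(double velocity, double dt) {
        position += velocity * dt;
    }

    /* Called whenever a slower, drift-free measurement arrives:
       pull the estimate back toward the absolute fix. */
    public void correct(double absoluteFix) {
        position += gain * (absoluteFix - position);
    }

    public double position() { return position; }
}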
Figure 2.4 Mechanical tracker Figure 2.5 Optical tracker
2.2 Navigation and Manipulation Interfaces
Definition: A navigation/manipulation interface is a device that allows the interactive change of
view to the virtual environment and exploration through the selection and manipulation of
virtual objects.
2.2.1 Types of Navigation/Manipulation Interfaces
1. Tracker-Based: Trackers offer more functionality to a VR simulation than simply
measuring the user's real-time position; manipulation devices are integrated within
these trackers.
2. Trackballs: A trackball is a sensorized cylinder that measures the three forces and
three torques applied by the user's hand on a compliant element.
3. 3D probes: Intuitive and inexpensive devices that allow either absolute or relative
position control of objects.
2.3 Gesture interfaces
Definition: Gesture interfaces are devices that measure the real-time position of the user's
fingers and wrists in order to allow natural, gesture-recognition-based interaction with the
virtual world.
Devices used: Pinch Glove, 5DT Data Glove, Didjiglove, CyberGlove
Section 3: Output Devices
Now we look at special hardware designed to provide feedback from the simulation in
response to the input. The sensorial channels fed back by these interfaces are sight, sound, and
touch.
3.1 Graphics Display
Definition: A graphics display is a computer interface that presents synthetic world images to
one or several users interacting with the virtual world.
Types of displays:
1. Personal Graphics Displays: A graphics display that outputs a virtual scene destined to be
viewed by a single user.
a. Head-Mounted Displays (HMDs) (fig 3.1)
b. Hand-Supported Displays (HSDs)
c. Floor-Supported Displays (FSDs)
d. Desk-Supported Displays (DSDs)
2. Large-Volume Displays: Graphics displays that allow several users located in close
proximity to simultaneously view images of the virtual world.
a. Monitor-Based (fig 3.2)
b. Projector-Based
Fig 3.1 Head-Mounted Display Fig 3.2 Panoramic Display
3.2 Haptic Feedback
Definition: Touch feedback conveys real-time information on contact surface geometry, virtual
object surface roughness, slippage, and temperature. It does not actively resist the user's
contact motion and cannot stop the user from moving through a virtual surface.
Definition: Force feedback provides real-time information on virtual object surface compliance,
object weight, and inertia. It actively resists the user's contact motion and can stop it.
3.2.1 Tactile feedback interfaces
These devices stimulate the skin's receptors in many ways, ranging from air blows and jets,
electrical impulses, vibrations, and micro-pin arrays to direct stimulation and functional
neuromuscular stimulation.
Devices: Tactile mouse [Rosenberg and Martin, 2001]
CyberTouch Glove [Immersion Co., 2000] (fig 3.3)
Temperature feedback glove
Fig. 3.3 CyberTouch Glove
3.2.2 Force Feedback Interfaces
These devices provide substantial force to oppose the user's motion, in accordance with the
compliance of objects in the virtual world. This implies larger actuators, heavier structures,
greater complexity, and greater cost.
An important characteristic of force feedback is mechanical bandwidth.
Definition: The mechanical bandwidth of a force feedback interface represents the frequency
of force and torque refreshes as felt by the user (through finger attachments, handles, gimbals,
etc.).
Devices: Force Feedback Joystick, Phantom Arm, CyberGrasp Glove, CyberForce Glove
Fig 3.4 CyberGrasp force feedback glove Fig 3.5 CyberForce force feedback system
Section 4: Computer Architecture for VR
Now we look at computing hardware supporting such real-time interaction, which we
call the “VR Engine.” This term is an abstraction, which corresponds to various physical
hardware configurations, from a single computer to many networked computers supporting a
given simulation.
Definition: The VR Engine is a key component of any VR system; it reads the input devices,
accesses task-dependent databases, performs the required real-time computations to update
the state of the virtual world, and feeds the results to the output displays.
During a VR simulation it is impossible to predict all users' actions and store all the
corresponding synthetic world states in memory. Therefore the virtual world is created in real
time. For a smooth simulation, at least 30 frames/second need to be displayed, so the VR
engine needs to recompute the virtual world every 33 msec! This alone is a large computational
load that needs to be handled in parallel with other tasks.
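A minimal sketch of this real-time constraint, with placeholder update() and render() steps
standing in for the actual simulation work:

public class SimulationLoop {
    static final long FRAME_BUDGET_MS = 33; // 1000 ms / 30 frames

    public static void run() throws InterruptedException {
        while (true) {
            long start = System.currentTimeMillis();
            update();   // read inputs, recompute the virtual world state
            render();   // draw the new frame
            long elapsed = System.currentTimeMillis() - start;
            if (elapsed < FRAME_BUDGET_MS) {
                Thread.sleep(FRAME_BUDGET_MS - elapsed); // idle out the rest of the budget
            }
            // if elapsed exceeds the budget, the frame is late and the frame rate drops
        }
    }

    private static void update() { /* placeholder */ }
    private static void render() { /* placeholder */ }
}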
4.1 The Rendering Pipeline
The term rendering is associated with graphics. It represents the process of
converting the 3D geometrical models populating the virtual world into a 2D scene
presented to the user. The term has also been extended to rendering the haptic
feedback of the VR system.
Graphics rendering has three fundamental stages, as illustrated in figure
4.1. The first stage is the application stage, which is done entirely in software by the
CPU. It reads the world geometry database as well as the user's input mediated
by the input devices. In response to the user's input, the application stage may
change the view of the simulation, change the orientation of virtual objects, or
create/destroy objects.
The application stage results are fed to the geometry stage, which can be
implemented in either hardware or software. This stage consists of model
transformations, lighting computations, scene projection, clipping, and mapping.
The last stage in the graphics pipeline is the rasterizing stage, which is done in
hardware in order to gain speed. This stage converts the vertex information
output by the geometry stage into the pixel information needed by the video display.
It also does anti-aliasing in order to smooth out jagged edges, which adds a large
computational load; the stage is therefore performed in parallel.
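The three stages can be summarized structurally as below; the types are illustrative
placeholders, not a real graphics API:

public class RenderingPipeline {
    static class Scene {}          // world geometry plus simulation state
    static class ClipSpaceScene {} // transformed, lit, projected, clipped vertices
    static class Frame {}          // the final pixel image

    /* Application stage (CPU, software): react to input, update objects and view. */
    Scene applicationStage(Scene world, String userInput) { return world; }

    /* Geometry stage (hardware or software): model transformations, lighting,
       projection, clipping, and mapping. */
    ClipSpaceScene geometryStage(Scene s) { return new ClipSpaceScene(); }

    /* Rasterizing stage (hardware): vertices to pixels, with anti-aliasing. */
    Frame rasterizingStage(ClipSpaceScene c) { return new Frame(); }

    /* One frame is the three stages chained in order. */
    Frame renderFrame(Scene world, String input) {
        return rasterizingStage(geometryStage(applicationStage(world, input)));
    }
}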
Fig 4.1 Advanced graphic rendering pipeline
4.2 Haptic Rendering pipeline
VR simulation systems implement additional sensorial modalities, such as haptics, that
need to meet similar real-time constraints. This can be implemented through a multistage
haptic rendering pipeline.
During the first stage of the haptic rendering pipeline the physical characteristics of the
3D objects are loaded from the database. These include surface compliance, smoothness,
weight, surface temperature, etc. The first stage of the pipeline also performs collision
detection to determine which virtual objects collide, if any; only the colliding objects are
passed to the next stage of the pipeline.
The second stage calculates the change in characteristics of the virtual objects based on
various simulation models. The more objects are altered, the greater the computational load
on the pipeline.
The third and the last stage is the haptic texturing stage, which renders the touch and
force feedback components of the simulation to the output devices. This stage is largely
hardware dependent.
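Structurally, the three haptic stages mirror the graphics pipeline; the sketch below is
illustrative only (the HapticObject type and the compliance-based response are assumptions):

import java.util.ArrayList;
import java.util.List;

public class HapticPipeline {
    record HapticObject(double compliance, double smoothness, double weight) {}

    /* Stage 1: load physical characteristics, then keep only colliding
       objects -- the haptic equivalent of clipping. */
    List<HapticObject> collisionStage(List<HapticObject> all) {
        List<HapticObject> touched = new ArrayList<>();
        for (HapticObject o : all) {
            if (collides(o)) touched.add(o);
        }
        return touched;
    }

    /* Stage 2: compute the collision response per touched object. */
    double forceStage(List<HapticObject> touched) {
        double force = 0;
        for (HapticObject o : touched) {
            force += 1.0 / o.compliance(); // placeholder response model
        }
        return force;
    }

    /* Stage 3: haptic texturing -- map the result onto the output device. */
    void texturingStage(double force) { /* send to the force-feedback device */ }

    boolean collides(HapticObject o) { return false; } // placeholder test
}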
Graphics accelerators:
ATI Fire GL2 [2000]
NVIDIA
3Dlabs Wildcat II
Haptics rendering devices:
CyberForce
Temperature feedback glove
Force feedback joysticks
Section 5: Modelling
Another important aspect is the modelling of the virtual world. This means first mapping the
I/O devices to the simulation scene, then developing object databases to populate the world,
which involves modelling object shape, appearance, kinematic constraints, intelligent
behaviour, and physical characteristics. Finally, in order to maintain real-time interaction
with the simulation, the model needs to be optimized during the model management step.
5.1 Geometric Modelling
5.1.1 Virtual Object Shape
The shape of virtual objects is determined by their 3D surface, which can be described
in several ways. The vast majority of virtual objects have their surface composed of triangular
meshes. Triangular meshes are preferred because they use shared vertices and are faster to
render. Another way of representing shape is to use parametric surfaces (fig 5.1).
Fig 5.1 parametric surfaces
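The benefit of shared vertices can be seen in a small Java 3D sketch (an illustrative
example, not from the original text): two triangles forming a square are stored with four
indexed vertices instead of six.

import javax.media.j3d.GeometryArray;
import javax.media.j3d.IndexedTriangleArray;
import javax.vecmath.Point3f;

public class SharedVertexQuad {
    public static IndexedTriangleArray build() {
        /* 4 shared vertices, 6 indices (2 triangles x 3 corners) */
        IndexedTriangleArray quad =
            new IndexedTriangleArray(4, GeometryArray.COORDINATES, 6);
        quad.setCoordinate(0, new Point3f(0.0f, 0.0f, 0.0f));
        quad.setCoordinate(1, new Point3f(1.0f, 0.0f, 0.0f));
        quad.setCoordinate(2, new Point3f(1.0f, 1.0f, 0.0f));
        quad.setCoordinate(3, new Point3f(0.0f, 1.0f, 0.0f));
        /* vertices 0 and 2 are reused by both triangles */
        quad.setCoordinateIndices(0, new int[] {0, 1, 2, 0, 2, 3});
        return quad;
    }
}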
(Diagram: virtual world modelling with a VR authoring tool spans I/O mapping, geometric
modelling, kinematic modelling, physical modelling, intelligent behaviour, and model
management.)
There are several methods by which object surfaces can be constructed:
Using Toolkit Editors
Importing CAD Files
Creating surfaces with a 3D Digitizer
Creating surfaces with a 3D Scanner
Using online 3D Object Databases
5.1.2 Object Visual Appearance
The next step is to illuminate the scene such that the object becomes visible. The
appearance of an object will depend strongly on the type and placement of virtual light sources
as well as on the object’s surface.
Scene Illumination: Local scene illumination treats the interactions between objects and
light sources in isolation, neglecting the interdependences between objects. Global illumination
models the inter-reflections between objects and the shadows they cast, resulting in a more
realistic-looking scene.
Texturing: Texturing is a technique performed in the rasterizing stage of the graphics
pipeline in order to modify the object model's surface properties, such as colour, specular
reflection, or pixel normals.
5.2 Kinematic Modelling
This determines the location of 3D objects with respect to a world system of
coordinates, as well as their motion in the virtual world. Object kinematics is governed by
parent-child hierarchical relations, with the motion of a parent object affecting that of its
children. Homogeneous transformation matrices are used to express object translations,
rotations, and scaling.
Object position is expressed using 3D coordinate systems. Such a system of coordinates is
attached to the object, usually at its centre of gravity, and oriented along the object's axes of
symmetry.
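A minimal Java 3D sketch of such a parent-child hierarchy (the poses are illustrative): any
motion applied to the parent TransformGroup carries the child along with it.

import javax.media.j3d.Transform3D;
import javax.media.j3d.TransformGroup;
import javax.vecmath.Vector3f;

public class KinematicChain {
    public static TransformGroup build() {
        /* parent pose: rotate 45 degrees about Y, then translate along X */
        Transform3D parentPose = new Transform3D();
        parentPose.rotY(Math.PI / 4);
        parentPose.setTranslation(new Vector3f(1.0f, 0.0f, 0.0f));
        TransformGroup parent = new TransformGroup(parentPose);

        /* child pose, expressed relative to the parent's coordinate system */
        Transform3D childOffset = new Transform3D();
        childOffset.setTranslation(new Vector3f(0.0f, 0.5f, 0.0f));
        TransformGroup child = new TransformGroup(childOffset);

        parent.addChild(child); // the child now inherits every parent motion
        return parent;
    }
}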
Fig 5.2 View of 3D world.
5.3 Physical Modelling
The next step in virtual world modelling is the integration of objects' physical
characteristics. These include weight, inertia, surface geometry, compliance, deformation
mode, etc. These features bring more realism to the simulation.
5.3.1 Collision detection
The first stage of haptic rendering is collision detection, which determines whether two
or more objects are in contact with each other. This can be considered a form of haptic
clipping, since only objects that collide are processed by the haptic rendering pipeline.
There are two types of collision detection: approximate and exact.
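Approximate detection is typically done with bounding volumes; the plain-Java sketch below
(an illustrative example) tests two bounding spheres for overlap, a cheap first pass before
any exact, polygon-level check:

public class SphereCollision {
    record Sphere(double cx, double cy, double cz, double r) {}

    /* Two spheres collide when the distance between their centres is at
       most the sum of their radii; comparing squares avoids a square root. */
    static boolean collide(Sphere a, Sphere b) {
        double dx = a.cx() - b.cx();
        double dy = a.cy() - b.cy();
        double dz = a.cz() - b.cz();
        double reach = a.r() + b.r();
        return dx * dx + dy * dy + dz * dz <= reach * reach;
    }
}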
5.3.2 Surface deformation
Collision detection is followed by collision response, which depends on the characteristics of
the virtual objects in contact and on the particular application being developed. Surface
deformation changes the 3D object's geometry interactively and thus needs to be coordinated
with the graphics pipeline.
5.3.3 Force computation
When the user interacts with 3D object surfaces, they should feel the reaction force.
These forces need to be computed by the haptic rendering pipeline and sent through haptic
force feedback devices to the user. Force computation takes into account the type of surface
contact and the kind of surface deformation, as well as the object's physical and kinematic
characteristics. There are three main types of virtual objects (a simple force model is
sketched after the list):
Elastic Virtual objects
Plastic Virtual objects
Rigid Virtual objects
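One common force model, sketched here as an assumption rather than the book's specific
method, is the penalty-based spring-damper: the reaction force grows with penetration depth,
and the stiffness constant distinguishes rigid (high k) from elastic (low k) objects.

public class ContactForce {
    /* penetration: proxy depth inside the surface (m)
       velocity:    penetration velocity (m/s), positive when pushing in
       stiffness:   surface stiffness k (N/m), high for rigid objects
       damping:     damping b (N*s/m)
       returns the reaction force pushing the user out of the surface (N) */
    static double reaction(double penetration, double velocity,
                           double stiffness, double damping) {
        if (penetration <= 0) return 0; // no contact, no force
        return stiffness * penetration + damping * velocity;
    }
}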
5.4 Behaviour Modelling
It is also possible to model object behaviour that is independent of the user’s actions.
This becomes critical in very large simulation environments, when users cannot possibly control
all interactions that are taking place.
Consider the modelling of a virtual office, for example. Such an office could have an
automatic door, a clock, and a desk calendar, as well as furniture. The time displayed by the
clock and the date shown by the calendar should be automatically adjusted by the VR engine.
The virtual door should open when the user walks into the office. All these behaviours have
to be modelled into the objects and the virtual environment.
5.5 Model Management
Definition: Model management combines techniques to help the VR engine render complex
virtual environments at interactive rates without a significant impact on the simulation quality.
5.5.1 Level-of-Detail Management
This technique involves using several versions of the same object, each with a different
polygon count. The reason to do so stems from the realization that the human eye perceives
less and less detail as objects get farther away. It would thus be wasteful to represent
distant objects with a high level of detail.
LOD management can be further classified into static and adaptive.
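Java 3D, discussed in Section 6, supports distance-based LOD switching directly through its
DistanceLOD behavior; in the sketch below the three detail nodes and the switching distances
are illustrative:

import javax.media.j3d.BoundingSphere;
import javax.media.j3d.BranchGroup;
import javax.media.j3d.DistanceLOD;
import javax.media.j3d.Node;
import javax.media.j3d.Switch;
import javax.vecmath.Point3d;
import javax.vecmath.Point3f;

public class LodExample {
    public static BranchGroup build(Node high, Node medium, Node low) {
        Switch detailSwitch = new Switch();
        detailSwitch.setCapability(Switch.ALLOW_SWITCH_WRITE);
        detailSwitch.addChild(high);   // shown closer than 5 m
        detailSwitch.addChild(medium); // between 5 m and 20 m
        detailSwitch.addChild(low);    // beyond 20 m

        /* the behavior picks a child based on distance to the viewer */
        DistanceLOD lod = new DistanceLOD(new float[] {5.0f, 20.0f},
                                          new Point3f(0.0f, 0.0f, 0.0f));
        lod.addSwitch(detailSwitch);
        lod.setSchedulingBounds(new BoundingSphere(new Point3d(), 100.0));

        BranchGroup group = new BranchGroup();
        group.addChild(detailSwitch);
        group.addChild(lod);
        return group;
    }
}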
Fig 5.3 LOD Management
5.5.2 Cell Segmentation
When models are too large to fit into RAM, they need to be rendered in such a way that the
impact of memory swaps on the simulation frame rate is minimized. This involves partitioning
the large model into smaller ones, then rendering those with static or adaptive LOD
management (the partitioning idea is sketched after the list below).
There are two methods:
Automatic cell segmentation
Combined cell, LOD, and Database methods
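The core idea can be sketched as a uniform grid that maps every position to a cell index, so
that only the cells near the viewer need to be resident in memory (an illustrative sketch,
not a specific algorithm from the text):

public class CellGrid {
    private final double cellSize; // edge length of one cubic cell, in metres

    public CellGrid(double cellSize) { this.cellSize = cellSize; }

    /* Integer cell coordinates (ix, iy, iz) containing a world-space point;
       objects in cells far from the viewer's cell can stay on disk. */
    public int[] cellOf(double x, double y, double z) {
        return new int[] {
            (int) Math.floor(x / cellSize),
            (int) Math.floor(y / cellSize),
            (int) Math.floor(z / cellSize)
        };
    }
}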
Fig 5.4 Cell segmentation: A car model
Section 6: VR Programming
The previously discussed topics show how to model the geometry, appearance, physical
properties, and behaviour of virtual objects, as well as their parent-child object hierarchy,
all with an eye to reducing the frame rendering time. These are the basic components of
programming, or authoring, a virtual environment.
Definition: An authoring environment is an application programming interface (API)-based
method of creating a virtual world. A run-time environment subsequently allows the user's
real-time interaction with the authored world.
6.1 Toolkits and Scene Graphs
A toolkit is an extendable library of object-oriented functions designed for VR
simulations. Simulated objects are part of classes and inherit their default attributes, thus
simplifying the programming task. Modern general-purpose toolkits include WorldToolKit
(WTK) [Sense8 Co., 2001], Java 3D [1998], the Virtual Reality Modelling Language (VRML), etc.
A scene graph is a hierarchical organization of the objects in the virtual world, together
with the view to that world. The scene graph is a tree structure, with nodes connected by
branches. The topmost node in the hierarchy is the root, which is the parent of the whole
graph. The external nodes are called leaves; they typically represent visible objects. The
internal nodes represent transformations, which position their child objects in space relative
to the root node or to other objects. Any transformation applied to a given internal node will
affect all its children. Scene graphs are not static; they change to reflect the current state
of the virtual world.
6.2 Java 3D
Java, introduced by Sun Microsystems in the mid-1990s, has become the programming
environment of choice for platform-independent, highly distributed applications. Java 3D is
one of the Java APIs; it was designed for object-oriented programming of interactive 3D
graphics applications. Java 3D uses OpenGL and Direct3D low-level graphics library functions.
6.2.1 Model Geometry and Appearance
An object's 3D shape and appearance are specified by the Java 3D class Shape3D(), which
is an extension of the scene-graph leaf nodes. The specific values within the Shape3D()
class are set by functions called methods, namely setGeometry() and
setAppearance().
Geometries can be constructed from scratch, using points, lines, triangles, quads, arrays, etc.
For example, if a geometry is built using triangle arrays, then
/* Create list of 3D coordinates for vertices */
Point3f[] myCoords = {
    new Point3f(0.0f, 0.0f, 0.0f),
    . . .
};
/* Create list of vertex normals for lighting */
Vector3f[] myNormals = {
    new Vector3f(0.0f, 0.0f, 0.0f),
    . . .
};
/* Create the triangle array */
TriangleArray myTris = new TriangleArray(
    myCoords.length,
    GeometryArray.COORDINATES |
    GeometryArray.NORMALS );
myTris.setCoordinates(0, myCoords);
myTris.setNormals(0, myNormals);
/* Assemble the shape, using a previously defined Appearance myAppear */
Shape3D myShape = new Shape3D(myTris, myAppear);
An alternative way of setting object geometry is to import files in formats such as 3DS,
DXF, NFF, and WRL. The geometry importing is done by methods called loaders. Loaders add
the content of the loaded file to the scene graph as a single object. If the object needs to be
segmented in order to maintain its intrinsic parent-child dependencies, then its parts need to
be accessed individually by subsequent method calls. Let us consider a virtual hand geometry
file Hand.wrl created with VRML. In order to access its parts we write
/* add the file content to the scene graph */
Scene SC = loader.load("Hand.wrl");
BranchGroup Bg = SC.getSceneGroup();
/* access the finger subparts of the loaded model */
Thumb = Bg.getChild(0);
Index = Bg.getChild(1);
Middle = Bg.getChild(2);
Ring = Bg.getChild(3);
Small = Bg.getChild(4);
The object's appearance is specified by the Java 3D Appearance() class. The material
and texture attributes need to be defined first and then grouped to form a new appearance,
as in
Mat = new Material();
Mat.setDiffuseColor(r, g, b);
Mat.setAmbientColor(r, g, b);
Mat.setSpecularColor(r, g, b);
/* import the texture file */
TextLd = new TextureLoader("checkered.jpg", ..., ...);
Tex = TextLd.getTexture();
/* create the appearance and set its material and texture */
Appr = new Appearance();
Appr.setMaterial(Mat);
Appr.setTexture(Tex);
Geom.setAppearance(Appr);
6.2.2 Java 3D Scene Graphs
A Java 3D scene graph is constructed of Node objects in parent-child relationships,
forming a tree structure. The arcs of a tree form no cycles: only one path exists from the
root of a tree to each of the leaves, and therefore there is only one path from the root of a
scene graph to each leaf node.
Each scene graph path in a Java 3D scene graph completely specifies the state
information of its leaf. State information includes the location, orientation, and size of a visual
object.
Each scene graph has a single VirtualUniverse. The VirtualUniverse object has a
list of Locale objects. A Locale object provides a reference point in the virtual universe. Each
Locale object may serve as the root of multiple subgraphs of the scene graph.
A BranchGroup object is the root of a subgraph, or branch graph. There are two different
categories of scene subgraph: the view branch graph and the content branch graph.
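A minimal runnable example (using the Java 3D utility classes; ColorCube serves purely as a
visible leaf) shows a content branch graph being attached to the universe. SimpleUniverse is
a convenience class that constructs the VirtualUniverse, Locale, and view branch graph
automatically:

import com.sun.j3d.utils.geometry.ColorCube;
import com.sun.j3d.utils.universe.SimpleUniverse;
import javax.media.j3d.BranchGroup;

public class HelloUniverse {
    public static void main(String[] args) {
        SimpleUniverse universe = new SimpleUniverse();

        BranchGroup content = new BranchGroup(); // root of the content branch graph
        content.addChild(new ColorCube(0.3));    // a simple visible leaf node
        content.compile();                       // let Java 3D optimize the branch

        universe.getViewingPlatform().setNominalViewingTransform(); // back up the view
        universe.addBranchGraph(content);        // attach the content to the Locale
    }
}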
Scene Graph Hierarchy
o The VirtualUniverse, Locale, Group, and Leaf classes appear in this portion of the
hierarchy. Other than the VirtualUniverse and Locale objects, the rest of a scene
graph is composed of SceneGraphObject objects. SceneGraphObject is the
superclass for nearly every Core and Utility Java 3D class. SceneGraphObject has two
subclasses: Node and NodeComponent. The subclasses of Node provide most of the
objects in the scene graph. A Node object is either a Group node or a Leaf node
object. Group and Leaf are superclasses to a number of subclasses. Here is a quick
look at the Node class, its two subclasses, and the NodeComponent class. After this
background material is covered, the construction of Java 3D programs is explained.
Node Class
o The Node class is an abstract superclass of Group and Leaf classes. The Node
class defines some important common methods for its subclasses. Information
on specific methods is presented in later sections after more background
material is covered. The subclasses of Node compose scene graphs.
Group Class
o The Group class is the superclass used in specifying the location and orientation
of visual objects in the virtual universe. Two of the subclasses of Group are
BranchGroup and TransformGroup. In the graphical representation of the scene
graph, the Group symbols (circles) are often annotated with BG for
BranchGroups, TG for TransformGroups, etc.
Leaf Class
o The Leaf class is the superclass used in specifying the shape, sound, and behavior
of visual objects in the virtual universe. Some of the subclasses of Leaf are
Shape3D, Light, Behavior, and Sound. These objects can have no children but
may reference NodeComponents.
Fig 6.1 Example of Java 3D scene graph Fig 6.2 Scene graph of a cockroach
Conclusion
Useful applications of VR include training in a variety of areas (military, medical, equipment
operation, etc.), education, design evaluation (virtual prototyping), architectural walk-through,
human factors and ergonomic studies, simulation of assembly sequences and maintenance
tasks, assistance for the handicapped, study and treatment of phobias (e.g., fear of height),
entertainment, and much more.
Like all great technologies, there's a monumental duality about it.
Virtual reality technology could represent the next step in the sociological evolution of
humanity: a world where you can do anything and enjoy everything that you cannot even dream
of in the real world, such as driving the latest model of Mercedes without spending any
money, and where every virtual desire of mankind can be satisfied for the cost of pennies.
On the other hand, virtual reality could be the greatest single threat to society. Imagine an
entire modernized civilization leaving the "real" world for the "virtual" one: a nation of
empty streets and empty schools, as families spend their entire days plugged into a virtual
reality machine. Everybody would be living happily in their own world, without tensions and
sorrows, in a world tailored to their own taste.
In conclusion, whether virtual reality acts as a social evolution or a threat to society
depends on the ways it is used. If you enjoy driving a Mercedes only in virtual reality, it
causes a loss to the Mercedes company and, in turn, to the country's economy; if you use it
instead for virtual training systems or in the medical field, it brings clear benefits.
References:
Main Reference: Burdea, G. C., and P. Coiffet, Virtual Reality Technology, 2nd Edition.
Other References:
Burdea, G., 1993, "Virtual Reality Systems and Applications" [short course], Electro '93 International Conference, Edison, NJ.
VPL, 1987, DataGlove Model 2 User's Manual, VPL Research, Redwood City, CA.
Anon., 1998, "3D Navigational and Gesture Devices", VR News, Vol. 7(1), pp. 26-29.
Immersion Co., 2001, "CyberGlove", online at www.immersion.com/products/3d
InterSense, 2000, "IS-900 Precision Motion Tracker", company brochure, InterSense Co.
Foxlin, E., 1998, "Motion Tracking Requirements and Technology", pp. 163-210.
Monkman, G., and P. Taylor, 1993, "Thermal Tactile Sensing", Vol. 9(3), pp. 313-318.
Bishop, G., 1986, "Fast Phong Shading Algorithm", pp. 103-105.
"Introduction to VR", www.csie.nctu.edu.tw/course_vr_2001/doc/VRcourse5.pdf
http://www.oracle.com/technetwork/java/javase/tech/desktop-documentation-jsp-138646.html

More Related Content

What's hot

Presentation on Virtual reality
Presentation on Virtual realityPresentation on Virtual reality
Presentation on Virtual realityMd. Salman Ahmed
 
Virtual Reality and Augmented Reality
Virtual Reality and Augmented RealityVirtual Reality and Augmented Reality
Virtual Reality and Augmented RealityNikitaGour5
 
Augmented Reality & Applications
Augmented Reality & ApplicationsAugmented Reality & Applications
Augmented Reality & ApplicationsJishnu Pradeep
 
Virtual Reality - With Demo Video
Virtual Reality - With Demo VideoVirtual Reality - With Demo Video
Virtual Reality - With Demo VideoNikhil Mhatre
 
virtual reality | latest |best presentation
virtual reality | latest |best presentationvirtual reality | latest |best presentation
virtual reality | latest |best presentationvipin mishra
 
AUGMENTED REALITY Documentation
AUGMENTED REALITY DocumentationAUGMENTED REALITY Documentation
AUGMENTED REALITY DocumentationVenu Gopal
 
Augmented Reality Presentation
Augmented Reality PresentationAugmented Reality Presentation
Augmented Reality PresentationSJSU
 

What's hot (20)

Augmented Reality
Augmented RealityAugmented Reality
Augmented Reality
 
Virtual reality
Virtual realityVirtual reality
Virtual reality
 
Augmented Reality
Augmented RealityAugmented Reality
Augmented Reality
 
Presentation on Virtual reality
Presentation on Virtual realityPresentation on Virtual reality
Presentation on Virtual reality
 
Augmented reality
Augmented realityAugmented reality
Augmented reality
 
Augmented reality ppt
Augmented reality pptAugmented reality ppt
Augmented reality ppt
 
Virtual Reality and Augmented Reality
Virtual Reality and Augmented RealityVirtual Reality and Augmented Reality
Virtual Reality and Augmented Reality
 
Augmented reality
Augmented realityAugmented reality
Augmented reality
 
Augmented Reality & Applications
Augmented Reality & ApplicationsAugmented Reality & Applications
Augmented Reality & Applications
 
Virtual Reality - With Demo Video
Virtual Reality - With Demo VideoVirtual Reality - With Demo Video
Virtual Reality - With Demo Video
 
Augmented reality
Augmented realityAugmented reality
Augmented reality
 
Vr & ar 1
Vr & ar 1Vr & ar 1
Vr & ar 1
 
Augmented reality
Augmented realityAugmented reality
Augmented reality
 
Virtual Reality
Virtual RealityVirtual Reality
Virtual Reality
 
virtual reality | latest |best presentation
virtual reality | latest |best presentationvirtual reality | latest |best presentation
virtual reality | latest |best presentation
 
AUGMENTED REALITY Documentation
AUGMENTED REALITY DocumentationAUGMENTED REALITY Documentation
AUGMENTED REALITY Documentation
 
Augmented Reality Presentation
Augmented Reality PresentationAugmented Reality Presentation
Augmented Reality Presentation
 
Virtual reality
Virtual realityVirtual reality
Virtual reality
 
Augmented Reality
Augmented RealityAugmented Reality
Augmented Reality
 
Virtual reality
Virtual realityVirtual reality
Virtual reality
 

Viewers also liked

Cross-Platform Developement with Unity 3D
Cross-Platform Developement with Unity 3DCross-Platform Developement with Unity 3D
Cross-Platform Developement with Unity 3DMartin Ortner
 
Maya modelling and animation history
Maya modelling and animation historyMaya modelling and animation history
Maya modelling and animation historymhmaill706
 
3 D Maya Introduction
3 D Maya Introduction3 D Maya Introduction
3 D Maya Introductiontsshivshankar
 
3D modelling and animation using Autodesk maya
3D modelling and animation using Autodesk maya3D modelling and animation using Autodesk maya
3D modelling and animation using Autodesk mayaParvesh Taneja
 
Introduction to Unity3D Game Engine
Introduction to Unity3D Game EngineIntroduction to Unity3D Game Engine
Introduction to Unity3D Game EngineMohsen Mirhoseini
 
The Road from Selestat to Frankfurt
The Road from Selestat to FrankfurtThe Road from Selestat to Frankfurt
The Road from Selestat to FrankfurtThe Diesel Driver
 
Healthcare(factory food)
Healthcare(factory food)Healthcare(factory food)
Healthcare(factory food)Romokid1997
 
Eurocities, a european open data network
Eurocities, a european open data network Eurocities, a european open data network
Eurocities, a european open data network liberTIC
 
למידה ניידת - מצגת כנס הדרכה 2012
למידה ניידת - מצגת כנס הדרכה 2012למידה ניידת - מצגת כנס הדרכה 2012
למידה ניידת - מצגת כנס הדרכה 2012Kineo Israel
 
O zi de munca
O zi de muncaO zi de munca
O zi de muncaRobert
 
Financial Solutions in a Troubling Economy
Financial Solutions in a Troubling EconomyFinancial Solutions in a Troubling Economy
Financial Solutions in a Troubling EconomyStuart Little
 
Introduction to Erasmus Open Data
Introduction to Erasmus Open DataIntroduction to Erasmus Open Data
Introduction to Erasmus Open DataliberTIC
 
Onboarding presentation - כיצד לשפר תהליך קליטה וכניסה לתפקיד
Onboarding presentation - כיצד לשפר תהליך קליטה וכניסה לתפקידOnboarding presentation - כיצד לשפר תהליך קליטה וכניסה לתפקיד
Onboarding presentation - כיצד לשפר תהליך קליטה וכניסה לתפקידKineo Israel
 
Time_Machine_Voice
Time_Machine_VoiceTime_Machine_Voice
Time_Machine_VoiceKruszewski
 

Viewers also liked (20)

Cross-Platform Developement with Unity 3D
Cross-Platform Developement with Unity 3DCross-Platform Developement with Unity 3D
Cross-Platform Developement with Unity 3D
 
Maya modelling and animation history
Maya modelling and animation historyMaya modelling and animation history
Maya modelling and animation history
 
Virtual Reality
Virtual RealityVirtual Reality
Virtual Reality
 
3 D Maya Introduction
3 D Maya Introduction3 D Maya Introduction
3 D Maya Introduction
 
Maya To Unity3D
Maya To Unity3DMaya To Unity3D
Maya To Unity3D
 
Intro to maya
Intro to mayaIntro to maya
Intro to maya
 
3D modelling and animation using Autodesk maya
3D modelling and animation using Autodesk maya3D modelling and animation using Autodesk maya
3D modelling and animation using Autodesk maya
 
Introduction to Unity3D Game Engine
Introduction to Unity3D Game EngineIntroduction to Unity3D Game Engine
Introduction to Unity3D Game Engine
 
Unity presentation
Unity presentationUnity presentation
Unity presentation
 
The Road from Selestat to Frankfurt
The Road from Selestat to FrankfurtThe Road from Selestat to Frankfurt
The Road from Selestat to Frankfurt
 
Healthcare(factory food)
Healthcare(factory food)Healthcare(factory food)
Healthcare(factory food)
 
Eurocities, a european open data network
Eurocities, a european open data network Eurocities, a european open data network
Eurocities, a european open data network
 
Wisdom gives life
Wisdom gives lifeWisdom gives life
Wisdom gives life
 
למידה ניידת - מצגת כנס הדרכה 2012
למידה ניידת - מצגת כנס הדרכה 2012למידה ניידת - מצגת כנס הדרכה 2012
למידה ניידת - מצגת כנס הדרכה 2012
 
O zi de munca
O zi de muncaO zi de munca
O zi de munca
 
Setlist Mostly Harmless - 42
Setlist Mostly Harmless - 42Setlist Mostly Harmless - 42
Setlist Mostly Harmless - 42
 
Financial Solutions in a Troubling Economy
Financial Solutions in a Troubling EconomyFinancial Solutions in a Troubling Economy
Financial Solutions in a Troubling Economy
 
Introduction to Erasmus Open Data
Introduction to Erasmus Open DataIntroduction to Erasmus Open Data
Introduction to Erasmus Open Data
 
Onboarding presentation - כיצד לשפר תהליך קליטה וכניסה לתפקיד
Onboarding presentation - כיצד לשפר תהליך קליטה וכניסה לתפקידOnboarding presentation - כיצד לשפר תהליך קליטה וכניסה לתפקיד
Onboarding presentation - כיצד לשפר תהליך קליטה וכניסה לתפקיד
 
Time_Machine_Voice
Time_Machine_VoiceTime_Machine_Voice
Time_Machine_Voice
 

Similar to Virtual Reality

Introduction to Virtual Reality
Introduction to Virtual RealityIntroduction to Virtual Reality
Introduction to Virtual RealityKAVITHADEVICS
 
Hihihihihihihivivivirtual reality.ppt.pptx
Hihihihihihihivivivirtual reality.ppt.pptxHihihihihihihivivivirtual reality.ppt.pptx
Hihihihihihihivivivirtual reality.ppt.pptxfijomiy607
 
Augmented reality
Augmented realityAugmented reality
Augmented realityvivekuniyal
 
virtual reality Barkha manral seminar on augmented reality.ppt
virtual reality Barkha manral seminar on augmented reality.pptvirtual reality Barkha manral seminar on augmented reality.ppt
virtual reality Barkha manral seminar on augmented reality.pptBarkha Manral
 
augmented reality
augmented realityaugmented reality
augmented realityDark Side
 
Augmented Reality By Jaseem Bhutto
Augmented Reality By Jaseem BhuttoAugmented Reality By Jaseem Bhutto
Augmented Reality By Jaseem BhuttoJaseem Bhutto
 
virtual-reality-889-HYcNcWM.pptx
virtual-reality-889-HYcNcWM.pptxvirtual-reality-889-HYcNcWM.pptx
virtual-reality-889-HYcNcWM.pptx19431YASWANTHKUMAR
 
44328856-Augmented-Reality.ppt
44328856-Augmented-Reality.ppt44328856-Augmented-Reality.ppt
44328856-Augmented-Reality.pptAjayPoonia22
 
44328856-Augmented-Reality.ppt
44328856-Augmented-Reality.ppt44328856-Augmented-Reality.ppt
44328856-Augmented-Reality.pptNagulahimasri
 
Elec Virtual Reality PPT.pptx
Elec Virtual Reality PPT.pptxElec Virtual Reality PPT.pptx
Elec Virtual Reality PPT.pptxKalaiselviDevaraj
 

Similar to Virtual Reality (20)

Introduction to Virtual Reality
Introduction to Virtual RealityIntroduction to Virtual Reality
Introduction to Virtual Reality
 
Hihihihihihihivivivirtual reality.ppt.pptx
Hihihihihihihivivivirtual reality.ppt.pptxHihihihihihihivivivirtual reality.ppt.pptx
Hihihihihihihivivivirtual reality.ppt.pptx
 
Virtual reality
Virtual realityVirtual reality
Virtual reality
 
Augmented reality
Augmented realityAugmented reality
Augmented reality
 
virtual reality Barkha manral seminar on augmented reality.ppt
virtual reality Barkha manral seminar on augmented reality.pptvirtual reality Barkha manral seminar on augmented reality.ppt
virtual reality Barkha manral seminar on augmented reality.ppt
 
Augmented reality
Augmented realityAugmented reality
Augmented reality
 
VIRTUAL REALITY
VIRTUAL REALITYVIRTUAL REALITY
VIRTUAL REALITY
 
Virtual Reality
Virtual RealityVirtual Reality
Virtual Reality
 
Unit v
Unit vUnit v
Unit v
 
VIRTUAL REALITY DOCUMENTATION
VIRTUAL REALITY DOCUMENTATION VIRTUAL REALITY DOCUMENTATION
VIRTUAL REALITY DOCUMENTATION
 
augmented reality
augmented realityaugmented reality
augmented reality
 
Virtual Reality
Virtual RealityVirtual Reality
Virtual Reality
 
Virtual Reality(full)
Virtual Reality(full)Virtual Reality(full)
Virtual Reality(full)
 
Augmented reality report
Augmented reality reportAugmented reality report
Augmented reality report
 
Augmented Reality By Jaseem Bhutto
Augmented Reality By Jaseem BhuttoAugmented Reality By Jaseem Bhutto
Augmented Reality By Jaseem Bhutto
 
virtual-reality-889-HYcNcWM.pptx
virtual-reality-889-HYcNcWM.pptxvirtual-reality-889-HYcNcWM.pptx
virtual-reality-889-HYcNcWM.pptx
 
20n05a0418 ppt.pptx
20n05a0418 ppt.pptx20n05a0418 ppt.pptx
20n05a0418 ppt.pptx
 
44328856-Augmented-Reality.ppt
44328856-Augmented-Reality.ppt44328856-Augmented-Reality.ppt
44328856-Augmented-Reality.ppt
 
44328856-Augmented-Reality.ppt
44328856-Augmented-Reality.ppt44328856-Augmented-Reality.ppt
44328856-Augmented-Reality.ppt
 
Elec Virtual Reality PPT.pptx
Elec Virtual Reality PPT.pptxElec Virtual Reality PPT.pptx
Elec Virtual Reality PPT.pptx
 

More from Aditya Sharat

More from Aditya Sharat (16)

Neural networks
Neural networksNeural networks
Neural networks
 
Google apps cloud computing
Google apps cloud computingGoogle apps cloud computing
Google apps cloud computing
 
Deloitte's Cloud Perspectives
Deloitte's Cloud PerspectivesDeloitte's Cloud Perspectives
Deloitte's Cloud Perspectives
 
Number system
Number systemNumber system
Number system
 
Introduction to IT
Introduction to ITIntroduction to IT
Introduction to IT
 
Humanware
HumanwareHumanware
Humanware
 
Generation of computers
Generation of computersGeneration of computers
Generation of computers
 
Flow charts
Flow chartsFlow charts
Flow charts
 
Electronic computer classification
Electronic computer classificationElectronic computer classification
Electronic computer classification
 
Language translators
Language translatorsLanguage translators
Language translators
 
MCS
MCSMCS
MCS
 
Unix shell program training
Unix shell program trainingUnix shell program training
Unix shell program training
 
Railway Management system
Railway Management systemRailway Management system
Railway Management system
 
Mobile communication
Mobile communicationMobile communication
Mobile communication
 
Conducting polymers
Conducting polymersConducting polymers
Conducting polymers
 
IS95 CDMA Technology
IS95 CDMA TechnologyIS95 CDMA Technology
IS95 CDMA Technology
 

Recently uploaded

Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)lakshayb543
 
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSJoshuaGantuangco2
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Mark Reed
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfTechSoup
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceSamikshaHamane
 
Choosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for ParentsChoosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for Parentsnavabharathschool99
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Celine George
 
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfphamnguyenenglishnb
 
Q4 English4 Week3 PPT Melcnmg-based.pptx
Q4 English4 Week3 PPT Melcnmg-based.pptxQ4 English4 Week3 PPT Melcnmg-based.pptx
Q4 English4 Week3 PPT Melcnmg-based.pptxnelietumpap1
 
How to Add Barcode on PDF Report in Odoo 17
How to Add Barcode on PDF Report in Odoo 17How to Add Barcode on PDF Report in Odoo 17
How to Add Barcode on PDF Report in Odoo 17Celine George
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptxmary850239
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...JhezDiaz1
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxiammrhaywood
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4MiaBumagat1
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
Karra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxKarra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxAshokKarra1
 

Recently uploaded (20)

Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTSGRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
GRADE 4 - SUMMATIVE TEST QUARTER 4 ALL SUBJECTS
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)
 
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdfInclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
Inclusivity Essentials_ Creating Accessible Websites for Nonprofits .pdf
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in Pharmacovigilance
 
Choosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for ParentsChoosing the Right CBSE School A Comprehensive Guide for Parents
Choosing the Right CBSE School A Comprehensive Guide for Parents
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17Difference Between Search & Browse Methods in Odoo 17
Difference Between Search & Browse Methods in Odoo 17
 
OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...OS-operating systems- ch04 (Threads) ...
OS-operating systems- ch04 (Threads) ...
 
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdfAMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
AMERICAN LANGUAGE HUB_Level2_Student'sBook_Answerkey.pdf
 
Q4 English4 Week3 PPT Melcnmg-based.pptx
Q4 English4 Week3 PPT Melcnmg-based.pptxQ4 English4 Week3 PPT Melcnmg-based.pptx
Q4 English4 Week3 PPT Melcnmg-based.pptx
 
Raw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptxRaw materials used in Herbal Cosmetics.pptx
Raw materials used in Herbal Cosmetics.pptx
 
How to Add Barcode on PDF Report in Odoo 17
How to Add Barcode on PDF Report in Odoo 17How to Add Barcode on PDF Report in Odoo 17
How to Add Barcode on PDF Report in Odoo 17
 
4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx4.18.24 Movement Legacies, Reflection, and Review.pptx
4.18.24 Movement Legacies, Reflection, and Review.pptx
 
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
ENGLISH 7_Q4_LESSON 2_ Employing a Variety of Strategies for Effective Interp...
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4ANG SEKTOR NG agrikultura.pptx QUARTER 4
ANG SEKTOR NG agrikultura.pptx QUARTER 4
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
Karra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptxKarra SKD Conference Presentation Revised.pptx
Karra SKD Conference Presentation Revised.pptx
 

Virtual Reality

  • 1. 1 Contents Section 1: Introduction 1.1. The three I’s of Virtual Reality.........................................................2 1.2. A short history of early Virtual Reality.............................................3 1.3. Early commercial VR technology.....................................................4 1.4. The components of a VR system......................................................4 Section 2: Input Devices 2.1. 3D Positional Trackers.............................................................................5 2.1.1. Tracker performance parameters.................................................6 2.1.2. Different types of trackers............................................................8 2.2. Navigation and Manipulation Interfaces....................................................9 2.3. Gesture interfaces....................................................................................9 Section 3: Output Devices 3.1. Graphics Display.................................................................................10 3.2. Haptic Feedback..................................................................................11 3.2.1. Tactile feedback interfaces 3.2.2. Force Feedback Interfaces Section 4: Computer Architecture for VR 4.1. The Rendering Pipeline.........................................................................13 4.2. Haptic Rendering Pipeline....................................................................14 Section 5: Modelling 5.1. Geometric Modelling............................................................................15 5.1.1. Virtual Object Shape 5.1.2. Object Visual Appearance 5.2. Kinematic Modelling.............................................................................16 5.3. Physical Modelling...............................................................................17 5.3.1. Collision detection 5.3.2. Surface deformation 5.3.3. Force computation 5.4. Behaviour Modelling............................................................................18 5.5. Model Management..............................................................................18 5.5.1. Level-of-Detail Management 5.5.2. Cell Segmentation Section 6: VR Programming 6.1. Toolkits and Scene Graphs...................................................................20 6.2. JAVA 3D..............................................................................................21 6.2.1. Model Geometry and Appearance 6.2.2. Java 3D Scene Graphs Conclusion
simulation and interactions through multiple sensorial channels. These sensorial modalities are visual, auditory, tactile, smell, and taste.

1.1 The three I's of Virtual Reality

Interactivity: Anybody who doubts the spellbinding power of interactive graphics has only to look at children playing video games. It was reported that two youngsters in the United Kingdom continued playing Nintendo even though their house was on fire!

Immersion: Interactivity and its captivating power contribute to the feeling of immersion, of being part of the action in the virtual world that the user experiences. Virtual reality pushes immersion even further by using all human sensorial channels. Indeed, users not only see and manipulate graphic objects in the virtual world, they can also touch and feel them [Burdea, 1996].

Imagination: The extent to which a virtual reality application is able to solve a particular problem, that is, the extent to which a simulation performs well, depends very much on the human imagination. The imagination part of VR refers to the mind's capacity to perceive nonexistent things.

Fig 1.1 The three I's of Virtual Reality
1.2 A short history of early Virtual Reality

Virtual reality is not a new invention; it dates back more than 40 years. In 1962, U.S. Patent #3,030,870 was issued to Morton Heilig for his invention entitled the Sensorama Simulator, the first virtual reality video arcade machine. As shown in Figure 1.2, this early VR workstation had 3D video feedback (obtained by using a pair of side-by-side 35-mm cameras), motion, colour, stereo sound, aroma, wind effects, and a seat that vibrated. It was thus possible to simulate a motorcycle ride through New York City.

In 1981, NASA created an LCD-based HMD: engineers simply reverse-engineered a Sony Watchman TV and added special optics to focus the image near the eyes. Even today, HMDs use the same basic principle NASA used in 1981.

1.3 Early Commercial VR Technology

The first company to sell VR products was VPL Inc., which produced the DataGlove (fig 1.3) [VPL, 1987]. Its fibre-optic sensors allowed computers to measure finger and thumb bending, so that interaction was possible through gestures. Soon after, the game company Nintendo introduced the much cheaper PowerGlove for its gaming console. It used ultrasonic sensors to measure wrist position relative to the screen and conductive-ink flex sensors to measure finger bending.

The first commercial head-mounted displays, called EyePhones, were introduced by VPL in the 1980s. These HMDs used LCD displays to produce a stereo image, but at extremely low resolution (360x240 pixels). In early 1991, Division Ltd. introduced the first integrated commercial VR workstation. It had a stereo display on an HMD, 3D sound, hand tracking, and gesture recognition.

Fig 1.2 Sensorama
Fig 1.3 VPL DataGlove
1.4 The Components of a VR System
Section 2: Input Devices

One of the three I's defining virtual reality stands for interactivity. To allow human-computer interaction it is necessary to use special interfaces designed to input the user's commands into the computer and to provide feedback from the simulation to the user. Today's VR interfaces are varied in functionality and purpose, as they address several human sensorial channels. For example, body motion is measured with 3D position trackers or sensing suits, hand gestures are digitized by sensing gloves, visual feedback is sent through stereo HMDs and large-volume displays, virtual sound is computed by 3D sound generators, etc. Some of these technologies are still under research. The aim of researchers is to allow faster and more natural ways of interacting with the computer and thus overcome the communication bottleneck presented by the keyboard and mouse.

2.1 3D Positional Trackers

Many computer application domains, such as navigation, ballistic missile tracking, ubiquitous computing, robotics, biomechanics, architecture, computer-aided design (CAD), education, and VR, require knowledge of the real-time position and orientation of moving objects within some frame of reference [Hightower and Borriello, 2001]. Objects moving in 3D space have six degrees of freedom: three for translations and three for rotations. If a Cartesian coordinate system is attached to the moving object (as illustrated in Fig 2.1), then its translations are along the X, Y, and Z axes. Rotations about these axes are called yaw, pitch, and roll, respectively.

Fig 2.1 System of coordinates of a moving 3D object

Definition: The special-purpose hardware used in VR to measure the real-time change in a 3D object's position and orientation is called a tracker.

VR applications typically measure the motion of the user's head, hands, or limbs for the purpose of view control, locomotion, and object manipulation. In the case of the head-mounted display illustrated in figure 2.2, the tracker receiver is placed on the user's head, so when the posture of the head changes, so does the position of the receiver. The user's head motion is sampled by an electronic unit and sent to a host computer (in this case a graphics workstation). The computer calculates the new viewing direction of the virtual scene and renders an updated image.
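To make the six degrees of freedom concrete, the sketch below shows one way a single tracker sample could be represented in code. The class name, fields, and units are illustrative assumptions, not part of any real tracker API.

/* Minimal sketch of one 6-DOF tracker sample: three translations and three
   rotations, matching the degrees of freedom described above. */
public class TrackerSample {
    public final double x, y, z;          // translations along X, Y, Z (mm)
    public final double yaw, pitch, roll; // rotations about the axes (degrees)
    public final long timestampMillis;    // when the sample was taken

    public TrackerSample(double x, double y, double z,
                         double yaw, double pitch, double roll,
                         long timestampMillis) {
        this.x = x; this.y = y; this.z = z;
        this.yaw = yaw; this.pitch = pitch; this.roll = roll;
        this.timestampMillis = timestampMillis;
    }
}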
Another VR sensorial modality that uses tracker information is 3D sound, which in figure 2.2 is presented through headphones. Tracker data allow the computer to collocate sound sources with the virtual objects the user sees in the simulation. This helps increase the simulation's realism and the user's feeling of immersion in the synthetic world.

2.1.1 Tracker performance parameters

All 3D trackers, regardless of the technology they use, share a number of very important performance parameters, such as accuracy, jitter, drift, and latency. These are illustrated in figure 2.3.

Definition: Tracker accuracy represents the difference between the object's actual position and that reported by tracker measurements.
Fig 2.3 Tracker performance parameters: a) accuracy, b) jitter, c) drift, d) latency

The more accurate a tracker, the smaller this difference is and the better the simulation follows the real user's actions. Accuracy is given separately for tracking translation (millimetres) and rotation (degrees). Accuracy is typically not constant: it degrades with distance from the origin of reference of the coordinate system. The distance at which accuracy is still acceptable defines the tracker's operating range, or work envelope. Accuracy should not be confused with resolution, which is the granularity, or minimum change, in the tracked object's 3D position that the sensor can detect.

The sphere of repeatability is the envelope which encloses repeated measurements of a stationary real object's position. Repeatability depends on tracker jitter.

Definition: Tracker jitter represents the change in tracker output when the tracked object is stationary.

A noisy tracker makes accurate measurements difficult. Just like accuracy, jitter is not constant over the tracker's work envelope, and it is influenced by environmental conditions in the tracker's vicinity.
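Because jitter is defined as the variation in tracker output for a stationary object, it can be estimated as the standard deviation of repeated position samples along an axis. The following is a minimal sketch under that assumption; the class and method names are illustrative.

/* Estimate tracker jitter along one axis as the standard deviation of
   repeated samples of a stationary target. A sketch only; real trackers
   specify jitter per axis over the whole work envelope. */
public class JitterEstimate {
    public static double jitter(double[] samples) {
        double mean = 0.0;
        for (double s : samples) mean += s;
        mean /= samples.length;
        double var = 0.0;
        for (double s : samples) var += (s - mean) * (s - mean);
        var /= samples.length;
        return Math.sqrt(var);
    }
}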
Definition: Tracker drift is the steady increase in tracker error with time.

The output of a tracker with drift measuring the position of a stationary object is shown in figure 2.3c. As time passes, the tracker's inaccuracy grows, which makes its data useless. Drift needs to be controlled by periodically zeroing it with a secondary tracker that does not have drift.

Definition: Latency is the time delay between action and result. In the case of a 3D tracker, latency is the time between the change in object position/orientation and the time the sensor detects this change.

Minimal latency is desired, since large latencies have a serious negative effect on the simulation and can induce "simulation sickness". Latency can be minimized by:
1. Synchronizing the measurement, communication, rendering, and display loops; this method is called generation lock, or "genlock".
2. Using faster communication lines.
3. Using a high update (sampling) rate.

Definition: Tracker update rate represents the number of measurements that the tracker reports every second.

2.1.2 Different types of trackers

1. Mechanical tracker: A mechanical tracker consists of a serial or parallel kinematic structure composed of links interconnected by sensorized joints (see Figure 2.4).
2. Magnetic tracker: A magnetic tracker is a noncontact position measurement device that uses magnetic fields produced by a stationary transmitter to determine the real-time position of a moving element. There are two types: AC and DC.
3. Ultrasonic tracker: An ultrasonic tracker is a noncontact position measurement device that uses ultrasonic signals produced by a stationary transmitter to determine the real-time position of a moving receiver element.
4. Optical tracker: An optical tracker is a noncontact position measurement device that uses optical sensing to determine the real-time position of objects.
5. Hybrid inertial tracker: A hybrid tracker is a system that utilizes two or more position measurement technologies to track objects better than any single technology would allow. A sketch of the underlying sensor-fusion idea follows this list.
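The drift-correction and hybrid-tracking ideas above are commonly realized by fusing a fast but drifting sensor with a slower drift-free one. Below is a minimal sketch of one such scheme, a complementary filter; the blending factor and names are assumptions chosen for illustration, not the algorithm of any particular commercial tracker.

/* Sketch of drift correction in a hybrid tracker: a fast, drifting
   (e.g., inertial) estimate is continuously pulled toward a slower,
   drift-free absolute measurement. The blend factor is illustrative. */
public class HybridTrackerFilter {
    private double fused;                      // current fused position estimate
    private static final double ALPHA = 0.98;  // trust placed in the fast sensor

    public double update(double fastDelta, double absolutePosition) {
        // Integrate the fast sensor, then correct with the absolute one.
        fused = ALPHA * (fused + fastDelta)
              + (1.0 - ALPHA) * absolutePosition;
        return fused;
    }
}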
Figure 2.4 Mechanical tracker
Figure 2.5 Optical tracker

2.2 Navigation and Manipulation Interfaces

Definition: A navigation/manipulation interface is a device that allows the interactive change of the view to the virtual environment and exploration through the selection and manipulation of virtual objects.

2.2.1 Types of Navigation/Manipulation Interfaces

1. Tracker-based: Trackers offer more functionality to a VR simulation than simply measuring the user's real-time position; manipulation devices can be integrated with them.
2. Trackballs: A trackball is a sensorized cylinder that measures three forces and three torques applied by the user's hand on a compliant element.
3. 3D probes: Intuitive and inexpensive devices which allow either absolute or relative position control of objects.

2.3 Gesture interfaces

Definition: Gesture interfaces are devices that measure the real-time position of the user's fingers and wrists in order to allow natural, gesture-recognition-based interaction with the virtual world.

Devices used: Pinch Glove, 5DT Data Glove, Didjiglove, CyberGlove
Section 3: Output Devices

We now look at special hardware designed to provide feedback from the simulation in response to the user's input. The sensorial channels fed back by these interfaces are sight, sound, and touch.

3.1 Graphics Display

Definition: A graphics display is a computer interface that presents synthetic world images to one or several users interacting with the virtual world.

Types of displays:
1. Personal graphics displays: displays that output a virtual scene destined to be viewed by a single user.
   a. Head-Mounted Displays (HMDs) (fig 3.1)
   b. Hand-Supported Displays (HSDs)
   c. Floor-Supported Displays (FSDs)
   d. Desk-Supported Displays (DSDs)
2. Large-volume displays: displays that allow several users located in close proximity to simultaneously view images of the virtual world.
   a. Monitor-based (fig 3.2)
   b. Projector-based

Fig 3.1 Head-Mounted Display
Fig 3.2 Panoramic Display
3.2 Haptic Feedback

Definition: Touch feedback conveys real-time information on contact surface geometry, virtual object surface roughness, slippage, and temperature. It does not actively resist the user's contact motion and cannot stop the user from moving through virtual surfaces.

Definition: Force feedback provides real-time information on virtual object surface compliance, object weight, and inertia. It actively resists the user's contact motion and can stop it.

3.2.1 Tactile feedback interfaces

These devices stimulate the skin receptors in many ways, ranging from air blows, jets, electric impulses, vibrations, micro-pin arrays, and direct stimulation to functional neuromuscular stimulation.

Devices:
Tactile mouse [Rosenberg and Martin, 2001]
CyberTouch Glove [Immersion Co., 2000] (fig 3.3)
Temperature feedback glove

Fig 3.3 CyberTouch Glove

3.2.2 Force Feedback Interfaces

These devices provide substantial forces that oppose the user's motion, consistent with the compliance of the objects in the virtual world. This implies larger actuators, heavier structures, greater complexity, and higher cost. An important characteristic of force feedback interfaces is mechanical bandwidth.

Definition: The mechanical bandwidth of a force feedback interface represents the frequency of force and torque refreshes, as felt by the user (through finger attachments, handles, gimbals, etc.).

Devices: Force Feedback Joystick, Phantom Arm, CyberGrasp Glove, CyberForce Glove
Fig 3.4 CyberGrasp force feedback glove
Fig 3.5 CyberForce force feedback system
Section 4: Computer Architecture for VR

We now look at the computing hardware supporting real-time interaction, which we call the "VR engine." This term is an abstraction corresponding to various physical hardware configurations, from a single computer to many networked computers supporting a given simulation.

Definition: The VR engine is a key component of any VR system. It reads its input devices, accesses task-dependent databases, performs the required real-time computations to update the state of the virtual world, and feeds the results to the output displays.

During a VR simulation it is impossible to predict all the users' actions and store all the corresponding synthetic world states in memory. Therefore the virtual world is created in real time. For a smooth simulation, at least 30 frames/second need to be displayed, so the VR engine needs to recompute the virtual world every 33 msec (1/30 second)! This alone is a large computational load that needs to be handled in parallel with other tasks.

4.1 The Rendering Pipeline

The term rendering is usually associated with graphics. It represents the process of converting the 3D geometrical models populating the virtual world into a 2D scene presented to the user. The term can also be extended to the haptic feedback of the VR system.

Graphics rendering has three fundamental stages, as illustrated in figure 4.1. The first stage is the application stage, which is done entirely in software by the CPU. It reads the world geometry database as well as the user's input mediated by the input devices. In response to the user's input, the application stage may change the view to the simulation, change the orientation of virtual objects, or create/destroy objects. The application stage results are fed to the geometry stage, which can be implemented in either hardware or software. This stage consists of model transformations, lighting computations, scene projection, clipping, and mapping. The last stage of the graphics pipeline is the rasterizing stage, which is done in hardware in order to gain speed. This stage converts the vertex information output by the geometry stage into the pixel information needed by the video display. It also performs anti-aliasing to smooth out jagged edges, which adds a large computational load; the stage is therefore performed in parallel. A sketch of a per-frame loop built around these stages follows.
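The three stages map naturally onto a per-frame loop with the 33-msec budget computed above. The sketch below only illustrates that budget; the stage bodies are placeholders, not a real engine.

/* Sketch of a VR engine frame loop holding the ~33 ms budget implied by
   30 frames/second. The stage comments are placeholders for real work. */
public class VrEngineLoop {
    static final long FRAME_BUDGET_MS = 1000 / 30; // about 33 ms per frame

    public static void main(String[] args) throws InterruptedException {
        while (true) {
            long start = System.currentTimeMillis();
            // Application stage: read trackers, gloves, update world state.
            // Geometry stage: transform, light, project, clip, map.
            // Rasterizing stage: convert vertices to pixels, anti-alias.
            long elapsed = System.currentTimeMillis() - start;
            if (elapsed < FRAME_BUDGET_MS) {
                Thread.sleep(FRAME_BUDGET_MS - elapsed); // wait out the frame
            }
        }
    }
}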
Fig 4.1 Advanced graphics rendering pipeline

4.2 Haptic Rendering Pipeline

VR simulation systems implement additional sensorial modalities, such as haptics, that need to meet similar real-time constraints. This can be done through a multistage haptic rendering pipeline. During the first stage of the haptic rendering pipeline, the physical characteristics of the 3D objects are loaded from the database. These include surface compliance, smoothness, weight, surface temperature, etc. The first stage also performs collision detection to determine which virtual objects collide, if any; only the colliding objects are passed to the next stage of the pipeline. The second stage calculates the change in the characteristics of the virtual objects based on various simulation models. The more objects are altered, the greater the computational load on the pipeline. The third and last stage is the haptic texturing stage, which renders the touch and force feedback components of the simulation to the output devices. This stage is largely hardware dependent.

Graphics accelerators:
ATI Fire GL2 [2000]
NVIDIA
3Dlabs Wildcat II

Haptic rendering devices:
CyberForce
Temperature feedback glove
Force feedback joysticks
Section 5: Modelling

Another important aspect of VR is the modelling of the virtual world. This means first mapping the I/O devices to the simulation scene, then developing object databases for populating the world. The latter means modelling object shape, appearance, kinematic constraints, intelligent behaviour, and physical characteristics. Finally, in order to maintain real-time interaction with the simulation, the model needs to be optimized during the model management step.

(Diagram: VR authoring tool pipeline: I/O mapping, geometric modelling, kinematic modelling, physical modelling, intelligent behaviour, model management.)

5.1 Geometric Modelling

5.1.1 Virtual Object Shape

The shape of virtual objects is determined by their 3D surface, which can be described in several ways. The vast majority of virtual objects have surfaces composed of triangular meshes. Triangular meshes are preferred because they use shared vertices and are faster to render. Another way of representing shape is to use parametric surfaces.

Fig 5.1 Parametric surfaces
There are several methods by which object surfaces can be constructed:
Using toolkit editors
Importing CAD files
Creating surfaces with a 3D digitizer
Creating surfaces with a 3D scanner
Using online 3D object databases

5.1.2 Object Visual Appearance

The next step is to illuminate the scene so that the objects become visible. The appearance of an object depends strongly on the type and placement of the virtual light sources, as well as on the object's surface.

Scene illumination: Local scene illumination treats the interactions between objects and light sources in isolation, neglecting the interdependences between objects. Global illumination models the inter-reflections between objects and the shadows they cast, resulting in a more realistic-looking scene.

Texturing: Texturing is a technique performed in the rasterizing stage of the graphics pipeline in order to modify the object model's surface properties, such as colour, specular reflection, or pixel normals.

5.2 Kinematic Modelling

Kinematic modelling determines the location of 3D objects with respect to a world system of coordinates, as well as their motion in the virtual world. Object kinematics is governed by parent-child hierarchical relations, with the motion of a parent object affecting that of its children. Homogeneous transformation matrices are used to express object translations, rotations, and scaling. Object position is expressed using a 3D coordinate system attached to the object, usually at its centre of gravity, and oriented along the object's axes of symmetry.

Fig 5.2 View of a 3D world
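Java 3D, discussed in Section 6, expresses exactly this parent-child kinematics through TransformGroup nodes carrying homogeneous transforms. Below is a minimal sketch, assuming the Java 3D libraries are on the classpath; the poses chosen are arbitrary.

/* Sketch of parent-child kinematics in Java 3D: any change to the parent's
   TransformGroup also moves the child, as described above. */
import javax.media.j3d.Transform3D;
import javax.media.j3d.TransformGroup;
import javax.vecmath.Vector3f;

public class ParentChildDemo {
    public static TransformGroup build() {
        Transform3D parentPose = new Transform3D();
        parentPose.setTranslation(new Vector3f(1.0f, 0.0f, 0.0f));
        TransformGroup parent = new TransformGroup(parentPose);

        Transform3D childPose = new Transform3D();
        childPose.setTranslation(new Vector3f(0.0f, 0.5f, 0.0f)); // relative to parent
        TransformGroup child = new TransformGroup(childPose);

        parent.addChild(child); // moving 'parent' now moves 'child' too
        return parent;
    }
}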
5.3 Physical Modelling

The next step in virtual world modelling is the integration of the objects' physical characteristics. These include weight, inertia, surface geometry, compliance, deformation mode, etc. These features bring more realism to the simulation.

5.3.1 Collision detection

The first stage of haptic rendering is collision detection, which determines whether two or more objects are in contact with each other. This can be considered a form of haptic clipping, since only objects that collide are processed by the haptic rendering pipeline. There are two types of collision detection: approximate and exact.

5.3.2 Surface deformation

Collision detection is followed by collision response, which depends on the characteristics of the virtual objects in contact and on the particular application being developed. Surface deformation changes the 3D object's geometry interactively and thus needs to be coordinated with the graphics pipeline.

5.3.3 Force computation

When users interact with 3D object surfaces, they should feel a reaction force. These forces need to be computed by the haptic rendering pipeline and sent through haptic force feedback devices to the user. Force computation takes into account the type of surface contact, the kind of surface deformation, as well as the object's physical and kinematic characteristics. There are three main types of virtual objects, and a sketch combining an approximate collision test with a simple force law follows this list:
Elastic virtual objects
Plastic virtual objects
Rigid virtual objects
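To make the collision detection and force computation steps concrete, the sketch below tests two axis-aligned bounding boxes for overlap (an approximate method) and, on contact, computes a simple spring-like reaction force F = k * d from the penetration depth d. The stiffness value and names are illustrative assumptions, not the algorithms of any particular haptic system.

/* Sketch: approximate collision detection with axis-aligned bounding
   boxes, plus a spring-law reaction force from penetration depth. */
public class HapticContact {
    static final double STIFFNESS = 500.0; // N/m, assumed value

    // Overlap test for two axis-aligned boxes given min/max corners (x, y, z).
    static boolean overlaps(double[] minA, double[] maxA,
                            double[] minB, double[] maxB) {
        for (int i = 0; i < 3; i++) {
            if (maxA[i] < minB[i] || maxB[i] < minA[i]) return false;
        }
        return true;
    }

    // Spring-like reaction force magnitude from penetration depth (metres).
    static double reactionForce(double penetrationDepth) {
        return STIFFNESS * penetrationDepth;
    }
}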
5.4 Behaviour Modelling

It is also possible to model object behaviour that is independent of the user's actions. This becomes critical in very large simulation environments, where users cannot possibly control all the interactions taking place. Consider the modelling of a virtual office, for example. Such an office could have an automatic door, a clock, and a desk calendar, as well as furniture. The time displayed by the clock and the date shown by the calendar should be automatically adjusted by the VR engine. The virtual door should open when the user walks into the office. All these behaviours have to be modelled into the objects and the virtual environment.

5.5 Model Management

Definition: Model management combines techniques that help the VR engine render complex virtual environments at interactive rates without a significant impact on simulation quality.

5.5.1 Level-of-Detail Management

This technique uses several versions of the same object with different polygon counts, each representing the object at a different level of detail (LOD). The reason for doing so stems from the realization that the human eye perceives less and less detail as objects move further away. It would thus be wasteful to represent distant objects with a high level of detail. LOD management can be further classified into static and adaptive.

Fig 5.3 LOD Management
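A static distance-based LOD scheme can be as simple as picking a model index from distance thresholds, as in the sketch below; the threshold values are illustrative. (Java 3D, covered in Section 6, provides a DistanceLOD behavior for the same purpose.)

/* Sketch of static level-of-detail selection: pick a coarser model as the
   object gets further from the viewer. Thresholds are illustrative only. */
public class LodSelector {
    // Distances (in metres) beyond which the next coarser model is used.
    static final double[] THRESHOLDS = { 5.0, 20.0, 50.0 };

    /* Returns 0 for the most detailed model, higher for coarser ones. */
    static int selectLevel(double distanceToViewer) {
        int level = 0;
        for (double t : THRESHOLDS) {
            if (distanceToViewer > t) level++;
        }
        return level;
    }
}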
5.5.2 Cell Segmentation

When models are too large to fit into RAM, they need to be rendered in such a way that the impact of memory swaps on the simulation frame rate is minimized. This involves partitioning the large model into smaller ones, then rendering those with static or adaptive LOD management. There are two methods:
Automatic cell segmentation
Combined cell, LOD, and database methods

Fig 5.4 Cell segmentation: a car model
Section 6: VR Programming

The previously discussed topics show how to model the geometry, appearance, physical, and behaviour properties of virtual objects, as well as their parent-child object hierarchy, all with an eye to reducing the frame rendering time. These are the basic components of programming, or authoring, a virtual environment.

Definition: An authoring environment is an application programming interface (API)-based method of creating a virtual world. A run-time environment subsequently allows the user's real-time interaction with the authored world.

6.1 Toolkits and Scene Graphs

A toolkit is an extendable library of object-oriented functions designed for VR applications. Simulated objects are instances of classes and inherit their default attributes, thus simplifying the programming task. Modern general-purpose toolkits include WorldToolKit (WTK) [Sense8 Co., 2001], Java 3D [1998], the Virtual Reality Modelling Language (VRML), etc.

A scene graph is a hierarchical organization of the objects in the virtual world, together with the view to that world. The scene graph is a tree structure, with nodes connected by branches. The topmost node in the hierarchy is the root, which is the parent of the whole graph. The external nodes are called leaves; they typically represent visible objects. The internal nodes represent transformations, which position their child objects in space relative to the root node or to other objects. Any transformation applied to a given internal node will affect all of its children. Scene graphs are not static; they change to reflect the current state of the virtual world.

6.2 JAVA 3D

Java, introduced by Sun Microsystems in the mid-1990s, has become the programming environment of choice for platform-independent, highly distributed applications. Java 3D is one of the Java APIs; it was designed for object-oriented programming of interactive 3D graphics applications. Java 3D uses OpenGL and Direct3D low-level graphics library functions.

6.2.1 Model Geometry and Appearance

An object's 3D shape and appearance are specified by the Java 3D class Shape3D(), which is an extension of the scene-graph leaf nodes. The specific values within the Shape3D() class are set by functions called methods, namely setGeometry() and setAppearance().
Geometries can be constructed from scratch, using points, lines, triangles, quads, arrays, etc. For example, if a geometry is built using triangle arrays, then

/* Create list of 3D coordinates for the vertices */
Point3f[] myCoords = {
    new Point3f(0.0f, 0.0f, 0.0f),
    . . .
};
/* Create list of vertex normals for lighting */
Vector3f[] myNormals = {
    new Vector3f(0.0f, 0.0f, 0.0f),
    . . .
};
/* Create the triangle array */
TriangleArray myTris = new TriangleArray(
    myCoords.length,
    GeometryArray.COORDINATES | GeometryArray.NORMALS );
myTris.setCoordinates(0, myCoords);
myTris.setNormals(0, myNormals);
/* Assemble the shape (myAppear is an Appearance object; see below) */
Shape3D myShape = new Shape3D(myTris, myAppear);

An alternative way of setting object geometry is to import files in formats such as 3DS, DXF, NFF, and WRL. The geometry importing is done by methods called loaders. Loaders add the content of the loaded file to the scene graph as a single object. If the object needs to be segmented in order to maintain its intrinsic parent-child dependencies, then its parts need to be accessed individually by subsequent method calls. Let us consider a virtual hand geometry file Hand.wrl created with VRML. In order to access its parts we need to

/* add the file to the scene graph (loader is a VRML loader instance) */
Scene SC = loader.load("Hand.wrl");
BranchGroup Bg = SC.getSceneGroup();
/* access the finger subparts of the loaded model */
Node Thumb = Bg.getChild(0);
Node Index = Bg.getChild(1);
Node Middle = Bg.getChild(2);
Node Ring = Bg.getChild(3);
Node Small = Bg.getChild(4);
The object's appearance is specified by the Java 3D Appearance() class. The material and texture attributes need to be defined first and then grouped to form a new appearance, as in

Material Mat = new Material();
Mat.setDiffuseColor(r, g, b);
Mat.setAmbientColor(r, g, b);
Mat.setSpecularColor(r, g, b);
/* import the texture file */
TextureLoader TextLd = new TextureLoader("checkered.jpg", ..., ...);
Texture Tex = TextLd.getTexture();
/* create the appearance and set it on the shape (Geom) */
Appearance Appr = new Appearance();
Appr.setMaterial(Mat);
Appr.setTexture(Tex);
Geom.setAppearance(Appr);

6.2.2 Java 3D Scene Graphs

A Java 3D scene graph is constructed of Node objects in parent-child relationships forming a tree structure. The arcs of a tree form no cycles. Only one path exists from the root of a tree to each of the leaves; therefore, there is only one path from the root of a scene graph to each leaf node. Each scene graph path in a Java 3D scene graph completely specifies the state information of its leaf. State information includes the location, orientation, and size of a visual object.

Each scene graph has a single VirtualUniverse. The VirtualUniverse object has a list of Locale objects. A Locale object provides a reference point in the virtual universe. Each Locale object may serve as the root of multiple subgraphs of the scene graph. A BranchGroup object is the root of a subgraph, or branch graph. There are two different categories of scene subgraph: the view branch graph and the content branch graph.

Scene Graph Hierarchy
o The VirtualUniverse, Locale, Group, and Leaf classes appear in this portion of the hierarchy. Other than the VirtualUniverse and Locale objects, the rest of a scene graph is composed of SceneGraphObject objects. SceneGraphObject is the superclass for nearly every Core and Utility Java 3D class. SceneGraphObject has two subclasses: Node and NodeComponent. The subclasses of Node provide most of the objects in the scene graph. A Node object is either a Group node or a Leaf node. Group and Leaf are superclasses to a number of subclasses. Here is a quick look at the Node class, its two subclasses, and the NodeComponent class. After this background material is covered, the construction of Java 3D programs is explained.
Node Class
o The Node class is an abstract superclass of the Group and Leaf classes. The Node class defines some important common methods for its subclasses. Information on specific methods is presented in later sections, after more background material is covered. The subclasses of Node compose scene graphs.

Group Class
o The Group class is the superclass used in specifying the location and orientation of visual objects in the virtual universe. Two of the subclasses of Group are BranchGroup and TransformGroup. In graphical representations of the scene graph, the Group symbols (circles) are often annotated with BG for BranchGroups, TG for TransformGroups, etc.

Leaf Class
o The Leaf class is the superclass used in specifying the shape, sound, and behavior of visual objects in the virtual universe. Some of the subclasses of Leaf are Shape3D, Light, Behavior, and Sound. These objects can have no children but may reference NodeComponents.

Fig 6.1 Example of a Java 3D scene graph
Fig 6.2 Scene graph of a cockroach
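Putting these classes together, a minimal Java 3D program builds a content branch graph and makes it live by attaching it to the universe. The sketch below uses the standard SimpleUniverse and ColorCube utility classes; the scene content itself is arbitrary.

/* Minimal Java 3D scene graph: the SimpleUniverse utility wraps the
   VirtualUniverse, Locale, and view branch graph, so only the content
   branch graph needs to be assembled here. */
import com.sun.j3d.utils.geometry.ColorCube;
import com.sun.j3d.utils.universe.SimpleUniverse;
import javax.media.j3d.BranchGroup;

public class HelloScene {
    public static void main(String[] args) {
        SimpleUniverse universe = new SimpleUniverse(); // view branch graph
        BranchGroup content = new BranchGroup();        // content branch graph
        content.addChild(new ColorCube(0.3));           // a Leaf (Shape3D subclass)
        content.compile();                              // optimize the branch
        universe.getViewingPlatform().setNominalViewingTransform();
        universe.addBranchGraph(content);               // make the graph live
    }
}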
Conclusion

Useful applications of VR include training in a variety of areas (military, medical, equipment operation, etc.), education, design evaluation (virtual prototyping), architectural walk-throughs, human factors and ergonomic studies, simulation of assembly sequences and maintenance tasks, assistance for the handicapped, the study and treatment of phobias (e.g., fear of heights), entertainment, and much more.

Like all great technologies, there is a monumental duality about VR. On the one hand, it could represent the next step in the sociological evolution of humanity: a world where you can do anything, enjoy things you could not even dream of in the real world (such as driving the latest model of Mercedes without spending any money), and satisfy virtually every desire for the cost of pennies. On the other hand, virtual reality could be the greatest single threat to society. Imagine an entire modernized civilization leaving the "real" world for the "virtual" one: a nation of empty streets and empty schools, as families spend their entire days plugged into a virtual reality machine, everybody living happily in a world of their own, free of tensions and sorrows and tailored to their taste.

In conclusion, whether virtual reality drives social evolution or harms society depends on the ways it is used. Enjoying a virtual drive of a Mercedes deprives the company of a sale and hurts the economy; using the same technology for virtual training systems or in the field of medicine brings clear benefits.

References:

Main reference:
Burdea, G. C., "Virtual Reality Technology," 2nd Edition.

Other references:
Burdea, G., 1993, "Virtual reality systems and applications" [short course], Electro '93 International Conference, Edison, NJ.
VPL, 1987, DataGlove Model 2 User's Manual, VPL Research, Redwood City, CA.
Anon, 1998, "3D navigational and gesture devices", VR News, Vol. 7(1), pp. 26-29.
Immersion Co., 2001, "CyberGlove", online at www.immersion.com/products/3d
InterSense, 2000b, "IS-900 Precision Motion Tracker", company brochure, InterSense Co.
Foxlin, E., 1998, "Motion tracking requirements and technology", pp. 163-210.
Monkman and P. Taylor, 1993, "Thermal tactile sensing", Vol. 9(3), pp. 313-318.
Bishop, 1986, "Fast Phong shading algorithm", pp. 103-105.
"Introduction to VR", www.csie.nctu.edu.tw/course_vr_2001/doc/VRcourse5.pdf
http://www.oracle.com/technetwork/java/javase/tech/desktop-documentation-jsp-138646.html