
Virtual Input Devices for 3D Systems
Taosong He and Arie E. Kaufman
Department of Computer Science
State University of New York at Stony Brook
Stony Brook, NY 11794-4400
Abstract
The device unified interface is a generalized and easily expandable protocol for the communication between applications and input devices. The key idea is to unify various device data into the parameters of a so-called "virtual input device." The device information-base, which includes device dependent information, is also incorporated into the virtual input device. Using the device unified interface, system builders are able to design their applications independent of the input devices as well as utilize the capabilities of several devices in the same application.
Keywords: Device unified interface, 3D input device, Virtual input device, Device information-base

1. Introduction

Input devices play a crucial role in 3D "natural" interaction, which has become indispensable to 3D user interfaces for scientific visualization, virtual reality, and multimedia systems. However, most system designs are based directly upon specific input devices. For example, DataPaper [6] and edvol [7] supported the Isotrak and the VPL DataGlove, and RB2 [2] used the EyePhone and the DataGlove. The advantage of such device dependent systems is that the applications can fully utilize the device capabilities, since they are coupled with the specific input devices. For the same reason, however, the application code needs to be changed whenever the device configuration of the system is changed or new input devices or interaction techniques are introduced. Applications thus need to deal with low level input device support, which should be transparent to most users.

Several systems have been developed to provide certain levels of input device independence. For instance, Shaw et al. [9] have developed MR, which utilizes a client-server model to manage input devices. Each server is responsible for interacting with a specific device and continually collecting information from it. Adding a new input device simply means adding a server, and existing applications are not affected. However, applications of MR are still designed in accordance with specific servers and need to be modified when the corresponding devices are changed.

In this paper we introduce the device unified interface (DUI), a protocol for communication between applications and input devices. It achieves full input device independence by allowing users to interactively control the device operations and modify the device configuration with no effect on the application. Section 2 discusses why designing such a protocol is difficult. Sections 3 and 4 focus on the virtual input device paradigm used by DUI and on the device information-base incorporated into it. Section 5 discusses the implementation of DUI.

2. Issues Related to a Unified Protocol
Input device independence means the separation
of input device handling into an independent component
of the system, as well as the design of a unified protocol
through which applications get the device information.
There are several reasons why input device
independence is an important goal for general-purpose
3D systems:
• General-purpose 3D systems are used for different kinds of applications. However, for some applications the most suitable input devices may not be available. For example, although a DataGlove might be the most appropriate device for some applications of an interactive visualization system, the system could be running on platforms with only a mouse and a keyboard as input devices. In a device independent system, changing the device configuration between the DataGlove and the mouse affects only the system performance. In a device dependent system, however, the application code might need to be modified.
• With the increasing interest in 3D interaction, new input devices and natural interaction methods have been and will continue to be proposed. With a unified protocol, the addition of new input devices or the incorporation of new interaction methods is restricted to the input device component, with no effect on the applications.
• Users might want to interactively choose input devices in accordance with specific tasks. For example, they might prefer a mouse for slice clipping in
a data visualization system, but select a Spaceball or a
DataGlove for 3D data rotation. The switch between
different devices should be controlled by the user
through a user interface. In other words, these operations
should be transparent to the application code.
However, input device independence is not easy to
achieve. Some problems make the design of a unified
communication protocol difficult:
• There are numerous input devices, ranging from a dial to a simple mouse, a flying mouse, and a complex DataGlove. They provide information ranging from a single numeric value to a 2D position, to 3D position and orientation, to flexing of the knuckles and tactile feedback. To design an input device independent environment, these various kinds of information must be translated into a unified form. Also, simulation methods must be provided so that devices can be exchanged with one another to achieve true device independence.
• Different input devices are suitable for different environments. For example, a hand tracker is suitable for indicating position and orientation, a glove gesture for discrete control of objects or environments, and a Spaceball for controlling the velocity of objects [3]. The reason is that each input device has its own specific characteristics. In order to make use of these characteristics, the communication protocol should not totally "hide" the device dependent information.
• For some applications, speed is critical. For example, Bryson [3] pointed out that all computation and rendering involved in a virtual environment must take place at a minimum of ten frames per second. Therefore, a unified protocol must provide fast communication between input devices and applications, which requires a high level abstraction with minimal overhead.
DUI uses the virtual input device paradigm to alleviate these problems. The details of the virtual input device concept are described in the following sections.
3. Virtual Input Device

To design an input device independent system, it is useful to observe that many applications need essentially similar information from different types of input devices. For example, nearly all applications request 2D position data, and some of them further require 3D position and orientation data. DUI can thus be created by unifying all kinds of data into the forms desired by the applications.

From the viewpoint of the application, DUI can be seen as a "black box" (Figure 1). All input devices provide their original information to this black box, which has an interface to every device. Using the appropriate simulation methods, DUI converts the different raw data into the unified parameters of a virtual input device. Device dependent information is provided to the application by a device information-base incorporated into each virtual input device.
3.1. Virtual input device parameters
As discussed above, virtual input devices are used
to convert raw data received from different input
devices into a unified form. In general, what the
application needs is the position and orientation of a
“system device” (“virtual hand,” camera, etc.), or the
operations performed by the user (start, select, etc.).
The parameters of a virtual input device are classified
into four groups:
(1) 2D position.
(2) 3D position.
(3) 3D orientation. Depending on the application, users may choose the orientation description from Euler angles, a rotation matrix, or quaternions. A comparison of these methods can be found in [10].
(4) Selection. Selection is the abstraction of device
operations. Any device operation can be interactively
assigned a selection number. These operations include
direct ones like keyboard events, or button clicks of a
mouse or a Spaceball. They also include abstract ones
like voice commands, or gestures of a DataGlove or a
mouse. The conversion of operations into selections is
incorporated into the protocol, and the map between
them is interactively controlled through the user
interface.
Not only the source of selection but also the
source of each virtual input device parameter can be
decided interactively. For example, a virtual input
device can be created whose 3D position is received
from an Isotrak and whose orientation is received from
a Spaceball. To satisfy this requirement, all kinds of
input devices and device combinations must be
supported to simulate a virtual input device.
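To make the parameter groups concrete, the following is a minimal C sketch of the record a virtual input device could hand to an application; the type and field names are illustrative assumptions and not part of DUI's actual interface.

    /* Illustrative sketch of the four virtual input device parameter
       groups; names are hypothetical, not DUI's actual declarations.   */
    typedef enum { ORIENT_EULER, ORIENT_QUATERNION, ORIENT_MATRIX } OrientMode;

    typedef struct {
        double pos2d[2];           /* group (1): 2D position               */
        double pos3d[3];           /* group (2): 3D position               */
        OrientMode orient_mode;    /* group (3): chosen orientation form   */
        union {
            double euler[3];       /* roll, pitch, yaw                     */
            double quaternion[4];  /* w, x, y, z                           */
            double matrix[3][3];   /* rotation matrix                      */
        } orientation;
        int selection;             /* group (4): abstract selection number,
                                      or -1 when no operation is pending   */
    } VirtualDeviceParams;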
3.2. Simulation methods
In order to achieve input device independence,
there are two requirements imposed on the simulation
methods. First, each device or device combination
should be able to fully simulate a virtual device.
Second, simulations should be natural and easy to learn.
The problem is that it is often difficult to design a
simulation method which satisfies both these
requirements. As examples, we discuss simulation
techniques for specific devices:
Dial: A dial provides only 1D values. Other devices, such as a keyboard or a mouse, are needed to make natural selections and to indicate the meaning of the dial values.
Figure 1: The structure of the device unified interface (DUI).
Keyboard: Directly pressing keys and using
specific sliders controlled by arrow keys are both
methods of providing virtual device parameters.
However, pure keyboard input is unnatural in most
cases.
Mouse: Users can interactively choose from several simulation methods, such as the triad mouse [8], the 3D trackball [4], or ARCBALL [11]. However, none of them is good enough. ARCBALL and the 3D trackball handle only 3D rotation, while the triad mouse handles only 3D translation. None of them utilizes the gesture capability of the mouse. Since the mouse is a de facto standard device, further work needs to be done on "natural" simulations.
Isotrak: The Polhemus Isotrak can directly provide
data of 3D position and orientation. However,
selections are generally obtained with the help of other
devices.
DataGlove: Information about 3D position and orientation can be obtained from an Isotrak which is incorporated into the DataGlove. Gesture recognizers are needed to classify gestures into the desired selection types.
Natural simulation is a difficult and important task. Most research in this area deals with specific devices. DUI provides a framework in which new simulation methods can easily be incorporated and tested, as sketched below. Another advantage of DUI is that if a new simulation method is designed, only a simulation program needs to be added. The application does not have to be modified, since the form of the virtual input device parameters is not affected.
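As a sketch of what "adding only a simulation program" could amount to, the fragment below registers two hypothetical mouse simulation methods that write into the VirtualDeviceParams record sketched in Section 3.1; the function names, the registry table, and the scaling constants are assumptions, and a real triad mouse or ARCBALL driver would be considerably more elaborate.

    /* Hypothetical registry of mouse simulation methods.  Adding a new
       method means writing one conversion function and appending one
       table entry; the application-visible parameter form is unchanged.
       Assumes the VirtualDeviceParams sketch from Section 3.1.          */
    typedef void (*SimulateFn)(int dx, int dy, VirtualDeviceParams *out);

    /* Triad-mouse-style translation: fold 2D motion into 3D position.   */
    static void simulate_translation(int dx, int dy, VirtualDeviceParams *out)
    {
        out->pos3d[0] += 0.01 * dx;
        out->pos3d[1] += 0.01 * dy;    /* z would come from a second mode */
    }

    /* Stand-in for a rotation method such as ARCBALL: a real driver
       would map the drag onto a sphere instead of scaling angles.       */
    static void simulate_rotation(int dx, int dy, VirtualDeviceParams *out)
    {
        out->orient_mode = ORIENT_EULER;
        out->orientation.euler[0] += 0.005 * dy;  /* pitch from vertical drag */
        out->orientation.euler[1] += 0.005 * dx;  /* yaw from horizontal drag */
    }

    static const struct {
        const char *name;          /* listed in the mouse control panel */
        SimulateFn  simulate;
    } mouse_methods[] = {
        { "translate", simulate_translation },
        { "rotate",    simulate_rotation    },
    };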
4. Device Information-Base
As mentioned in Section 2, different input devices
are suitable for different environments although they
may provide similar parameters. A classical example is
a stylus being more appropriate for sketching than a
mouse. Since a virtual input device hides the device
characteristics, utilization of specific device capabilities
becomes a problem. The solution chosen by DUI is to
create a structure where device dependent information is
classified, stored, and made accessible to the
applications. In DUI, this structure is called the device information-base, which is part of each virtual input device (Figure 1). A device information-base includes
six information entities:
(1) Currently available input devices: The
application can get a list of the available input devices
of the system.
(2) Virtual device configuration: Each virtual
device has a device configuration structure whose
contents can be changed interactively when the virtual
device is created or modified. The structure format is:

    pos2d-x, pos2d-y : source device;
    pos3d-x, pos3d-y, pos3d-z : source device;
    orientation : source device;
    orientation-mode : quaternions or rotation matrix or Euler angles;
    selections : array of selection-map {
        selection-number : integer;
        selection-device : source device;
        device-operation-number : integer;
    }
In the substructure selection-map, selection
numbers are mapped to device operation numbers. The
simulation method in use is responsible for generating a
map between device operations and operation numbers.
Although different simulation methods would process
different operations and generate different maps, a
virtual input device needs to deal with only the abstract
operation numbers. In this way, DUI achieves a certain
kind of simulation independence.
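Read literally, the configuration record and its selection-map substructure could be declared along the following lines in C; the enumerations, the fixed array bound, and the use of one source field per parameter group (rather than per axis) are simplifying assumptions made for this sketch.

    /* Sketch of the virtual device configuration record; identifiers and
       the MAX_SELECTIONS bound are assumptions, not DUI's declarations.  */
    #define MAX_SELECTIONS 32

    typedef enum { DEV_NONE, DEV_MOUSE, DEV_KEYBOARD, DEV_DIAL,
                   DEV_SPACEBALL, DEV_ISOTRAK, DEV_DATAGLOVE } SourceDevice;

    typedef enum { MODE_QUATERNIONS, MODE_ROTATION_MATRIX,
                   MODE_EULER_ANGLES } OrientationMode;

    typedef struct {
        int          selection_number;        /* abstract selection        */
        SourceDevice selection_device;        /* device that produces it   */
        int          device_operation_number; /* operation on that device  */
    } SelectionMap;

    typedef struct {
        SourceDevice    pos2d_source;         /* pos2d-x, pos2d-y          */
        SourceDevice    pos3d_source;         /* pos3d-x, pos3d-y, pos3d-z */
        SourceDevice    orientation_source;
        OrientationMode orientation_mode;
        SelectionMap    selections[MAX_SELECTIONS];
        int             num_selections;
    } VirtualDeviceConfig;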
(3) Classifications of physical devices which
constitute the virtual device: Physical input devices can
be classified into different categories. Two device
classifications are used in DUI. According to the degrees of freedom they provide, input devices can be classified into zero-dimensional, one-dimensional, two-dimensional, three-dimensional, and special devices.
Another criterion is device functionality according to
which input devices can be grouped into locator devices,
valuator devices, choice devices, and command devices.
Unlike Foley et al. [5], we distinguish command devices
from choice devices in DUI. Choice devices are those
which can be used to naturally select objects, and
command devices are those which can be used to
directly issue commands. Table 1 gives the
classifications of several typical devices.
(4) Characteristics of the virtual device
parameters: Position and orientation parameters can be
direct or indirect, continuous or discrete, absolute or
relative [5], original or processed. “Original” means
that the parameter is directly received from the device,
while “processed” means that the parameter has been
processed after collection. Each device also has its
suitable orientation description methods. Other
characteristics of position and orientation parameters
include coordinate system in use, data resolution, and
refresh rate. The selection parameter can be classified as "natural" or "processed." Table 2 illustrates the characteristics of the parameters of typical devices.

Table 2: Characteristics of the virtual input device parameters. (Legend: O/P: Original/Processed, D/I: Direct/Indirect, C/D: Continuous/Discrete, A/R: Absolute/Relative, E/Q/M: Euler Angles/Quaternions/Rotation Matrix.)
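One plausible way to hold entities (3) and (4) in a device information-base is as small enumerations and flag words, as in the sketch below; the names and the flag encoding are assumptions rather than DUI's actual representation.

    /* Hypothetical encoding of device classifications (entity 3) and
       parameter characteristics (entity 4).                             */
    typedef enum { DIM_ZERO, DIM_ONE, DIM_TWO, DIM_THREE,
                   DIM_SPECIAL } DeviceDimensionality;
    typedef enum { FUNC_LOCATOR, FUNC_VALUATOR, FUNC_CHOICE,
                   FUNC_COMMAND } DeviceFunctionality;

    enum {                           /* characteristics of a parameter   */
        PARAM_DIRECT     = 1 << 0,   /* direct rather than indirect      */
        PARAM_CONTINUOUS = 1 << 1,   /* continuous rather than discrete  */
        PARAM_ABSOLUTE   = 1 << 2,   /* absolute rather than relative    */
        PARAM_ORIGINAL   = 1 << 3    /* original rather than processed   */
    };

    typedef struct {
        DeviceDimensionality dimensionality;
        DeviceFunctionality  functionality;
        unsigned             pos3d_flags;   /* bitwise OR of PARAM_* flags */
        unsigned             orient_flags;
        double               resolution;    /* data resolution             */
        double               refresh_hz;    /* refresh rate                */
    } DeviceClassEntry;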
(5) Relations among different virtual device
parameters: Parameter dependence is decided by the
simulation method in use and the configuration of the
corresponding virtual input device. For each parameter,
DUI associates a dependence list (DL) and a "semi-dependence" list (MDL) to indicate which other
parameters it depends on. “Semi-dependent” means that
the two parameters are not totally independent. For
example, position and orientation data of a Spaceball are
semi-dependent on each other because, although a
Spaceball theoretically provides six degrees of freedom,
in practice it is difficult to separate force and torque.
To interactively decide the relations among
parameters, we first create a device information-base for
each physical device under every simulation method.
When a virtual input device is created or modified, the
following algorithm is used to combine the
corresponding source device (SD) information-bases for
physical devices and get the desired relations:

    for each parameter p1 of the virtual device
        for each parameter p2 ≠ p1 of this virtual device
            if p1 and p2 come from the same source device then
                if SD-DL(p1) ∩ SD-DL(p2) ≠ ∅ then
                    add p2 to p1's DL
                if SD-MDL(p1) ∩ SD-MDL(p2) ≠ ∅ then
                    add p2 to p1's MDL
        end
    end
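A direct transcription of this combination step into C might look as follows; representing each dependence list as a bit set indexed by parameter number is an assumption made for the sketch, not DUI's actual data structure.

    /* Combine source-device dependence information into the DL and MDL of
       a virtual input device.  Assumes at most 32 parameters so that each
       list fits in one unsigned bit set; field names are hypothetical.    */
    typedef struct {
        int      source;   /* index of the source device for this parameter */
        unsigned sd_dl;    /* source-device dependence set (SD-DL)           */
        unsigned sd_mdl;   /* source-device semi-dependence set (SD-MDL)     */
        unsigned dl;       /* resulting dependence list of the virtual dev.  */
        unsigned mdl;      /* resulting semi-dependence list                 */
    } ParamDependence;

    static void combine_dependences(ParamDependence *p, int n)
    {
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                if (i == j || p[i].source != p[j].source)
                    continue;                   /* different source device      */
                if (p[i].sd_dl & p[j].sd_dl)    /* SD-DL(p1) and SD-DL(p2) meet */
                    p[i].dl |= 1u << j;         /* add p2 to p1's DL            */
                if (p[i].sd_mdl & p[j].sd_mdl)  /* SD-MDL sets intersect        */
                    p[i].mdl |= 1u << j;        /* add p2 to p1's MDL           */
            }
        }
    }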
(6) Raw data: The user might be familiar with the low level physical input device and may want to use the raw data of the device directly. Therefore, each input device has an appropriate data structure to store its raw information. The application can access this data structure by querying the corresponding device information-base. Table 3 presents the contents of this structure for several devices.

Table 3: Raw information structure of some input devices.
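The paper does not spell out the query interface, but the raw information structure could plausibly be exposed as a tagged record returned by the information-base, roughly as sketched below; all type, field, and function names here are hypothetical.

    /* Hypothetical raw-data record for entity (6); the fields are only
       illustrative of what a mouse, an Isotrak, or a keyboard might report. */
    typedef enum { RAW_MOUSE, RAW_ISOTRAK, RAW_KEYBOARD } RawKind;

    typedef struct {
        RawKind kind;
        union {
            struct { int x, y; unsigned buttons; }      mouse;
            struct { double pos[3]; double orient[4]; } isotrak;
            struct { int keycode; int pressed; }        keyboard;
        } u;
    } RawDeviceData;

    /* An application would obtain such a record through the information-base,
       e.g. with a (hypothetical) call like
       dui_query_raw(virtual_device, DEV_MOUSE, &raw);                         */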
Although all device dependent information can be obtained directly from the system, in the device information-base it is classified and stored in an abstract form. Therefore, users need not be concerned with the details of low level device support. Instead, they can focus on the specific requirements of the applications for input devices. For example, a program can be written in such a way that, if the 3D position data along the three axes are mutually independent, the camera position is directly derived from them; otherwise, "selection 1" is used to switch between camera movement on the x-y plane and along the z-axis. Another advantage of a device information-base is that it can easily be expanded. From the point of view of the application, adding new information abstractions only means the addition of some query functions.

5. Implementation of DUI

VolVis is a comprehensive volume visualization system developed at the State University of New York at Stony Brook [1]. DUI has been implemented using X and incorporated into VolVis to provide and support 3D interactions.

Figure 2 describes the implementation of DUI. Every active input device is associated with a control panel and a device driver. The control panel is used to manipulate the physical input device. A user may initialize the device, set the device parameters, and select the simulation methods through the control panel. Since each device driver corresponds to a simulation method, the device driver in use is changed interactively upon the selection of various simulation methods. As mentioned in Section 4, different simulations may process different device operations and generate different maps between device operations and device operation numbers. The device driver is responsible for providing such a map to the control panel so that the user can interactively assign selection numbers to device operations. The device drivers also provide such maps to the virtual device configuration panel, through which the user can interactively decide the virtual device configuration.

In DUI each application is interactively allocated a virtual input device. When an application is started, an input device server is started in the background. This server takes full control of the physical input devices and assumes the role of the corresponding virtual input device. By using X timeout events in the background, all the physical input devices which constitute this virtual input device periodically provide raw data to its active device driver, where the information is converted into the unified parameters of an "information token." When the application running in the foreground needs input device information, it sends a request message to the server and waits for a reply token. Upon receiving a request message, the server collects information tokens from the device drivers according to its configuration and combines these tokens to generate a "virtual information token," which includes the desired data. The token is then sent back to the application. This kind of message and data exchange facility can be implemented in several ways; a pipe, a UNIX interprocess communication channel, is chosen for the implementation of DUI.

As an example of using DUI, we show how to substitute a mouse and a keyboard for a DataGlove as the input device of a system (Figure 3 and Figure 4). This replacement is not unusual if we want to run a
virtual reality system on common platforms. A DataGlove provides data on 3D position and orientation together with a certain number of gestures. To simulate it with a 2D mouse, we choose, through the mouse control panel, the triad mouse method to provide the 3D position parameter and the ARCBALL method to provide the 3D orientation parameter of the virtual input device. The switch between the position and orientation simulations is assigned to "selection 4" of the virtual input device.
To simulate gestures, we assign different mouse and keyboard operations to different virtual device selections through the virtual device selection control panel (Figure 4). The set of mouse operations is decided by the chosen simulation method, and can consist of 2D gestures, button operations, or menu selections. Since DataGlove gestures are also assigned to virtual device selections in DUI, we have in effect created a map between mouse operations and DataGlove gestures. The application sees no difference between them unless it queries the corresponding device information-base. Of course, system performance is affected.
The insertion of a virtual input device layer between the input devices and the applications adds some overhead. Creating, modifying, allocating, activating, and stopping a virtual device, and creating the device information-base can all be seen as preprocessing. However, when the application requests data, it takes additional time to find the allocated virtual device, retrieve the device drivers, and combine the information tokens. Data and message exchange between the input device server in the background and the application in the foreground currently use the pipe facility of UNIX, which consumes extra time. It also takes time to query the device information-base when necessary and to periodically convert raw data into information tokens. However, experience with VolVis shows that this overhead, compared with the computation time or the physical input device lag, is negligible.
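As a rough sketch of the request/reply exchange described above, the fragment below writes a one-byte request down a UNIX pipe and blocks until the server's virtual information token arrives; the token layout, the request encoding, and the file descriptor setup are assumptions, since the paper does not give them.

    /* Application side of the token exchange with the input device server.
       Both pipes are assumed to have been created before the server was
       started in the background; the token layout is illustrative only.    */
    #include <unistd.h>

    typedef struct {
        double pos2d[2];
        double pos3d[3];
        double quaternion[4];
        int    selection;
    } VirtualInfoToken;

    /* request_fd: write end of the application-to-server pipe
       reply_fd:   read end of the server-to-application pipe               */
    static int request_token(int request_fd, int reply_fd, VirtualInfoToken *tok)
    {
        const char req = 'R';                 /* "send me a virtual token"  */
        if (write(request_fd, &req, 1) != 1)
            return -1;
        /* Block until the server has combined the drivers' information
           tokens into a virtual information token and written it back.     */
        ssize_t got = read(reply_fd, tok, sizeof *tok);
        return got == (ssize_t)sizeof *tok ? 0 : -1;
    }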
Figure 2: The implementation of DUI.
Figure 3: Physical device control panels. (See color plates, p. CP-14.)

Figure 4: Virtual device control panels. (See color plates, p. CP-14.)

6. Conclusions

DUI achieves full input device independence through the virtual input device paradigm. The incorporation of new input devices or interaction methods simply means the design of some low level device drivers. Also, users can find the most suitable interaction method for their specific applications by interactively trying different virtual device configurations with no effect on the application code. With the help of the so-called device information-base, whose contents can easily be expanded, applications are able to focus on the high level requirements for the input devices while fully utilizing the specific device capabilities.

7. Acknowledgements

This work has been supported by the National Science Foundation under grant IRI-9008109. Special thanks to Lisa Sobierajski and Rick Avila for their help with VolVis.
8. References
1. Avila, R. S., Sobierajski, L. M. and Kaufman, A. E., "Towards a Comprehensive Volume Visualization System", Visualization '92, Boston, MA, 1992, 13-20.
2. Blanchard, C., "Reality Built for Two: A Virtual Reality Tool", Computer Graphics, 24, 2 (March 1990), 35-36.
3. Bryson, S., "Virtual Environment Techniques in Scientific Visualization", Visualization '92 Tutorial, October 1992.
4. Chen, M., Mountford, S. J. and Sellen, A., "A Study in Interactive 3-D Rotation Using 2-D Control Devices", Computer Graphics, 22, 4 (1988), 121-129.
5. Foley, J. D., van Dam, A., Feiner, S. K. and Hughes, J. F., Computer Graphics: Principles and Practice, Addison-Wesley Publishing Company, 1990.
6. Green, M., "The DataPaper: Living in the Virtual World", Graphics Interface '90, May 1990, 123-130.
7. Kaufman, A., Yagel, R. and Bakalash, R., "Direct Interaction with a 3D Volumetric Environment", Computer Graphics, 24, 2 (March 1990), 33-34.
8. Nielson, G. M. and Olsen Jr., D. R., "Direct Manipulation Techniques for 3D Objects Using 2D Locator Devices", Proceedings ACM Workshop on Interactive 3D Graphics, Chapel Hill, NC, 1986, 175-182.
9. Shaw, C., Liang, J., Green, M. and Sun, Y., "The Decoupled Simulation Model for Virtual Reality Systems", CHI '92 Conference Proceedings, May 1992, 321-328.
10. Shoemake, K., "Animating Rotation with Quaternion Curves", Computer Graphics, 19, 3 (July 1985), 245-254.
11. Shoemake, K., "ARCBALL: A User Interface for Specifying Three-Dimensional Orientation Using a Mouse", Graphics Interface '92, 1992, 151-156.