
Haptic Interface

Research at the University of Colorado



Contents

Haptic Interface Hardware

Computer Hardware

Computer Software

Visualization Examples

Haptic Perception Studies

Personnel Involved

Publications

Related Literature

Related Web Sites

Overview

The Haptic Interface Project began in 1991 at the University of Colorado with the design and development of a high-fidelity haptic interface. This interface provides 5 or 6 degrees of freedom, a large range of motion, and accurate specification of the forces applied to the user's fingers through a stylus. Through support from the National Science Foundation and the Office of Naval Research, the haptic interface has been augmented with a graphics capability to provide an integrated visual/haptic interface facility. This facility is currently being used to conduct research in human-computer interfaces, scientific visualization, and human haptic perception.


Figure 1:  The University of Colorado Visual-Haptic Interface, shown in use for visual/haptic exploration of data in scientific visualization.
 

The combined visual/haptic interface in its current configuration provides a research testbed for exploring data rendering techniques in scientific visualization. The user operates the haptic interface by grasping the stylus in a precision (pencil) grip, with the elbow resting on a padded surface. As the user moves the stylus, a yellow arrow (glyph) on the screen moves accordingly. This is similar to the common mouse and screen pointer, except that three degrees of freedom in translation (x, y, z) and two degrees of freedom in orientation (pitch and yaw) are provided simultaneously.

Data corresponding to the physical location of the glyph tip is rendered by various mappings (haptic rendering modes) associating data values with forces applied to the stylus. The same data is also rendered visually by various graphics techniques (visual rendering modes). Shown above is a case study in the visualization of vector fields, using a display of streamlines indicating magnetic flux lines in an electromagnetic actuator. Haptic rendering in this case includes an "orientation constraint", in which torques are applied to the stylus to make it point along the local streamline at the glyph tip. Forces are also applied to push the stylus toward a selected streamline. Motion along the streamline remains free, so the stylus feels like a "bead on a wire". The figure below shows how the stylus forces F and torques Tau are based on the measured deflection D and angle Theta relative to the selected streamline's position and local orientation. The "bead on a wire" effect is thus created by virtual damped springs centered on the selected streamline. See publications [7] and [10] for more details.
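In symbols, using the quantities shown in Figure 2, the damped-spring law has the form below; the gains Kp, Bp, Kr, Br are illustrative placeholders, since specific values are not given on this page.

    F   = -Kp * D     - Bp * dD/dt
    Tau = -Kr * Theta - Br * dTheta/dt

Both the deflection D and the misalignment Theta are pulled toward zero, while motion along the streamline itself is unopposed.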
 
 


Figure 2:  Bead-on-a-wire haptic rendering mode.

Haptic Interface Hardware

Figure 3:  Close-up of the stylus grip of the Haptic Interface with 5 DOF enabled. Three actuator rods connect at the left end of the stylus, and two rods connect at the right. The grip of the stylus contains buttons for selecting functions during interaction with data. Within the approximately spherical workspace of diameter 0.3 m, positions (x, y, z) and orientations (pitch, yaw) are determined from rod length measurements to a resolution of 7 microns. The parallel actuation scheme places all actuator modules so they provide forces directly between "ground" and the user's fingers, enabling high-bandwidth transmission of forces in each DOF.
Figure 4:  Close-up of one of the six actuator units, containing a motor which propels the actuator rod through a simple friction drive transmission, and an optical encoder for rod length sensing. Motor and rod are smoothly coupled by preloaded rolling elements, eliminating backlash and cogging while providing an extremely stiff link that enables high-bandwidth transmission of forces to the user's fingers. Each actuator is mounted on a 2 DOF gimbal, with its center of rotation at the point where the rod contacts the motor drive wheel, so that no bending moments are applied to the rod.
Figure 5:  Miniature 3 DOF gimbals on the distal end of each actuator rod transmit only axial forces from the actuator to the stylus. The left end of the stylus is shown, with the yellow finger grip and mode selection switches. Size is indicated by the dime on the left. The stylus is free to rotate about its long axis (this degree of freedom is neither sensed nor controlled).
Figure 6:  Miniature 3-axis force sensors (center) are used to measure the forces transmitted to the stylus, and in a force feedback loop to provide high-bandwidth, high-fidelity tracking of force commands [4]. Measured Cartesian residual friction is below 0.08 N, with maximum forces above 8.0 N, yielding a dynamic range of more than 100:1. A steel core (right) is instrumented with 4 silicon strain gages, which are calibrated in the computer to measure axial force while rejecting transverse forces. A dime is included (left) for scale.
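As a rough illustration of the role of this force feedback loop, the fragment below shows a generic proportional-integral tracking law with feedforward; the controller actually used is derived in [4], and the function name and gains here are hypothetical.

    /* Illustrative per-rod force-tracking law (generic PI with feedforward,
     * standing in for the controller derived in [4]). The gains kp, ki and
     * the integrator state are hypothetical. */
    double rod_force_command(double f_cmd, double f_meas,
                             double *integ, double dt,
                             double kp, double ki)
    {
        double err = f_cmd - f_meas;       /* error from the 3-axis force sensor */
        *integ += ki * err * dt;           /* integral action removes steady offsets */
        return f_cmd + kp * err + *integ;  /* command to the motor power amplifier */
    }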

Computer Hardware


Two computers support haptic/visual rendering. The visual component is provided by a Silicon Graphics Onyx2 with a RealityEngine2 graphics pipeline, wide-format monitor, and stereo shutter eyewear. The haptic rendering is provided by a 5-processor digital signal processing computer based on the Texas Instruments TMS320C40 floating-point processor. Computer I/O boards interface with signal conditioning and power amplifier electronics, which connect to the haptic interface sensors and actuators. A PC hosts the DSP processors, provides software development tools, and links the haptic and visual computers via a serial line to synchronize the haptic and visual displays.


Figure 7:  Physical layout of the Haptic Interface Testbed, showing display hardware, computers, and interface electronics mounted on a modified computer workstation table.


Figure 8:  The haptic computer hardware is hosted on the ISA and PCI busses of a PC, but data is communicated between processors and I/O on internal high-speed busses. 


Computer Software

Figure 9:  (Picture of AVS visual programming screen, from Utah talk)

The software package used for visual rendering on the Silicon Graphics Onyx2 is Advanced Visual Systems' Express/Developers Edition. Utilizing a visual programming interface, Express provides a wide variety of standard visualization functions, such as 3D volume and surface rendering, slicing, contouring, texturing, and glyphing. This provides a basic set of visual rendering modes for the visual portion of our research. The Developers Edition also allows custom rendering modes to be developed by combining functions, both existing functions and original functions written in C, into an application. In order to fully integrate AVS/Express into the Haptic Interface, the X-Windows drivers on the Onyx2 were modified to pass the position (represented by Cartesian coordinates), orientation (represented by a quaternion), and button clicks from the haptic interface to AVS/Express. With these modifications, the haptic interface can be used either as a window-system pointer, replacing the mouse, or as a combined 5 DOF pointer within AVS/Express's viewer window and 5 DOF haptic display. The yellow arrow seen within the viewer window in Figure 1 above is the 5 DOF pointer; its position and orientation match those of the stylus as the user moves it through the data.
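A sketch of the per-update record implied by this description is shown below; the field names and layout are hypothetical, since the modified driver's actual protocol is not documented on this page.

    /* Hypothetical stylus-state record passed from the haptic computer to
     * the modified X-Windows driver and on to AVS/Express. */
    typedef struct {
        double pos[3];     /* stylus tip position (x, y, z), Cartesian coordinates */
        double quat[4];    /* stylus orientation as a unit quaternion */
        unsigned buttons;  /* bit mask of the stylus mode-selection buttons */
    } StylusUpdate;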

Figure 10:  (Picture of haptic software architecture, from vis00 paper)

Haptic rendering software is written in C and implemented on the DSP processors using the Parallel C real-time operating system. This software determines the Cartesian position and orientation of the stylus from rod length measurements [8], uses this to interpolate the data field and determine data values at the current probe location [3], computes Cartesian forces and torques to haptically render the data according to the selected rendering mode [5],[7],[10], computes force commands for each actuator rod, and computes force control laws based on sensed rod force [4]. Presently, these computations are repeated at a 2 kHz rate, providing high-quality rendering of the data, free from the corrupting effects of computation delay and undesired electro-mechanical dynamics.
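Schematically, one pass of this 2 kHz cycle looks like the following sketch; all routines are illustrative stubs standing in for the project's unpublished DSP code.

    #define NDOF 5  /* x, y, z, pitch, yaw (5 DOF configuration) */

    /* Stubs; each stands in for a real DSP-side routine. */
    static void read_rod_lengths(double l[NDOF]) { (void)l; }               /* optical encoders */
    static void rod_lengths_to_pose(const double l[NDOF],
                                    double q[NDOF]) { (void)l; (void)q; }   /* kinematics [8] */
    static void sample_data_field(const double q[NDOF],
                                  double *v) { (void)q; *v = 0.0; }         /* interpolation [3] */
    static void render_forces(const double q[NDOF], double v,
                              double F[NDOF]) { (void)q; (void)v; (void)F; }/* modes [5],[7],[10] */
    static void pose_to_rod_forces(const double q[NDOF], const double F[NDOF],
                                   double f[NDOF]) { (void)q; (void)F; (void)f; } /* rod mapping */
    static void rod_force_servo(const double f[NDOF]) { (void)f; }          /* force control [4] */

    void haptic_cycle(void)          /* invoked at 2 kHz */
    {
        double l[NDOF], q[NDOF], F[NDOF], f[NDOF], v;
        read_rod_lengths(l);         /* rod lengths from encoder counts */
        rod_lengths_to_pose(l, q);   /* Cartesian position/orientation of the stylus */
        sample_data_field(q, &v);    /* data value(s) at the current probe location */
        render_forces(q, v, F);      /* Cartesian forces/torques for the selected mode */
        pose_to_rod_forces(q, F, f); /* force commands for each actuator rod */
        rod_force_servo(f);          /* closed-loop tracking of rod forces */
    }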


Visualization Examples

Our current focus in the Haptic Interface Project is on enhancing an individual's ability to understand physical systems, devices, or processes. Computer models of these physical systems are used to generate data describing their structure and behavior. This data is then displayed in various ways for human interpretation. Traditionally, visualization software such as AVS/Express is the only means of data display, leading to the use of "data visualization" as a synonym for "data understanding". While visualization software packages are an invaluable tool, they have one great handicap: the two-dimensional computer screen. In order to see inside a volume of data, the analyst is forced to use simplifying visual display modes such as isosurfaces and slices. Techniques for viewing vector fields are simplified even further to keep the image from becoming too cluttered to understand. We are developing modes of haptic data rendering which combine with visual display modes in a synergistic way, so that the unique abilities of the haptic sense complement those of vision.

An example of vector field visualization is shown below, where the magnetic flux density vector B in a Lorentz force actuator is rendered visually by streamlines (advection) of the vector field emanating from a grid of seed points. Color indicates the magnitude of the vector along each streamline (red highest, blue lowest). In this application, the 3D shape of the vector field in the central blue area is of interest, since it dictates the force on a current-carrying coil located in this area.
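For reference, each streamline is traced by integrating the normalized vector field away from its seed point. The minimal fixed-step tracer below illustrates the idea; B_field() is a hypothetical accessor for the interpolated flux density, and a production tracer would use a higher-order, adaptive integrator.

    #include <math.h>

    /* Hypothetical accessor: interpolated flux density B at point x. */
    extern void B_field(const double x[3], double B[3]);

    /* Trace a streamline from x with fixed Euler steps of length h,
     * reporting each new vertex through emit(). */
    void trace_streamline(double x[3], double h, int nsteps,
                          void (*emit)(const double p[3]))
    {
        for (int i = 0; i < nsteps; ++i) {
            double B[3];
            B_field(x, B);
            double m = sqrt(B[0]*B[0] + B[1]*B[1] + B[2]*B[2]);
            if (m < 1e-12) break;         /* field vanishes: stop tracing */
            for (int k = 0; k < 3; ++k)
                x[k] += h * B[k] / m;     /* step along the local field direction */
            emit(x);                      /* record the new streamline vertex */
        }
    }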


Figure 11:  Magnetic flux density vector in a Lorentz force actuator.
 

A close-up of this region of interest is shown below. While the direction of the vector field components in the viewing plane is clear, the out-of-plane components are difficult to discern. Moving the viewpoint and using stereo can help, but it remains difficult to see the detail without excessive clutter, where streamlines overlap so much that their structure is obscured. The yellow arrow in this view corresponds to the stylus of the haptic interface. A mode which has proved useful in this application is the "bead-on-a-wire" described above, in which forces and torques applied to the stylus emulate springs centered on the location and orientation of the streamline. If the user does not resist these forces and torques, the hand glides along the streamline. We have found that this kinesthetic feedback, particularly in orientation, helps the user pick out the streamline among the clutter on the screen, and provides out-of-plane information not available from the visual display. See [7], [10] for more detail.

Figure 12:  Close-up of region of interest from Figure 11.
 

Fluid dynamics provides another class of visualization problems. Shown below is the flow field around a hypersonic aircraft called a Waverider. The vehicle is shown in gray, with the nose to the left. A half-space of flow data is also shown on the right side of the vehicle (the left is similar due to symmetry). Color in this plot indicates air density within a band around the free-stream density. This provides information on the structure of shock surfaces, which are characterized by locations in the flow where density (and other flow variables) change suddenly.

Figure 13:  Waverider hypersonic vehicle, showing the shock wave containing a high-pressure volume on the underside (shown on the right half of the centerline only).
 

A more direct rendering of shock surfaces can be obtained by displaying the isosurface corresponding to a large density-gradient magnitude (as seen in Figure 13). Typically, such displays make the location and shape of a shock wave clear on the slicing planes, but not in the interior of the data, and multiple slices are often necessary to reconstruct the 3D shape of interest. We are currently investigating haptic modes of rendering shocks (see [5], [7]), which create virtual surfaces from density-gradient data, allowing one to slide along the surface of the shock with the haptic interface stylus to help clarify its shape and to locate any secondary shocks that may be obscured in the visual view.
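One simple way to build such a virtual surface is a first-order "spring to a level set": push the probe toward the surface s(x) = s0 along the local gradient of s, leaving tangential motion free so the stylus can slide along the surface. The sketch below is a generic scheme of this kind, not necessarily the exact rendering used in [5]; here s could be density or density-gradient magnitude.

    /* Spring-to-level-set force: F pushes the probe toward s(x) = s0. */
    void level_set_force(double s, double s0, const double grad[3],
                         double k, double F[3])
    {
        double g2 = grad[0]*grad[0] + grad[1]*grad[1] + grad[2]*grad[2];
        if (g2 < 1e-12) {                 /* nearly flat field: no surface here */
            F[0] = F[1] = F[2] = 0.0;
            return;
        }
        /* (s - s0)/|grad s| is a first-order estimate of the distance to the
         * surface; the force is a spring along the unit gradient direction. */
        double scale = -k * (s - s0) / g2;
        F[0] = scale * grad[0];
        F[1] = scale * grad[1];
        F[2] = scale * grad[2];
    }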

In addition to shocks, vortex structures can be difficult to visualize in fluid dynamics models. The figure below shows a simple artificial flow field containing several vortices. Again, streamlines are used to illustrate the vector field, but the complexity of the flow makes vortex structure and location difficult to see. More on this to come...


Figure 14:  Vortex core identification mode.


Haptic Perception Studies

We have conducted tests using the haptic interface to better understand human haptic perception of data rendering elements that can be conveyed by haptic interfaces involving dexterous, finger-type grips on a stylus or tool.

Hardness Perception

We have carried out experimental tests which demonstrate that conventional requirements for high-quality virtual surface rendering do not match well with human haptic perception. Based upon tests with 49 volunteer subjects, we have gathered strong evidence that humans differentiate hard virtual surfaces primarily based on a quality we defined as Rate-Hardness: the ratio of the initial rate of change of force to the initial velocity upon penetrating the surface [6]. This is in contrast to the conventional view that high surface stiffness is the salient characteristic that makes a surface feel hard. High values of Rate-Hardness can be achieved by adding damping-type behavior to the virtual surface, which has the distinct advantage of enhancing the stability of the control system implementing the surface. In contrast, stable, high-stiffness control is difficult to achieve in multi-degree-of-freedom, large range-of-motion haptic interfaces. Hence, we have discovered a nuance of human haptic perception which can be exploited to improve performance and reduce costs in haptic interfaces. In effect, we now understand how to create a haptic illusion of hard surfaces using dynamic compensation in software, considerably reducing the requirements on mechanical and electronic components in the overall system.
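In symbols (notation ours; see [6] for the precise definition and experimental protocol):

    Rate-Hardness  H = (dF/dt at t = 0+) / (v at t = 0+)

where t = 0 is the instant the stylus penetrates the surface. For a pure stiffness wall F = k*x, dF/dt = k*v and H = k, which is why stiffness has traditionally served as the proxy for hardness; a damping term b*dx/dt instead contributes a force component b*v(0) immediately upon contact, sharply raising the initial force rate without requiring a large k.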

Friction Perception

We have carried out a second experimental analysis to explore human perception thresholds for friction. Resistance to motion in haptic interfaces causes unwanted corruption of rendered data, so quantifying perceptual thresholds for this friction is important as a quality measure and design metric for these devices. We conducted an experiment with 41 volunteer subjects to determine sensory thresholds for our haptic interface, using pairs of friction-reducing force controllers to vary the level of residual friction presented to users [9]. We identified key difficulties in determining these perceptual thresholds and also proposed a series of stylus-grip specific measurements that would provide the fundamental basis needed to formulate a friction perception model. This model combines voluntary motions, haptic interface dynamics, and human sensory channel frequency responses to predict when sensory thresholds would be exceeded by voluntary motion of any particular device. Such a model could be used to compare the quality of haptic interfaces in terms of residual, sensory-meaningful friction. As a design tool, it would also help in deciding when additional improvements are likely to pay off with significant reductions in perceived friction.


Publications

[1] C. D. Lee, D. A. Lawrence, and L. Y. Pao. "Dynamic Modeling and Parameter Identification of a Parallel Haptic Interface," Proc. 10th Annual Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, IEEE Virtual Reality Conference, to be held in Orlando, FL, Mar. 2002. (PS file, PDF file)

[2] R. Y. Novoselov, D. A. Lawrence, and L. Y. Pao. "Haptic Rendering of Data on Unstructured Tetrahedral Grids," Proc. 10th Annual Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, IEEE Virtual Reality Conference, to be held in Orlando, FL, Mar. 2002. (PS file, PDF file)

[3] R. Y. Novoselov, D. A. Lawrence, and L. Y. Pao. "Haptic Rendering of Data on Irregular Grids," Proc. 9th Annual Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, ASME International Mechanical Engineering Congress and Exposition, Orlando, FL, Nov. 2000. (PS file, PDF file)

[4] C. D. Lee, D. A. Lawrence, and L. Y. Pao. "A High-Bandwidth Force-Controlled Haptic Interface," Proc. 9th Annual Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, ASME International Mechanical Engineering Congress and Exposition, Orlando, FL, Nov. 2000. (PS file, PDF file)

[5] D. A. Lawrence, C. D. Lee, L. Y. Pao, and R. Y. Novoselov. "Shock and Vortex Visualization Using a Combined Visual/Haptic Interface," Proc. IEEE Conference on Visualization and Computer Graphics, Salt Lake City, UT, Oct. 2000. (PS file, PDF file)

[6] D. A. Lawrence, L. Y. Pao, A. M. Dougherty, M. A. Salada, and Y. Pavlou. "Rate-Hardness: A New Performance Metric for Haptic Interfaces," IEEE Transactions on Robotics and Automation, 16(4): 357-371, Aug. 2000. (PS file, PDF file)

[7] F. Infed, S. W. Brown, C. D. Lee, D. A. Lawrence, A. M. Dougherty, and L. Y. Pao. "Combined Visual/Haptic Rendering Modes for Scientific Visualization," Proc. ASME Dynamic Systems and Control Division, DSC-Vol. 67, pp. 93-99, ASME Int. Mech. Engr. Cong. & Expo., Nashville, TN, Nov. 1999. (PS file, PDF file)

[8] C. D. Lee, D. A. Lawrence, and L. Y. Pao. "Guaranteed Convergence Rates for Five Degree of Freedom In-Parallel Haptic Interface Kinematics," Proc. IEEE Conference on Robotics and Automation, Vol. 4, pp. 3267-3274, Detroit, MI, May 1999. (PS file, PDF file)

[9] D. A. Lawrence, L. Y. Pao, A. M. Dougherty, Y. Pavlou, S. W. Brown, and S. A. Wallace. "Human Perceptual Thresholds of Friction in Haptic Interfaces," Proc. ASME Dynamic Systems and Control Division, DSC-Vol. 64, pp. 287-294, ASME Int. Mech. Engr. Cong. & Expo., Anaheim, CA, Nov. 1998. (PS file, PDF file)

[10] L. Y. Pao and D. A. Lawrence. "Synergistic Visual/Haptic Computer Interfaces," Proc. Japan/USA/Vietnam Workshop on Research and Education in Systems, Computation, and Control Engineering, Hanoi, Vietnam, pp. 155-162, May 1998. (PS file, PDF file)

[11] D. A. Lawrence, L. Y. Pao, M. A. Salada, and A. M. Dougherty. "Quantitative Experimental Analysis of Transparency and Stability in Haptic Interfaces," Proc. ASME Dynamic Systems and Control Division, DSC-Vol. 58, pp. 441-449, ASME Int. Mech. Engr. Cong. & Expo., Atlanta, GA, Nov. 1996. (PS file, PDF file)

Related Literature

(link to web database)

Related Web Sites

Haptic Research at Northwestern University
haptics-e, an online haptics journal.

Advanced Visual Systems Homepage
The International AVS Centre
Silicon Graphics Inc.
StereoGraphics Corporation




webmaster: webmaster@osl-www.colorado.edu