Videos of William Chapin's CDR Projects


The videos on this page all represent projects at Stanford's Center for Design Research (CDR) related to William Chapin's thesis work.  The common thesis was intuitive human-to-data interaction, without an intermediary mapping.  DesignSpace was a design environment and tool that evolved from this quest.  The CyberGlove, a result of Jim Kramer's thesis work, was the only commercial product to come of the set of projects. 

Except where specifically noted, these videos were all produced between 1992 and 1994.  Except for the externally-linked Computer Chronicles and Discover Science pieces, all videos were produced at CDR by W. Chapin.

CyberGlove and VirtualHand



Stanford doctoral student James F. Kramer originally developed an instrumented glove as a means of recognizing hand gestures in American Sign Language (ASL).  His project was called the "Talking Glove", because it synthesized the recognized gestures into speech.  There were several newly developed instrumented gloves in the mid-to-late 1980s, but none had the degrees of freedom and repeatability required for ASL recognition.  Jim developed an electronic goniometer using differential strain gauges.  His use of strain gauges as a bend sensor was novel and was patented by Stanford University. 
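
The differential arrangement is what made the sensor practical, and a minimal Python sketch of the idea follows.  All gains and readings here are invented for illustration; this is not Kramer's actual calibration.

    # Two strain gauges on opposite faces of a flexible strip see equal
    # and opposite strain when the strip bends over a joint, but see the
    # same strain from temperature drift or axial stretch.  Subtracting
    # the two readings cancels the common-mode error; a calibration gain
    # maps the difference to a joint angle.  (All values illustrative.)

    def joint_angle_deg(top_gauge_mv, bottom_gauge_mv,
                        gain_deg_per_mv=0.9, offset_deg=0.0):
        """Convert a differential gauge reading (millivolts) to a bend angle."""
        differential_mv = top_gauge_mv - bottom_gauge_mv  # common-mode cancels
        return gain_deg_per_mv * differential_mv + offset_deg

    # A bent joint might read +40 mV on the stretched face and -40 mV on
    # the compressed face, giving 72 degrees with this (invented) gain.
    print(joint_angle_deg(40.0, -40.0))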

W. Chapin met J. Kramer in 1990 within Professor Larry Leifer's graduate student group.  Chapin immediately saw the potential for the Kramer glove to enable manipulation in Virtual Reality.  He developed a driver that connected computers to the glove instrumentation and read the joint angles.  He then developed a virtual hand model, based on biomechanical analysis, that could be driven dynamically from the joint angles. 
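
To give a flavor of what "driven dynamically from the joint angles" means, here is a minimal planar forward-kinematics sketch in Python.  The segment lengths and the single-finger simplification are illustrative assumptions, not the actual hand model.

    import math

    # One finger as a chain of phalanges: each segment rotates relative
    # to the previous one by its measured joint angle.  Accumulating the
    # rotations and segment offsets yields the fingertip position, which
    # is the essence of posing a hand model from glove joint angles.
    PHALANX_LENGTHS_CM = [4.5, 2.5, 1.8]  # proximal, middle, distal (assumed)

    def fingertip_position(joint_angles_deg):
        """Forward kinematics for one finger in its plane of flexion."""
        x = y = 0.0
        cumulative = 0.0
        for length, angle in zip(PHALANX_LENGTHS_CM, joint_angles_deg):
            cumulative += math.radians(angle)
            x += length * math.cos(cumulative)
            y += length * math.sin(cumulative)
        return x, y

    # Example pose: 30, 45, and 10 degrees of flexion at the three joints.
    print(fingertip_position([30.0, 45.0, 10.0]))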

Together, Kramer and Chapin co-founded Virtual Technologies in the summer of 1990 to commercialize the glove for virtual reality, naming it the "CyberGlove".  The CyberGlove is still available as a product from CyberGlove Systems. 

TeleSign



Within CDR, Kramer, Chapin, Leifer, and Cathy Haas teamed up to tackle a different challenge for individuals with disabilities: how can deaf individuals "talk on the phone" to one another?  The TeleSign solution required a computer and modem at each location and one phone line.  Video transmission over a single twisted-copper pair was not supported until DSL matured in the late 1990s.  TeleSign transmitted only the 32 bytes of joint angles and tracking data per frame; on a 9600 baud modem, a data rate of 30 frames per second could be sustained, bi-directionally.  Deaf individuals do rely on facial expressions, but ASL supports expressions through gestures. 
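
The 30 frames per second figure follows directly from the line rate.  The back-of-envelope check below assumes the common 8-N-1 serial framing (one start bit, eight data bits, one stop bit), which is our assumption rather than a documented TeleSign detail.

    BAUD = 9600          # modem line rate, bits per second
    FRAME_BYTES = 32     # joint angles plus tracking data per frame
    BITS_PER_BYTE = 10   # 8-N-1 framing: start + 8 data + stop

    # 32 bytes * 10 bits = 320 bits per frame; 9600 / 320 = 30 frames/s.
    frames_per_second = BAUD / (FRAME_BYTES * BITS_PER_BYTE)
    print(frames_per_second)  # 30.0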

Unlike the Talking Glove, which recognized only static hand gestures, TeleSign transmitted dynamic gestures for human recognition. 

Virtual Manipulation



The "manipulation" clip is an early demonstration of virtual grasp.  Most previous "gloves and goggles" demonstrations employed hand gestures to enable grabbing a virtual object.  This development estimated virtual normal force vectors from the fingertips that contacted a virtual object.  When 3 opposing vectors created a stable grasp of an object, it would "break loose" from its static position and be dynamic. 

The video was filmed first-person within DesignSpace, which employed motion-parallax 3D projection.  The user could never see the virtual hand model on the screen, because their real hand occluded the view.  This was a problem when virtual objects were supposed to appear between the hand and the user's viewpoint. 

Immersive Remote Collaboration



Multiple-person design collaboration was a primary objective of DesignSpace.  The video here was captured in late 1993, after DesignSpace was first demonstrated at SIGGRAPH '93 in Los Angeles, and before its final demonstration at SIGCHI '94 in Boston.  Gloves were not used in this video for dexterous manipulation, likely because they were a shared resource. 

The primary DesignSpace workstation, manned by W. Chapin in this video, was located in the CDR Bldg 560 conference room.  The secondary DesignSpace workstation, manned by T. Lacey (depicted by avatar), was located in another Mechanical Engineering building.  The connection was via two analog phone lines: one carrying voice and the other carrying 14.4 kbaud modem data.  All computer graphics were rendered locally on a 486 PC with Division dView graphics boards.  The only information transmitted was object position. 
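
As an illustration of why position-only updates fit comfortably on such a link, here is a hypothetical fixed-size update record in Python.  The wire format is our invention for illustration; the actual DesignSpace protocol is not documented here.

    import struct

    # One update: an object id plus position and orientation, packed into
    # a fixed 26-byte record so every update has a predictable cost.
    UPDATE_FORMAT = "<H3f3f"  # id, x/y/z position, roll/pitch/yaw
    UPDATE_BYTES = struct.calcsize(UPDATE_FORMAT)  # 26 bytes

    def pack_update(obj_id, position, orientation):
        return struct.pack(UPDATE_FORMAT, obj_id, *position, *orientation)

    packet = pack_update(7, (0.1, 1.5, -0.3), (0.0, 0.7, 0.0))
    # At 14.4 kbaud with 10 bits per byte, roughly 1440 bytes/s cross the
    # line, allowing about 1440 / 26, i.e. ~55 such updates per second.
    print(UPDATE_BYTES, len(packet))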

The Internet existed, but did not yet support voice over Internet Protocol (VoIP).  The World Wide Web existed, but was in its infancy (perhaps 100 web servers in the world). 

Two-handed Immersive Presence



A demonstration of the holy grail for a dexterous-manipulation researcher: two 22-sensor CyberGloves.  The availability of two fully sensored instrumented gloves was a rarity for a cash-strapped research project. 

This video shows the donning of the gloves and the resulting avatar, viewed from a fixed perspective. 

VirtualGrasp



This "virtual grasp" video was likely shot in the same session as the two-handed video above, as it features the same pair of 22-sensor CyberGloves.  Video is shot from near first-person, which shows how the physical hands occlude the virtual hands.  This is the only recording of free-form deformation from finger manipulation. 
Discover Science




Discover Magazine's "The World of Science" program, hosted by Peter Graves, presents a special focus on two Stanford projects, "Dexter" and "Talking Glove", that showed promise in helping individuals with speech, hearing, or vision impairments interact with the everyday world. 

Key segments:

  • 1:25 - 2:45 Dexter II
  • 5:13 - 10:35 Talking Glove

The Dexter project was not done within the Center for Design Research, but it was closely related, also under Larry Leifer.  Read David Jaffe's history of Dexter here.

The video probably dates from ~1989.  W. Chapin does not appear in it, nor does it depict any of his work; it depicts related pre-existing CDR projects from before his arrival. 

Computer Chronicles




Computer Chronicles, hosted by Stewart Cheifet, presents an edition on Virtual Reality that features several of W. Chapin's contributions, although Chapin himself does not appear in the video. 

Key segments:

  • 1:25 - 2:45 Jim Kramer's Talking Glove
  • 3:33 - 4:04 Crystal River Engineering's Scott Foster and 3D audio
  • 5:10 - 13:38 Autodesk's CyberSpace Project with Chris Allis
  • 15:30 - 23:30 Virtual Technologies' VirtualHand

W. Chapin began working with Autodesk's CyberSpace Project in 1988, which was the catalyst for his transfer to Stanford University in 1989.  The virtual world and virtual room acoustics within CRE's demonstration were entirely coded and modeled by W. Chapin.  The VirtualHand model and controller were the contribution of W. Chapin, while Jim Kramer was responsible for the CyberGlove and gesture recognition.  Jim Helman ported W. Chapin's VirtualHand to the Silicon Graphics GL and X/Motif platform.  Larry Edwards developed the 3Form 3D modeling tool on SGI. 
