Virtual Spinal Tap

Certain procedures are particularly hard to learn because watching the teacher does not tell you much. For common clinical procedures, we have been very successful in using short video snippets as an introduction to the technique, which frees up much more hands-on time for our learners.

Using our CURIOS video mashup tool, we have integrated many of these video snippets into our OLab scenarios. And our Clinisnips channel on YouTube continues to see a healthy 100+ visitors per hour.

For some procedures, however, there is not much to see. Take the spinal tap, a procedure for which most patients would rather not be the guinea pig: watching the needle being inserted into the back tells you very little about how to perform it. It is a matter of feel.

To address this need, we developed our Virtual Spinal Tap model, using HandShakeVR and the Phantom OMNI haptic device. You can access the model via our Virtual Spinal Tap case, which gives you a bit more context as to why you are performing this procedure. (Use “demo” as the keyword to unlock the case.)

The proSENSE team were so chuffed with our model that they featured it in their promotional material.

The Phantom OMNI haptic device has 6 degrees of freedom, with force feedback in 3 of those axes. This does not perfectly reproduce what a spinal tap feels like to the operator, but it does capture some key sensations, such as finding your way between the spinous processes or the tiny "pop" as the needle enters the spinal canal.
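
To give a flavour of how that "feel" is rendered, here is a minimal sketch of the kind of layered resistance model a haptic servo loop could use: each tissue layer pushes back with a different force, and crossing into the spinal canal produces the sudden loss of resistance that reads as a pop. The layer depths, forces, and the read_depth/send_force placeholders are all illustrative assumptions, not the proSENSE implementation.

```python
import time

# Hypothetical resistance profile along the needle track.
# Depths (cm) and forces (N) are illustrative, not our calibrated values.
TISSUE_LAYERS = [
    (1.0, 0.5),   # skin and subcutaneous fat: light resistance
    (3.5, 1.5),   # interspinous ligament: firmer, gritty resistance
    (4.0, 3.0),   # ligamentum flavum: peak resistance just before the pop
    (6.0, 0.2),   # spinal canal: sudden loss of resistance
]

def layer_resistance(depth_cm: float) -> float:
    """Resistive force (N) for the layer containing the needle tip."""
    for end_depth, force in TISSUE_LAYERS:
        if depth_cm <= end_depth:
            return force
    return 0.0  # needle has passed beyond the modelled region

def haptic_loop(read_depth, send_force, rate_hz: float = 1000.0):
    """Toy haptic servo loop: oppose the insertion with the local resistance.

    read_depth() and send_force() are placeholders for whatever device API
    is in use; real drivers run this as a hard real-time callback.
    """
    period = 1.0 / rate_hz
    while True:
        depth = read_depth()                  # current insertion depth (cm)
        send_force(-layer_resistance(depth))  # push back against insertion
        time.sleep(period)
```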

It provides a working volume of about 20 cm per side and is highly sensitive. The HandShakeVR software offers a rapid application development interface, which you can just see in the box diagram above. It also allowed us to import a 3-D volumetric model, based on a CT scan of a lumbar spine, which gave us terrific verisimilitude. We used OsiriX to edit and anonymize the DICOM data.
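
We did that editing interactively in OsiriX, but the same de-identification step can also be scripted. As a rough sketch (the tag list and file layout are assumptions, not our exact workflow), the pydicom library lets you blank the identifying fields across a whole series:

```python
from pathlib import Path
import pydicom

# Tags to blank out before sharing the lumbar spine series.
# This list is illustrative, not an exhaustive de-identification profile.
IDENTIFYING_TAGS = ["PatientName", "PatientID", "PatientBirthDate",
                    "ReferringPhysicianName", "InstitutionName"]

def anonymize_series(src_dir: str, dst_dir: str) -> None:
    """Copy every DICOM slice in src_dir to dst_dir with identifiers blanked."""
    out = Path(dst_dir)
    out.mkdir(parents=True, exist_ok=True)
    for path in sorted(Path(src_dir).glob("*.dcm")):
        ds = pydicom.dcmread(path)
        for tag in IDENTIFYING_TAGS:
            if tag in ds:
                setattr(ds, tag, "")     # blank the value, keep the element
        ds.remove_private_tags()         # vendor-specific tags often leak identifiers
        ds.save_as(out / path.name)
```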

One of the strengths of HandShakeVR is that it allows you to seamlessly pair two devices. This opened up the educational possibilities enormously, with dual haptic controls working on the same model. In Student-Teacher mode, the teacher can feel exactly what the student is feeling, and can also nudge and provide subtle force cues back to the learner to help guide them on the right path.

In Student-Examiner mode, the teacher can still feel exactly what the student is feeling, but the student does not feel the examiner at all, thereby preventing subtle but unwanted feedback.
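
In force terms, the difference between the two modes comes down to how the outputs are routed between the paired devices. The sketch below is our own illustration of that routing logic, not HandShakeVR code, and the blending weight is an arbitrary placeholder.

```python
from dataclasses import dataclass

@dataclass
class SharedState:
    model_force: float    # force computed from the shared tissue model (N)
    teacher_force: float  # force the teacher/examiner is applying (N)

def student_feels(state: SharedState, mode: str) -> float:
    """Force rendered on the student's stylus under each pairing mode."""
    if mode == "student-teacher":
        # The teacher's nudges are blended in as gentle guidance cues.
        return state.model_force + 0.3 * state.teacher_force
    if mode == "student-examiner":
        # The examiner is invisible: the student feels only the model.
        return state.model_force
    raise ValueError(f"unknown mode: {mode}")

def teacher_feels(state: SharedState) -> float:
    """In both modes, the teacher/examiner mirrors the tissue forces
    at the student's needle."""
    return state.model_force
```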

Einstein did his best to foil us with relativistic effects.

As summarized on this slide, because good haptic feedback needs to be cycled about 1000 times per second, there is a practical limit on the distance over which you can send and receive haptic data. The HandShakeVR software neatly circumvents this problem with data caching and smart interpolation.
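
The arithmetic behind that limit is worth spelling out, and the snippet below also sketches the interpolation idea in its simplest form: the local loop runs against a predicted remote position and corrects whenever a fresh packet arrives. The numbers and the linear predictor are back-of-the-envelope assumptions, not HandShakeVR's actual algorithm.

```python
# A 1 kHz haptic loop leaves roughly 1 ms per cycle for everything,
# including the network round trip to a remote partner device.
SPEED_IN_FIBRE_KM_PER_S = 200_000   # roughly two-thirds of c in optical fibre
CYCLE_BUDGET_S = 1.0 / 1000

max_one_way_km = SPEED_IN_FIBRE_KM_PER_S * CYCLE_BUDGET_S / 2
print(f"A round trip within one cycle limits the partners to ~{max_one_way_km:.0f} km apart")
# About 100 km, before counting routing, queuing and processing delays;
# hence caching and interpolation rather than a synchronous remote loop.

def predict_remote_position(last_pos: float, last_vel: float, dt_s: float) -> float:
    """Linear guess at the remote stylus position between network updates."""
    return last_pos + last_vel * dt_s
```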

We did find that simply plonking our learners down in front of the hardware did not lead to good learning designs. We used OpenLabyrinth scenarios to bookend our haptic models, to give the learners some context for why they were performing this procedure. We were able to provide some data integration between our Virtual Spinal Tap model and OpenLabyrinth. This was later included in some of our work on blended simulation in the HSVO project.

The Virtual Spinal Tap project was very successful. Sadly, because of system failures and faulty backups, the project data was lost for a few years. We are happy to report that we have since retrieved an archival copy of the project notes and data, which has now been published:

Topps, D., Korolenko, M., de Domenico, J., & Newhouse, D. (2018). Virtual Spinal Tap: using haptic data to learn procedures with feel. MedEdPublish. https://www.mededpublish.org/manuscripts/1547 ; https://doi.org/10.15694/mep.2018.0000076.1