
November 2, 2020

Recommendations: ProTactile Sign Languages

I wanted to share some interesting links. Within linguistics, we recognize that a language can have different modalities: spoken, signed, and arguably written. But I recently learned that languages can also have a tactile modality.

ProTactile American Sign Language (ASL) developed in 2008 as a way for deafblind people to communicate with each other. One of its most notable features, compared to visual ASL, is that signers hold each other's hands as they sign, and listeners tap or lightly scratch the signer to show their emotional reactions.

There does not appear to be a lot of linguistic research on ProTactile ASL or other ProTactile languages. The potential for ProTactile languages to help deafblind people communicate with each other, as well as with non-deafblind people, is enormous. It would be wonderful for future researchers to look into things like how to teach ProTactile languages, stories from deafblind people about themselves and their communities, and the differences and similarities between languages where tactile elements are critical and languages where they aren't.

Further reading and watching:

ProTactile ASL, a video by Quartz.

A video presentation at the Deaf Theatre Action Planning Session on ProTactile ASL, as well as an explanation of "backchanneling."

ProTactile: Touch Language Techniques, a video by Seek the World.

The story of Heather Lawson (video), an Australian advocate for deafblind people. She uses Tactile Auslan augmented by haptics, which is similar to backchanneling.

An interview with Terra Edwards, a Ph.D. student in Anthropology.

An honors thesis by Alissa McAlpine: "Keep in Touch: A Comparative Analysis of Visual and ProTactile American Sign Language."
