Touch Language
TABLE OF CONTENTS
• Table of Contents
• Introduction
• Example - The Electronic TV and video for the Blind and Deafblind (eTV)
• Accepting Touch Language
• Touch Language Preliminaries
• Type of Information Provided
• Touch Language Parameters
• Approach Philosophy
• Descriptive Groups
• THE GROUPS
• Group Delivery Receptors
• Static elements
• Definitions
• Screen Coordinates
• Preliminary Definitions
• Geographic (Screen) Definitions
• Operational Impacts Definitions
• Static Parameters
• Forming Shapes
• Dynamic Parameters
• Idiosyncrasies of TV Stations and Video Broadcasting
• Activity continued beyond the PalmScreen
• Screen Change Variable
• Beginning and Ending of Commercials
• Multiple Information on TV Screen
• Foreign Language Translated at Bottom of Screen
• Background Music Creating Suspense
• Station ID Number
• Time Display
• Problems in Broadcast reception
• Dark Screen Due to Problem in TV Reception
• “Snow” Screen Due to Problem in TV Reception
• Interruption by the Emergency Broadcasting Service
• Understanding the TV Guide Channel on Screen
• The Remote Control
• Definitions
• Dynamic Parameters
• Basic Dynamic Parameters
• Combinatory Dynamic Parameters
• Attributes for Mechanical Messages
• Attributes of Touch Language
• Group Delivery Receptors
• Back of Fingers as Receptors
• Groups that Represent Personal Character
• Sub-groups that Represent Gender
• Examples of Signal Delivery
• Face of Fingers as Receptors
• Utility of the Back of the Hand
• Group Representation Utilizing Face of the Fingers
• Language Representation
• How it all Works
• The Face of the Pointer Finger with Nibbles (People's Finger)
• Examples of Use
• Examples
• The Face of the Middle Finger with Nibbles (Lighting Finger)
• The Face of the Fourth Finger with Nibbles (Description Finger)
• The Thumb
• Thumb Control and Alert Function
• Thumb of Cradle in Middle Position
• The Thumb as a Passive Receptor
• The Thumb Cradle
• Thumb of Cradle in Lateral Positions
• The Face of the Fifth (Pinky) Finger with Nibbles (Cross Relationship Finger)
• Female representation
• TV channel query and selection
• Summary of Receptors Versus Group Representation
• Functional Equivalence of Sentence Information
• Touch Language Utility for eCane Usage
• Touch Language Modes for the eCane
• Putting It All Together
• The Universality of Touch Language
• The Cultural Aspect
• Multiple Meaning in Visual Human Signs
• Multiple Perceptions in Interpretation
• Translating Multiple Perceptions
• Relevancy Issues
• Reversed Multiple Perceptions
• Rules of Association
• Synchronization
• The Rules
• Translation Mechanism
• The Touch Language Principles
• Key Words – revisited
• Functional Equivalent Acronyms
• Usage Particulars
• Scene Locality
• Process Principles
• Delimiters
• The Dialog Segment
• The Dialogue Information
• The Dialogue Reduced State (DRS)
• Summary of Functional Equivalent Touch Language
• Touch Language Utility
• Functional Equivalent Sound Effects
• Vibrations and Frequency
• Nibbles and Frequency
• The Other Hand
• The Left Hand Pinky
• Grammatical Tense
• Direction and Cause/Receipt Designation
• The Person Hand
• The Object Hand
• Intent and Action
• Connotations and Modifiers
• Direction and Cause/Receipt Designation
• Object and Person Hand
• The Person Hand
• The Object Hand
• Intent & Action
• Connotations & Modifiers
• The Aggression Group
• Neutral and Matter of Fact Group
• The Non-aggressive Group
• Physical Features in Touch Language
• Mechanics of Articulating Physical Features
• Characterization
• Magnitude
• Location Exchange Action and Stereo Sound
• Location Exchange
• Stereo Sound
• Localization of Scene Variables
• Geographic Localization
• Scene Inclusion Variables
• The In-Wait Mode Variables
• Value Indicator for Variables
• Environmental Scene Variables
• Conceptual Localization
• Multiple-Screen Variable
• Teaching Touch Language to the Deafblind
• The Language
• The Procedure
• Touch Language and Sensory Integration
• The Vestibular and Proprioceptive senses
• The Relevance for Touch Language
• Learned interpretations
• Tools of Avoidance
• The Non-entertainment Segment
• The Touch Language Elements
• Emotional States
• The Back of the non-dominant Hand Revisited
• The Pause and Continue Command
• Identifying the Speaker
• Geographical Placement of Speaking Parties
• Format
• Number Systems in Touch Language
• Magnitude
• The Back of the Third Finger (Magnitude)
• Face of the Third Finger (Magnitude)
• Fuzzy Magnitude
• Fuzzy and Undetermined Articulation
• Fuzzy and elements of Discourse Analysis
• Seasons of the year
• Mathematics in Touch Language
• Geometry
• Gloves Usage for Geometric Forms
• The mechanical process
• The Passive Mode
• The Active Mode
• Finger Sensitivity and Avoiding Reception Mistakes
• Touch Language Utilized in Communication
• The transmission Mechanism
• Communication Controls
• The Functional Equivalent Communication Controls for the eCane
• Communication between two Deafblind Persons with Touch Language
• Mechanism of Direct TLC
• Touch Language Grammar
• Grammar Binding Rules
• The Rules
• Additional Remarks
• Adverbs of Approximation
• The Comparative Groups
• Modifiers
• Commingled Use of same fingers
• Various Grammatical Tense Issues
• Multiple Past and Future Tense Occurrences
• Point of Reference
• Verbs in Touch Language
• Touch Language Structural Concept
• Question and Question Mark in Touch Language
• Command and the Exclamation Mark in Touch Language
• Multiples
• Appropriated Words
• Singular and Plural Descriptions
• Personification
• Reciprocity
• Spelling in Touch Language
• Inclusion and Connectivity
• Affirmation Negation and Emphasis
• Coupled Adverbs and Pronouns
• Religious and Spiritual Articulation
• Miscellaneous (AND, Yes, No, Parentheses)
• AND
• Yes NO and Parentheses
• Help and User’s Guide
• On-line Instant Help Facility
• The procedure
• The Active Mode
• The Passive Mode
• Declarations in the Help Facility
• Concluding Remarks
• Vehicle for Touch Language Evolution
• Mathematics in Touch Language
• The Scientific Foundation For Touch Language
• Preface
• Introduction
• Principal Types of Cutaneous Receptors
• Threshold Issues
• Mechanoreceptors Targeted During Touch Language Execution
• Various Facts and Corollaries
• The Biological translation of senses
• Scientific Foundation Through Intuitive Imagery
• Military Technology
• Morse Code in Touch Language
• Acknowledgement
• Index of Appendices
• Appendix A - Static Parameters
• Appendix B - Basic Dynamic Parameters
• Appendix C - Combinatory Dynamic Parameters
• Appendix D - Attributes of Touch Language
• Appendix E - Possible Embodiments For Receptors
• Appendix F - Touch Language Representation
• Appendix G - Lower & Upper Nibble Impact Representation
• Appendix H - Multiple Meaning in Visual Human Signs
• Appendix J - Teaching Touch Language
• Appendix K - Definition Structure for Signal Transmission
• Appendix L - Geometric Forms
• Appendices
• Figures
• The Morse Code
• The Author
Introduction
Deaf persons use Sign Language to communicate among themselves and, with some auxiliaries, with hearing persons as well. Sign Language, however, does not address every perceivable audio situation. Deaf persons cannot hear thunder, the wind, the sound of streaming rivers, the sounds of animals, or sounds produced in our society by cars, trains, or music. Partial solutions enable deaf persons to perceive such sounds, either by providing a description or by routing the perception through another of the five human senses (hearing, seeing, smell, touch, and taste). Such solutions include, for example, light signals (the sense of seeing) in alarms and vibrations (the sense of touch) for alerts. Above all, however, deaf persons see with their eyes, and that sense gives them immediate, unfettered access to their environment as a compensating auxiliary element.
Deaf individuals can use their sense of seeing as the compensating auxiliary that keeps them connected to their environment. Blind persons rely on the opposite sense for such connectivity: hearing, which maintains their perception of what happens around them. Thus, audio description of sights substitutes, on some level, for their lack of sight. The blind can also augment their missing sight, to a degree, by partially utilizing the sense of touch, as is done when reading Braille, letting their fingertips perceive the combinations of raised dots that form the letters. Other sensory utilizations have also been introduced, such as the Tac-Tile sounds in the works of Russ Palmer, Paul Chamberlain and David Mitchell, or the holistic and interactive communication methods of Riitta Lahtinen.
Thus, we notice that a person with a dysfunctional sense can compensate for its lack, to a degree, by enhanced use of another sense. In the case of blind persons, we see that another sense, touch, is partially used in reading Braille. The blind can also “feel” their way around by touching elements in their immediate environment, such as walls, doors, or door handles, and they can combine a somewhat extended sense of touch with the sense of hearing when they use a cane to navigate.
However, persons who are both deaf and blind (deafblind) are in a different category of perception. The lack of the two major senses, hearing and seeing, leaves at most three other senses: touch, smell, and taste. When the senses of taste and smell are intact, a deafblind person can enjoy the satisfaction of food or a meal almost to the fullest, and even to the fullest when garnishing is not a factor. The major contributor to such enjoyment is the sense of taste, augmented by the sense of smell and, finally, by the sense of touch, provided not by the hands but by the inner parts of the mouth: the tongue, palate, and even teeth. However, these senses do not enable a deafblind person to perceive the environment at large and are suitable only for very limited auxiliaries. The invention of the eCane and its auxiliary devices (Liebermann 2003) enables the deafblind to communicate with others, locate desired places or objects, and navigate to them if desired. The Security Emergency Vehicle Alert Companion (SEVAC), in one embodiment, can give the deafblind confidence in the home environment, letting them know if a break-in has occurred in another room of the house, while the vibrating fire alarm with direction can alert them to a fire on another floor of a building. However, all such auxiliary devices fall short of enabling the deafblind to enjoy other amenities of our society, such as listening to the radio or enjoying a television show. Television perception for the deafblind is an invention (Liebermann 2008) that utilizes two of the five senses known to us: smell and touch. There is no need to create an interface to the sense of smell, as even the deafblind use it much as their seeing and hearing counterparts do. When it comes to the sense of touch, however, an additional tool is required to translate the touch components of dynamically changing, and even static, scenes.
Touch Language is therefore proposed as such a communication medium: one that could be useful for the blind, and in particular for the deafblind. Furthermore, the proposed Touch Language has great utility for communication in general, allowing pragmatics to be transmitted and received in an intuitive and condensed manner, thereby removing the dependency on word articulation as practiced in Braille.
Sign Language, as used by the deaf, utilizes motions of the fingers, hands, and arms. Additional components are supplied through facial features and body language to relate elements such as magnitude, emphasis, or bewilderment, which spoken language provides through intonation and changes in the form of utterance. Taking our cue from Sign Language, we realize that such visual auxiliaries are inappropriate for deafblind persons, specifically when a deafblind person may want to partake in TV or video enjoyment or other amenities. Therefore, we propose to extend the analogy of the additional visual communication components of Sign Language by providing the corresponding elements via touch. In contrast to Sign Language, however, the task of utilizing touch is much more difficult, owing to the territorial confinement available for signal delivery and to its complexity. Touch parameters need to be defined under additional constraints relating to their descriptive modes, where no previous visual clues exist and where intuitive perception, rather than an accepted methodology, guides our considerations. Though it may sound somewhat ambitious, and with the caution discussed below, we will refer to it as Touch Language.
Sign Language, as utilized by deaf individuals, was developed and has been evolving intuitively, driven by existing needs. As such, it has a distinct anthropological aspect. An experiment conducted some years ago grouped together deaf children who did not know Sign Language and kept them isolated from hearing children and adults. Within a relatively short time, the deaf children developed their own form of communication that was, in essence, a Sign Language of sorts. Interestingly enough, it had many of the elements of currently practiced Sign Language.
The children in the experiment had a base from where to start. Namely, they had three basic (and minimal) conditions satisfied.
• They were free to devote time to developing their communication form, as their basic needs of food and shelter were taken care of.
• They were among like individuals (i.e., children) with the same basic lack of hearing abilities.
• They could see each other and thereby their motions for communication.
The deafblind, on the other hand, are missing at least two of the above basic conditions. Thus, it is the responsibility of an advanced and humanistic society to close the gap left by the two missing conditions, or to provide a somewhat “artificial” mode of communication.
To a certain degree, such provision has been rendered through the teaching of palm fingerspelling and the exercise of that new skill in interpersonal communication. However, it still leaves much to be desired and falls short of a communication medium that could open up broader access to our social amenities, as well as eliminate the almost complete dependency on others for basic necessities and amenities taken for granted by persons without such dysfunctional senses. Touch Language is advanced here as a step in that direction. The idea of touch as a language is not new (Enakoski, R., and Routsalo, P.). We build on the accepted base of that surmise and introduce below the components needed for Touch Language.
Persons who interact with autistic individuals know that deep pressure helps such individuals stay connected, specifically to the external world. Therefore, Touch Language needs to contain deep-pressure elements, which are as important to autistic persons as they are to deafblind people. It is important to remember that deafblind people are not privy to the audio and visual elements that connect others to the external world, whether that connection is to persons or to inanimate objects. Thus, Touch Language has built-in elements of deep touch, achieved via strong vibrations, pressure points, and the pecking sensations defined here as nibbles. The invocation of these elements is not artificial, such as being evoked every so often by a program routine, but rather integrative. Namely, Touch Language is built on a cornerstone that provides deep-touch connectivity throughout its usage, without the need for artificial introduction.
We provide an example of Touch Language to illustrate its utility. The components and elements used in the example will become apparent only later, when they are properly introduced in the book. Thus, the example has no didactic value and is provided merely to acquaint the reader with the operative value of Touch Language. The example given below is that of an electronic TV or video presentation for deafblind persons based on Touch Language.
Example - The Electronic TV and Video for the Blind and Deafblind (eTV)
A functional equivalent of the visuals on TV or video is made available to the blind and deafblind via an apparatus and method. The apparatus is a set of electromechanical gloves that translates the dynamic visual content on the TV or computer screen into a set of vibrations and impacts on the palm, fingers, and back of the hands. The signals of the electromechanical gloves, perceived by the blind and deafblind, are built on a scientifically proven basis for communicating such excitations to the brain. The signals and excitations are a subset of Touch Language, a powerful communication tool created for these and other communication purposes. Both subsets will be discussed later in the book.
The palm of the dominant hand serves as a functionally equivalent TV or computer screen onto which the dynamic motions from the TV or computer screen are “projected”, while impacts and vibrations simultaneously convey, to other parts of the hands and fingers, the essential information that completes the perception of the actions on the screen. The information imparted through impacts and vibrations is conveyed through pragmatics, where the transmitted topics enable full comprehension of the actions on the screen.
Imparting fine-grain detail is an integral part of the process, and the transliterated functional-equivalent visuals, with their actions and content, occur in an instant, all rich with fine-grain detail.
For example, the backs of the four fingers of the dominant hand (i.e., excluding the thumb) represent people, each finger's position indicates gender, and impacts on selective parts of the fingers provide information about character and age brackets. An impact on the back of a finger identifies the action initiator, while an impact on its face identifies the action subject or recipient. Thus, a scene in which a nice old lady is chasing a mischievous young boy would be transmitted as follows:
The back (denoting action taken or initiated) of the fifth finger (nice lady) receives three impacts (denoting older age); the hand vibrates or receives impacts with the Morse code of the single verb “chase”; the face (denoting the recipient or target of the action) of the second finger (denoting a not-nice male) receives a single impact (denoting young age), while a contour of the chase is drawn on the palm of the hand. The whole transmission takes only about a second and contains the complete story shown on the TV or video screen. The written description is lengthy; the execution is instant and the comprehension immediate.
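The chase scene above can be sketched as an ordered signal sequence. The following Python sketch is purely illustrative: the receptor names, tuple layout, contour coordinates, and the minimal Morse table are assumptions made for this example, not the glove's actual signal specification.

```python
# Illustrative only: encode the "nice old lady chases mischievous boy"
# scene using the conventions described above. Receptor names, the
# tuple layout, and the contour points are hypothetical.

# Minimal Morse table covering just the verb used here (International Morse).
MORSE = {"c": "-.-.", "h": "....", "a": ".-", "s": "...", "e": "."}

def encode_verb(verb):
    """Render a verb as a Morse vibration pattern (dots and dashes)."""
    return " ".join(MORSE[ch] for ch in verb)

def encode_chase_scene():
    """Return the ordered signal sequence for the example scene."""
    return [
        # Back of the fifth finger = action initiator ("nice" female);
        # three impacts denote the older age bracket.
        ("finger5", "back", "impacts", 3),
        # The verb "chase" delivered as Morse vibrations to the hand.
        ("hand", "vibration", "morse", encode_verb("chase")),
        # Face of the second finger = action recipient ("not nice" male);
        # a single impact denotes the young age bracket.
        ("finger2", "face", "impacts", 1),
        # Contour of the chase traced on the PalmScreen (path is made up).
        ("palm", "trace", "contour", [(0, 0), (2, 1), (4, 3)]),
    ]

for signal in encode_chase_scene():
    print(signal)
```

The point of the sketch is the division of labor: identity and role travel on the fingers, the verb travels as Morse, and the geometry of the action travels on the PalmScreen.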
At the present time, the deafblind can communicate only by fingerspelling into the cup of the hand or by reading Braille (books or computer Braille readers). Thus, they live in total isolation from sensory excitation and in severe deprivation of social contact. The electronic TV and video based on Touch Language could give them access to the media and facilitate their ability to partake in life events.
Touch Language has five components: Morse code, the PalmScreen, impact nibbles, vibrations, and the optional Braille dialogue segment. As will be seen later, we will strive to minimize the latter (the Braille dialogue segment) and operate Touch Language devoid of any Braille components.
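The five components, with the Braille dialogue segment as the one opt-in part, might be modeled as follows. A minimal sketch; the names and the opt-in flag are illustrative, not a device specification.

```python
# Sketch of the five Touch Language components; the enum names and the
# opt-in flag are illustrative assumptions, not an actual specification.
from enum import Enum

class Component(Enum):
    MORSE_CODE = "Morse code"
    PALM_SCREEN = "PalmScreen"
    IMPACT_NIBBLES = "impact nibbles"
    VIBRATIONS = "vibrations"
    BRAILLE_DIALOGUE = "Braille dialogue segment"  # optional; to be minimized

def active_components(use_braille=False):
    """All components except the optional Braille segment, unless opted in."""
    return [c for c in Component
            if c is not Component.BRAILLE_DIALOGUE or use_braille]

print([c.name for c in active_components()])
```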
The book is structured along an evolutionary track, leading the reader through the building blocks of Touch Language, rather than assuming an already developed language with didactic divisions into segments.
The decision to use Morse code rather than Braille was the result of an experiment conducted at Focus Groups, a division of Signtel, Inc. The company hired well over 100 deaf and hard-of-hearing employees, some of whom were blind. They were all employed as assistant developers for the products developed by the company. Divided into focus groups, they analyzed and developed responses to a variety of issues related to their culture and to the products under development. In the experiment mentioned, they were first taught Braille and then Morse code. Subsequently, they were asked to produce their names, first in Braille and then in Morse, and then asked which of the two was easier. Morse won hands down. Thus, Morse code was adopted as the preferred form of communication.
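For readers unfamiliar with Morse, the task the focus groups performed (spelling one's own name) reduces to a letter-by-letter table lookup. A small Python sketch, using standard International Morse for just the letters of one sample name:

```python
# Standard International Morse for the letters of the sample name only;
# a full table would cover A-Z and the digits.
MORSE = {"a": ".-", "e": ".", "h": "....", "l": ".-..", "n": "-."}

def spell_in_morse(name):
    """Spell a name letter by letter, separating letters with ' / '."""
    return " / ".join(MORSE[ch] for ch in name.lower())

print(spell_in_morse("Helen"))  # → ".... / . / .-.. / . / -."
```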
Accepting Touch Language
Innovations and new technologies often have to overcome resistance before gaining acceptance. The public usually welcomes and embraces such innovations without realizing the effort, and sometimes the uphill battles, required to bring them from concept to market and usage. Acceptance at the various stages is crucial: acceptance begets support, and support begets action, resulting in the introduction of the innovation to users. Thereupon comes the cycle of acceptance by users, whether immediate or delayed, leading to public demand for successful products. The more innovative, groundbreaking, or revolutionary a product is, the more resistance it may face. Touch Language may be in that category of innovations. To illustrate how resistance is overcome by tenacious innovators, we cite below excerpts from a commemoration address given by President George W. Bush in 2004, honoring the Wright Brothers, whose invention of the airplane led to the commercial and passenger aviation industry.
In his address, President Bush told the audience that the United States Patent Office had “concluded the plans were inadequate and the machine could never function as intended”.
“The New York Times once confidently explained why all attempts at flight were doomed from the start.”
“‘To build a flying machine,’ declared one editorial, ‘would require the combined and continuous efforts of mathematicians and mechanicians from one million to ten million years.’ As it turned out, the feat was performed eight weeks after the editorial was written.”
President Bush continued to say: “There is something in the American character that always looks for a better way, and is unimpressed when others say it cannot be done. Those traits still define our nation. We still rely on men and women who overcome the odds and take the big chance -- with no advantage but their own ingenuity and the opportunities of a free country.”
Another example relates to the author of Touch Language who, together with two engineers from Los Alamos Laboratory, Michael Wolf and John Crowley, built a laptop computer in the early 1980s that ran both DEC and IBM operating systems. When they sought venture capital to bring the laptop computer to market, they were told: “A laptop computer? Who in the world will ever need a laptop computer?”
The acceptance of Touch Language could encounter some resistance. It is a new and intricate concept, it is revolutionary, and it will require a learning curve like any new language. Teachers of Touch Language will encounter the arduous and repetitive tasks so common in teaching new languages, magnified severalfold; but so did Mrs. Sullivan, who taught Helen Keller, and the rest is history.
The potential cost of hindered acceptance of Touch Language would be measured not in money lost or time spent, but in the number of deafblind persons who could have used Touch Language as a tool and an aperture to the hearing and seeing amenities of life available to the hearing and seeing community.
It is up to us who care for the deafblind to be courageous and dedicated, so that we can enable generations of deafblind persons to partake in a more active communicative and perceptive role and to enjoy the amenities of life available to hearing and seeing persons in our society. Touch Language makes it possible.
Touch Language Preliminaries
Sign Language is a visual language. To a degree, even regular textual language is visual, since readers perceive the complete word visually as part of the cognitive process. Furthermore, most languages practiced by hearing persons are such that the sound cues of words become an integral part of cognition. Indeed, most of these languages are phonetic (Liebermann 2000). Thus, most of us utilize both phonetic and visual cues in exercising our reading skills as part of our total language skills.
Therefore, persons who can utilize their hearing and seeing senses have complete basic communication instruments. Deaf persons substitute or augment the hearing sense with a visual auxiliary by viewing Sign Language, which is functionally equivalent to hearing spoken words. The end result is cognition, irrespective of the route taken to achieve it. It is a well-known fact that at the micro level of our anatomy, namely the nervous system, one observes the same functional result irrespective of the route taken: whether we pinch a particular nerve, apply acid to it, or pass current through it, ultimately the same result is observed; that is, a current is produced in the nerve. Therefore, taking the cue from the micro level and exercising it at the macro level of language cognition, we contend that producing the same cognitive result, irrespective of the route taken, is an appropriate step in evolution.
Blind persons are deprived of the visual cues in reading text but can substitute the sense of touch for the missing visual sense, while maintaining the phonetic cues as part of the cognitive process. The situation is quite different, however, for persons who are both blind and deaf. The question before us is whether the sense of touch can operate on two separate levels: providing, on one level, reading comprehension via Braille, and on the other, a substitute for the visuals that the sense of seeing would supply. Basically, we ask whether the sense of touch can double up in two separate forms, enabling both reading by touch (Braille) and visual substitution by touch. Namely, can our sense of touch be effectively utilized to provide visual cognition by a route functionally equivalent to visual perception? It should be stressed that we are not discussing an alternative route to the impact of a picture on the human retina, but rather a functionally equivalent process rendering a similar, though not congruent, cognitive result.
We therefore propose to utilize a body part, rich with sensory abilities close to the skin, that can accomplish such a task. We already know how effective fingertips are in their sense of touch when reading Braille, and that fingers are used for fingerspelling words into the cupped hand of deafblind persons as a form of communication. It therefore stands to reason to utilize similar body parts for communication, both because of their utility and because of their familiarity. Thus, one possibility is the entire length of the fingers, excluding the tips involved in reading Braille; another is the palm of the hand, or a combination of both. The more difficult task is deciding what information to provide to such chosen receptors, how to transliterate visuals into that information, and finally how to organize it cohesively into a utility that can be universally taught and exercised in cognitive perception. The proposed Touch Language is our attempt to achieve that goal.
It should be mentioned from the outset that if this is to be a truly universal language, one that includes syntax and grammar, then we face the serious task of overcoming the idiosyncrasies of each individual cultural language. Sign Language is called a language, and indeed it has its own functionally equivalent syntax and grammar, as is the case with American Sign Language (ASL). However, that very aspect hinders it from becoming a truly universal language. Nonetheless, the independent usage of signs with selected universal meanings enables a modicum of communication among persons from foreign places (Liebermann 2002) that greatly surpasses the ability of hearing persons to communicate in a language unknown to them, if such communication is possible at all. Therefore, we will tread with caution: we will boldly assume the name Touch Language, willing to discard its status as a language if universality has to be compromised. However, as will become apparent later, the careful choice of parameters will enable us to relax our caution and accept it as a bona fide language, while maintaining its universality under the proposed construct.
Different embodiments utilize different body areas as the object of perceiving the signals of Touch Language. For example, Scott Stoffel of the engineering department at Temple University designed a “Palm Braille” that
TABLE OF CONTERNTS
• Table of Contents
• Introduction.
• Example - The Electronic TV and video for the Blind and Deafblind (eTV)
• Accepting Touch Language
• Touch Language Preliminaries
• Type of Information Provided.
• Touch Language Parameters.
• Approach Philosophy
• Descriptive Groups
• THE GROUPS
• Group Delivery Receptors
• Static elements
• Definitions:
• Screen Coordinates
• Preliminary Definitions
• Geographic (Screen) Definitions
• Operational Impacts Definitions
• Static Parameters
• Forming Shapes
• Dynamic Parameters
• Idiosyncrasies of TV Stations and Video Broadcasting
• Activity continued beyond the PalmScreen
• Screen Change Variable
• Beginning and Ending of Commercials
• Multiple Information on TV Screen
• Foreign Language Translated at Bottom of Screen
• Background Music Creating Suspense
• Station ID Number
• Time Display
• Problems in Broadcast reception
• Dark Screen Due to Problem in TV Reception
• “Snow” Screen Due to Problem in TV Reception
• Interruption by the Emergency Broadcasting Service
• Understanding the TV Guide Channel on Screen
• The Remote Control
• Definitions
• Dynamic Parameters
• Basic Dynamic Parameters
• Combinatory Dynamic Parameters
• Combinatory Dynamic Parameters
• Attributes for Mechanical Messages
• Attributes of Touch Language
• Group Delivery Receptors
• Back of Fingers as Receptors
• Groups that Represent Personal Character.
• Sub-groups that Represent Gender.
• Examples of Signal Delivery
• Face of Fingers as Receptors
• Utility of the Back of the Hand
• Group Representation Utilizing Face of the Fingers
• Language Representation
• How it all Works
• The Face of the Pointer Finger with Nibbles (Peoples Finger):
• Examples of Use
• Examples:
• The Face of the Middle Finger with Nibbles (Lighting Finger)
• The Face of the Fourth Finger with Nibbles (Description Finger)
• The Thumb
• Thumb Control and Alert Function
• Thumb of Cradle in Middle Position
• The Thumb as a Passive Receptor
• The Thumb Cradle
• Thumb of Cradle in Lateral Positions
• The Face of the Fifth (Pinky) Finger with Nibbles (Cross Relationship Finger)
• Female representation
• TV channel query and selection
• Summary of Receptors Versus Group Representation
• Functional Equivalence of Sentence Information
• Touch Language Utility for eCane Usage
• Touch Language Modes for the eCane
• Putting It All Together
• The Universality of Touch Language
• The Cultural Aspect
• Multiple Meaning in Visual Human Signs
• Multiple Perceptions in Interpretation
• Translating Multiple Perceptions
• Relevancy Issues
• Reversed Multiple Perceptions
• Rules of Association
• Synchronization
• The Rules
• Translation Mechanism
• The Touch Language Principles
• Key Words – revisited
• Functional Equivalent Acronyms
• Usage Particulars
• Scene Locality
• Process Principles
• Delimiters
• The Dialog Segment
• The Dialog Segment
• The Dialogue Information
• The Dialogue Reduced State (DRS)
• Summary of Functional Equivalent Touch Language
• Touch Language Utility
• Functional Equivalent Sound Effects
• Vibrations and Frequency
• Nibbles and Frequency
• The Other Hand
• The Left Hand Pinky
• Grammatical Tense
• Direction and Cause/Receipt Designation
• The Person Hand
• The Object Hand
• Intent and Action
• Connotations and Modifiers
• Direction and Cause/Receipt Designation
• Object and Person Hand
• The Person Hand
• The Object Hand
• Intent and Action
• Connotations and Modifiers
• The Aggression Group
• Neutral and Matter of Fact Group
• The Non-aggressive Group
• Physical Features in Touch Language
• Mechanics of Articulating Physical Features
• Characterization
• Magnitude
• Location Exchange Action and Stereo Sound
• Location Exchange
• Stereo Sound
• Localization of Scene Variables
• Geographic Localization
• Scene Inclusion Variables
• The In-Wait Mode Variables
• Value Indicator for Variables
• Environmental Scene Variables
• Conceptual Localization
• Multiple-Screen Variable
• Teaching Touch Language to the Deafblind
• The Language
• The Procedure
• Touch Language and Sensory Integration
• The Vestibular and Proprioceptive senses
• The Relevance for Touch Language
• Learned interpretations
• Tools of Avoidance
• The Non-entertainment Segment
• The Touch Language Elements
• Emotional States
• The Back of the non-dominant Hand Revisited
• The Pause and Continue Command
• Identifying the Speaker
• Geographical Placement of Speaking Parties
• Format
• Number Systems in Touch Language
• Magnitude
• The Back of the Third Finger (Magnitude)
• Face of the Third Finger (Magnitude)
• Fuzzy Magnitude
• Fuzzy and Undetermined Articulation
• Fuzzy and elements of Discourse Analysis
• Seasons of the year
• Mathematics in Touch Language
• Geometry
• Gloves Usage for Geometric Forms
• The mechanical process
• The Passive Mode
• The Active Mode
• Finger Sensitivity and Avoiding Reception Mistakes
• Touch Language Utilized in Communication
• The Transmission Mechanism
• Communication Controls
• The Functional Equivalent Communication Controls for the eCane
• Communication between two Deafblind Persons with Touch Language
• Mechanism of Direct TLC
• Touch Language Grammar
• Grammar Binding Rules
• The Rules
• Additional Remarks
• Adverbs of Approximation
• The Comparative Groups
• Modifiers
• Commingled Use of same fingers
• Various Grammatical Tense Issues
• Multiple Past and Future Tense Occurrences
• Point of Reference
• Verbs in Touch Language
• Touch Language Structural Concept
• Question and Question Mark in Touch Language
• Command and the Exclamation Mark in Touch Language
• Multiples
• Appropriated Words
• Singular and Plural Descriptions
• Personification
• Reciprocity
• Spelling in Touch Language
• Inclusion and Connectivity
• Affirmation Negation and Emphasis
• Coupled Adverbs and Pronouns
• Religious and Spiritual Articulation
• Miscellaneous (AND, Yes, No, Parentheses)
• AND
• Yes, No, and Parentheses
• Help and User’s Guide
• On-line Instant Help Facility
• The procedure
• The Active Mode
• The Passive Mode
• Declarations in the Help Facility
• Concluding Remarks
• Vehicle for Touch Language Evolution
• Mathematics in Touch Language
• The Scientific Foundation For Touch Language
• Preface
• Introduction
• Principal Types of Cutaneous Receptors
• Threshold Issues
• Mechanoreceptors Targeted During Touch Language Execution
• Various Facts and Corollaries
• The Biological translation of senses
• Scientific Foundation Through Intuitive Imagery
• Military Technology
• Morse Code in Touch Language
• Acknowledgement
• Index of Appendices
• Appendix A - Static Parameters
• Appendix B - Basic Dynamic Parameters
• Appendix C - Combinatory Dynamic Parameters
• Appendix D - Attributes of Touch Language
• Appendix E - Possible Embodiments For Receptors
• Appendix F - Touch Language Representation
• Appendix G - Lower & Upper Nibble Impact Representation
• Appendix H - Multiple Meaning in Visual Human Signs
• Appendix J - Teaching Touch Language
• Appendix K - Definition Structure for Signal Transmission
• Appendix L - Geometric Forms
• Appendices
• Figures
• The Morse Code
• The Author
Introduction
Deaf persons use Sign Language, which enables them to communicate among themselves and, with some auxiliaries, also with hearing persons. Sign Language does not cover all perceivable audio situations. Deaf persons cannot hear thunder, the wind, the sounds of streaming rivers, the sounds of animals, or sounds produced in our society such as cars, trains, or music. Partial solutions enable deaf persons to perceive such sounds either by providing a description of them or by routing the perception through another of the five human senses (hearing, seeing, smell, touch, and taste). Such solutions include, for example, light signals (sense of seeing) in alarms and vibrations (sense of touch) for alerts; but above all, deaf persons see with their eyes and thus have immediate, unfettered access to their environment through a human sense that serves as a compensating auxiliary.
Deaf individuals can use their sense of seeing as the compensating auxiliary for perceived connectivity to their environment. Blind persons use the opposite for such connectivity: their sense of hearing, which maintains perception of what happens around them. Thus, audio description of sights substitutes on some level for their lack of sight. The blind can also augment their lack of sight, to a degree, by partially utilizing the sense of touch, as is done when reading Braille, letting their fingertips perceive the combinations of raised dots that comprise the letters. Other sensory utilizations have been introduced as well, such as the Tac-Tile sounds in the works of Russ Palmer, Paul Chamberlain and David Mitchell, or the holistic and interactive communication methods of Riitta Lahtinen.
Thus, we notice that a person with a dysfunctional sense can compensate, to a degree, for the lack of that sense by enhanced use of another. In the case of blind persons, the sense of touch is partially used in reading Braille. The blind can also “feel” their way around by touching elements in their immediate environment, such as walls, doors, or door handles, and can combine a somewhat extended sense of touch with the sense of hearing when utilizing a cane to navigate.
However, persons who are both deaf and blind (deafblind) are in a different category of perception. The lack of the two major senses, hearing and seeing, leaves at most three others: touch, smell, and taste. When the senses of taste and smell are intact, a deafblind person can enjoy the satisfaction of food or a meal almost to the fullest, and even to the fullest when garnishing is not a factor. The major contributor to such enjoyment is the sense of taste, augmented by the sense of smell, and finally by the sense of touch, provided not by the hands but by the inner parts of the mouth, namely the tongue, palate, and even the teeth. However, these senses do not enable a deafblind person to perceive the environment at large and are suitable only as very limited auxiliaries. The invention of the eCane and its auxiliary devices (Liebermann 2003) enables the deafblind to communicate with others, locate desired places or objects, and navigate to them if desired. The Security Emergency Vehicle Alert Companion (SEVAC), in one embodiment, can let the deafblind feel confident in the home environment: they know if a break-in occurred in another room of the house, and a vibrating fire alarm with direction can alert them to a fire on another floor of the building. However, all such auxiliary devices fall short of enabling the deafblind to enjoy other amenities of our society, such as listening to the radio or enjoying a television show. Television perception for the deafblind is an invention (Liebermann 2008) that utilizes two of the five known senses: smell and touch. There is no need to create any interface to the sense of smell, as even the deafblind use it much the same as their seeing and hearing counterparts do. However, when it comes to the sense of touch, an additional tool is required to translate the touch components of dynamically changing and even static scenes.
Touch Language is therefore proposed as such a communication medium, useful for the blind and in particular for the deafblind. Furthermore, the proposed Touch Language has great utility for communication, allowing the transmission and reception of pragmatics in an intuitive and condensed manner, thereby substituting for the dependency on word articulation found in Braille.
Sign Language, as used by the deaf, utilizes motions of the fingers, hands, and arms. Additional components are supplied through facial features and body language to relate such elements as magnitude, emphasis, or bewilderment, which spoken language provides through intonation and changes in the form of utterance. Taking our cue from Sign Language, we realize that such visual auxiliaries are inappropriate for deafblind persons, specifically when a deafblind person wants to partake in the enjoyment of TV, video, or other amenities. Therefore, it is proposed to extend the analogy of the additional visual communication components of Sign Language by providing the corresponding elements via touch. However, in contrast to Sign Language, the task of utilizing touch is much more difficult, due both to the territorial confinement available for signal delivery and to its complexity. Touch parameters need to be defined under additional constraints relating to their descriptive modes, where no previous visual clues exist and where intuitive perception, rather than some accepted methodology, guides our considerations. Though it sounds somewhat ambitious, and with the caution discussed below, we will refer to it as Touch Language.
Sign Language, as utilized by deaf individuals, was developed and has been evolving intuitively out of existing needs. As such it has a distinct anthropological aspect. An experiment done some years ago grouped deaf children who did not know Sign Language and kept them isolated from hearing children and adults. Within a relatively short time, the deaf children developed their own form of communication that was basically a Sign Language of sorts. Interestingly enough, it had many of the elements of currently practiced Sign Language.
The children in the experiment had a base from which to start. Namely, they had three basic (and minimal) conditions satisfied.
• They were free to devote time to developing their communication form, as their basic needs of food and shelter were taken care of.
• They were among like individuals (i.e., children) with the same basic lack of hearing abilities.
• They could see each other and thereby their motions for communication.
The deafblind, on the other hand, are missing at least two of the above basic conditions. Thus, it is the responsibility of an advanced and humanistic society to close the gap of the two missing conditions, or to provide a somewhat “artificial” mode of communication.
To a certain degree, such provision has been rendered through the teaching of palm fingerspelling and the exercise of that new skill in interpersonal communication. However, it still leaves a lot to be desired and falls short of a communication medium that could open up more access to our social amenities, as well as remove the almost complete dependency on others for basic necessities and amenities taken for granted by persons without such dysfunctional senses. Touch Language is advanced here as a step in that direction. The idea of touch as a language is not new (Enakoski, R., and Routsalo, P.). We build on the accepted base of that surmise and introduce below the needed components for Touch Language.
Persons who interface with autistic individuals know that deep pressure helps such individuals stay connected, specifically to the external world. Therefore, Touch Language needs to contain deep-pressure elements, which are as important to the autistic person as they are to deafblind people. This is important when remembering that deafblind people are not privy to the audio and visual elements of connectivity to the external world, whether to persons or to inanimate objects. Thus, Touch Language has built-in elements of deep touch achieved via strong vibrations, pressure points, and pecking sensations defined here as nibbles. The invocation of these elements is not artificial, such as being evoked every so often by a program routine, but rather integrative. Namely, Touch Language is built on a basic cornerstone that provides the deep-touch connectivity throughout its usage, without the need for artificial introduction.
We provide an example of Touch Language to illustrate its utility. The components and elements used in the example will become apparent only later, when they are properly introduced in the book. Thus, the example has no didactic value and is merely provided in the interest of acquainting the reader with the operative value of Touch Language. The example given below is that of an electronic TV or video presentation for deafblind persons based on Touch Language.
Example - The Electronic TV and Video for the Blind and Deafblind (eTV)
A functional equivalent of the visuals on TV or video is made available to the blind and deafblind via an apparatus and method. The apparatus is a set of electromechanical gloves that translates the dynamic visual content of the TV or computer screen into a set of vibrations and impacts on the palm, fingers, and back of the hands. The signals of the electromechanical gloves, perceived by the blind and deafblind, are built on a scientifically proven basis for communicating such excitations to the brain. The signals and excitations are a subset of Touch Language, a powerful communication tool created for these and other communication purposes. Both will be discussed later in the book.
The palm of the dominant hand serves as a functionally equivalent TV or computer screen onto which dynamic motions from the actual screen are “projected”, while impacts and vibrations simultaneously convey, to other parts of the hands and fingers, the essential information that completes the perception of actions on the screen. The information imparted through impacts and vibrations is delivered as pragmatics, where the topics transmitted enable full comprehension in following the actions on the screen.
Imparting fine-grain detail is an integral part of the process: the transliterated functional equivalents of the visuals, with their actions and content, occur in an instant, all ripe with fine-grain detail.
For example, the backs of the four fingers of the dominant hand (i.e., without the thumb) represent people, the fingers’ locations bespeak gender, and impacts on selective parts of the fingers provide information about character and age brackets. An impact on the back of a finger identifies the action initiator, while an impact on its face identifies the action subject or recipient. Thus, a scene in which a nice old lady is chasing a mischievous young boy would be transmitted as follows:
The back (denoting action taken or initiated) of the fifth finger (nice lady) receives three impacts (denoting older age); the hand vibrates or receives impacts with the Morse code of the single verb “chase”; the face (denoting recipient or target of the action) of the second finger (denoting a not-nice male) receives a single impact (denoting young age), while a contour of the chase is drawn on the palm of the hand. It takes only about a second to perform and contains the complete story illustrated on the TV or video screen. The written description is lengthy; the execution is instant and the comprehension immediate.
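The scene description above can be pictured as a short sequence of discrete signal events. The sketch below is purely illustrative: the function and field names (`encode_scene`, `finger`, `side`, `impacts`) are assumptions made for this example and are not part of any published glove interface; only the mapping of finger, side, impact count, Morse verb, and palm contour follows the text.

```python
# Hypothetical sketch of the "old lady chases young boy" scene as a
# sequence of glove signal events. All identifiers are illustrative
# assumptions, not an actual device API.

FIFTH_FINGER, SECOND_FINGER = "fifth", "second"
BACK, FACE = "back", "face"   # back = action initiator, face = recipient

def encode_scene():
    """Return the signal sequence for: nice old lady chases young boy."""
    return [
        # initiator: nice female (fifth finger), older age = three impacts
        {"finger": FIFTH_FINGER, "side": BACK, "impacts": 3},
        # the verb "chase" delivered as Morse vibrations on the hand
        {"vibration_morse": "chase"},
        # recipient: not-nice male (second finger), young age = one impact
        {"finger": SECOND_FINGER, "side": FACE, "impacts": 1},
        # contour of the chase drawn on the PalmScreen
        {"palm_contour": "chase_path"},
    ]

signals = encode_scene()
print(len(signals))  # → 4 signal events for the complete scene
```

The point of the sketch is the compactness the text emphasizes: four brief events, executed in about a second, carry the entire scene.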
At present, the deafblind can communicate only by fingerspelling in the cup of the hand or by reading Braille (books or computer Braille readers). Thus, they live in nearly total isolation from sensory excitation and in severe deprivation of social contact. The electronic TV and video based on Touch Language could give them access to the media and facilitate their ability to partake in life events.
Touch Language has five components: Morse code, the PalmScreen, impact nibbles, vibrations, and the optional Braille dialogue segment. As will be seen later, we will strive to minimize the latter (the Braille dialogue segment) and operate Touch Language devoid of any Braille components.
The book is structured along an evolutionary track, leading the reader through the building blocks of Touch Language, rather than assuming an already developed language with didactic division into segments.
The decision to use Morse code rather than Braille was the result of an experiment conducted at Focus Groups, a division of Signtel, Inc. The company hired well over 100 deaf and hard-of-hearing employees, some of whom were blind. They were all employed as assistant developers for the products developed in the company. Divided into focus groups, they analyzed and developed responses to a variety of issues related to their culture and the products under development. In the experiment mentioned, they were first taught Braille and then Morse code. Subsequently, they were asked to produce their names, first in Braille and then in Morse, and were then asked which of the two was easier. Morse won hands down. Thus, Morse code was adopted as the preferred form of communication.
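Producing a name in Morse, as the focus-group participants did, amounts to a simple letter-by-letter lookup. The following minimal sketch uses the standard International Morse Code table for letters; the function name `to_morse` is an assumption for this illustration.

```python
# Minimal Morse encoder illustrating the name-production task from the
# focus-group experiment. The table is standard International Morse Code.

MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---",
    "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-",
    "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--",
    "Z": "--..",
}

def to_morse(name: str) -> str:
    """Encode a name letter by letter, separating letters with spaces."""
    return " ".join(MORSE[ch] for ch in name.upper() if ch in MORSE)

print(to_morse("Helen"))  # → .... . .-.. . -.
```

In the glove context, each dot would map to a short vibration or impact and each dash to a long one, which suggests why the participants found Morse, with its two primitive signal lengths, easier to produce than the six-dot spatial patterns of Braille.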
Accepting Touch Language
Innovations and new technologies often have to overcome resistance to their acceptance. The public usually welcomes and embraces such innovations, not realizing the effort, and sometimes the uphill battles, required to bring them from concept to market and usage. Acceptance at the various stages is crucial, as acceptance begets support and support begets action, resulting in the introduction of the innovation to its users. Thereupon comes the cycle of acceptance by users, whether immediate or delayed, leading to public demand for the successful products. The more innovative, groundbreaking, or revolutionary the products are, the more resistance they may face. Touch Language may be in that category of innovations. To illustrate how resistance is overcome by tenacious innovators, we cite below excerpts from a commemoration address given by President George W. Bush in 2004, honoring the Wright Brothers, whose invention of the airplane led to the commercial and passenger aviation industry.
In his address, President Bush told the audience that the United States Patent Office “concluded the plans were inadequate and the machine could never function as intended”.
“The New York Times once confidently explained, why all attempts at flight were doomed from the start.”
"To build a flying machine’, declared one editorial, ‘would require the combined and continuous efforts of mathematicians and mechanicians from one million to ten million years.’ As it turned out, the feat was performed eight weeks after the editorial was written.”
President Bush continued to say: “There is something in the American character that always looks for a better way, and is unimpressed when others say it cannot be done. Those traits still define our nation. We still rely on men and women who overcome the odds and take the big chance -- with no advantage but their own ingenuity and the opportunities of a free country.”
Another example relates to the author of Touch Language, who, together with two engineers from Los Alamos Laboratory, Michael Wolf and John Crowley, built in the early 1980s a laptop computer running both DEC and IBM operating systems. When they sought venture capital to market the laptop computer, they were told: “A laptop computer? Who in the world will ever need a laptop computer?!”
The acceptance of Touch Language could encounter some resistance. It is a new and intricate concept, it is revolutionary, and it will require a learning curve like any new language. Teachers of Touch Language will encounter the arduous and repetitive tasks so common in teaching new languages, magnified severalfold; but so did Mrs. Sullivan, who taught Helen Keller, and the rest is history.
The potential cost of hindered acceptance of Touch Language would be measured not in money lost, nor in time spent, but in the number of deafblind persons who could have used Touch Language as a tool and an aperture to the amenities of life available to the hearing and seeing community.
It is up to us, who care for the deafblind, to be courageous and dedicated, so that generations of deafblind persons may benefit from a more active communicative and perceptive role and enjoy the amenities of life available to hearing and seeing persons in our society. Touch Language makes it possible.
Touch Language Preliminaries
Sign Language is a visual language. To a degree, even regular textual language is visual, since readers perceive the complete word visually as part of the cognitive process. Furthermore, most languages practiced by hearing persons are such that sound cues of the words become an integral part of cognition. Indeed, most of these languages are phonetic (Liebermann 2000). Thus, most of us utilize both phonetic and visual cues in exercising our reading skills as part of our total language skills.
Therefore, persons who can utilize their hearing and seeing senses have complete basic communication instruments. Deaf persons substitute or augment the hearing sense with a visual auxiliary, viewing Sign Language that is functionally equivalent to hearing spoken words. The end result is cognition, irrespective of the route taken to achieve it. It is well known that at the micro level of our anatomy, i.e., the nervous system, one can observe the same functional result irrespective of the route taken: if we pinch a particular nerve, apply acid to it, or pass current through it, ultimately the same result is observed, namely, a current is produced in the nerve. Therefore, taking the cue from the micro level and exercising it at the macro level of language cognition, we contend that producing the same cognitive result, irrespective of the route taken, is an appropriate evolutionary step.
Blind persons are deprived of the visual cues in reading text, but can substitute the sense of touch for the lack of sight, while maintaining the phonetic cues as part of the cognitive process. The situation is quite different, however, for persons who are both blind and deaf. The question before us is whether the sense of touch can operate on two separate levels, providing on one level reading comprehension via Braille, and on the other a substitute for the visual input of the sense of seeing. Basically, we ask whether the sense of touch can double up in two separate forms, enabling both reading by touch (Braille) and visual substitution by touch. Namely, can our sense of touch be effectively utilized to provide visual cognition by a route that is functionally equivalent to visual perception? It should be stressed that we are not discussing an alternative route to the impact of a picture on the human retina, but rather a functionally equivalent process rendering a similar, though not congruent, cognitive result.
We therefore propose to utilize a body part rich with sensory abilities close to the skin that can accomplish such a task. We already know how effective fingertips are in their sense of touch when reading Braille, and that fingers are used for fingerspelling words in the cupped hand of deafblind persons as a form of communication. Therefore, it stands to reason to utilize similar body parts for the purpose of communication, both because of their utility and because of their familiarity. Thus, the entire length of the fingers, without the tips involved in reading Braille, is one possibility; the palm of the hand is another; or a combination of both. The more difficult task is deciding what information to provide to the chosen receptors, how to transliterate visuals into such information, and finally how to organize it in a cohesive manner into a utility that is universally taught and exercised in cognitive perception. The proposed Touch Language is our attempt to achieve that goal.
It should be mentioned from the outset that if it is to be a real universal language, including syntax and grammar, then we face the serious task of overcoming the idiosyncrasies of each individual cultural language. Sign Language is called a language, and indeed it has its own functionally equivalent syntax and grammar, as is the case with American Sign Language (ASL). However, that very aspect hinders it from becoming a truly universal language. Nonetheless, the independent usage of signs having selected universal meanings enables a modicum of communication among persons from foreign places (Liebermann 2002) that greatly surpasses the ability of hearing persons to communicate in a language unknown to them, if that is possible at all. Therefore, we will tread with caution: we boldly assume the name Touch Language, willing to discard its status as a language, and thereby reduce it, if universality has to be compromised. However, as will be seen later, the careful choice of parameters will allow us to relax our caution and accept it as a bona fide language, while maintaining its universality under the proposed construct.
Different embodiments utilize different body areas as the object of perceiving the signals of Touch Language. For example, Scott Stoffel of the engineering department at Temple University designed a “Palm Braille” that