Math for Blind

From TEILab

Mathematics Instruction for the Blind

Project Team

  • Project Lead: Francisco Oliveira, PhD - Completed 2011
  • Dr. Francis Quek, Professor, Center for Human-Computer Interaction
  • Dr. David McNeill, Professor, Psychology, U. of Chicago
  • Dr. Mary Ellen Bargerhuff, Assoc. Prof., Special Education, Wright State U.
  • Mr. Jeffrey Vernooy, Director, Disability Services, Wright State U.
  • Bing Fang, PhD - Completed 2011
  • Yannick Verdie, MS - Completed

Note: This is a recently completed project that is currently dormant.

Project Statement

We are studying how the academic performance of students who are blind is affected when they become able to follow the teacher's deictic references at the board.

Project Overview

This project focuses on enhancing the learning ability of the visually impaired. It is a multidisciplinary effort involving Physiology, Psychology, Electrical Engineering, and Computer Science. We are investigating ways of keeping teacher/student communication situated: blind students do not have access to the teacher's gestures. We have developed haptic gloves that help blind students navigate through instructional material, a raised-line version of the class notes. During the lecture, the student's reading finger must be on the point the teacher is referring to, which we call the point of instructional focus (PIF). We use computer vision techniques to track both the teacher's and the student's hands and to calculate the direction the student needs to move his or her hand to reach the PIF. This direction is then continuously sent through the glove.
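As a rough illustration of the navigation step described above, the vector from the student's tracked fingertip to the PIF can be quantized into the eight compass directions the glove can render. This is a minimal sketch under assumed conventions (x increasing eastward, y increasing northward, and the function name `direction_to_pif`), not the project's actual code:

```python
import math

# Eight glove directions in 45-degree sectors, counter-clockwise from East.
DIRECTIONS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def direction_to_pif(finger, pif):
    """Quantize the vector from the student's fingertip (x, y)
    to the PIF (x, y) into one of the eight glove directions."""
    dx = pif[0] - finger[0]
    dy = pif[1] - finger[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360.0  # 0 deg = East, counter-clockwise
    sector = int((angle + 22.5) // 45) % 8            # centre each sector on its direction
    return DIRECTIONS[sector]
```

In a live system this would run once per tracked video frame, so the cue updates continuously as the hand moves. Note that image coordinates usually grow downward, so a real implementation would flip the y axis first.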

The research described above raises many questions about how humans use their affordances to engage in collaborative work in different environments, how technology modifies these environments to provide novel modes of human interaction, and how humans react to this augmented world.

Related Research Areas

Embodied Interaction, Haptic Interaction, Accessibility, Education

The Haptic Glove

The glove contains a set of eight vibrating motors arranged in such a way that each motor corresponds to a direction: North, Northeast, East, and so on. The glove is connected to a box with a PIC18F452-I/P microcontroller via a serial cable, and the controller box is in turn connected to a computer, also via a serial cable. The program on the microcontroller controls the timing and intensity of the vibration of each individual motor on the glove. The motors are assembled into copper tubes, which are placed into small pockets on the glove; the spinning within each pocket produces the sensation of vibration.
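On the host side, driving such a glove amounts to picking the motor for a direction and sending a small command frame down the serial cable. The sketch below assumes a two-byte frame (motor index, vibration intensity); the actual PIC18F452 firmware protocol is not documented on this page, so the layout and names here are illustrative only.

```python
# Motor index for each of the eight directions, clockwise from North.
MOTORS = {"N": 0, "NE": 1, "E": 2, "SE": 3, "S": 4, "SW": 5, "W": 6, "NW": 7}

def glove_command(direction: str, intensity: int) -> bytes:
    """Encode one vibration command: which motor to spin, and how hard (0-255)."""
    if not 0 <= intensity <= 255:
        raise ValueError("intensity must fit in one byte")
    return bytes([MOTORS[direction], intensity])

# Writing the frame to the controller box might look like this
# (requires the third-party pyserial package; port name is an assumption):
# import serial
# with serial.Serial("/dev/ttyUSB0", 9600) as port:
#     port.write(glove_command("NE", 180))
```

Keeping the frame this small lets the host re-send the current direction many times per second, matching the continuous guidance described above.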

How it's made

Initial Findings

  1. The glove conveys a sense of direction. An experiment was designed to answer the question: can the glove convey directional information? Eight direction signals were sent to the glove in random order. Before each trial, the experimenter asked the participant whether she was ready to receive the signal; upon an affirmative answer, the experimenter clicked a button to start sending it. The participant was asked to tap, quickly and accurately, on the arrow corresponding to the perceived direction on a touch-sensitive pad. Response times averaged around half a second, and no significant difference was found among the glove models (ANOVA, 95% confidence). Participants who reported that they could clearly perceive the directions also had higher hit rates.
  2. The glove's vibration does not interfere with fingertip reading. We wanted to make sure that the vibration produced by the motors on the palm of the hand would not interfere with fingertip reading. For that, we assigned two targets on a board with raised dots forming numbers, and the program guided blindfolded participants to the targets. Once they found a target, we asked them what number they were reading; the glove never stopped vibrating. The hit percentage for all participants was well above chance.
  3. One can navigate with the help of the system while listening to a story; furthermore, it is possible to fuse the information received through the auditory and tactile modes. These two findings are important prerequisites, because without them we could not advance to actual instruction experiments. We told five different one-paragraph stories to each participant. Some numerical story content (e.g., how many bedrooms were in a house) was conveyed on the tactile board (like that used in experiment 2) rather than by speech, and subjects were guided to the numbers co-temporally with the speech using the glove. This simulates the simultaneous presentation of graphical material with deictic references while speaking. After each story, we asked five questions about it; for each question, five possible answers were offered and the participant had to pick one. To answer correctly, the participant had to recall information obtained from either the board or the speech. The participants who reported that they could clearly perceive the directions while listening to the stories also answered more questions correctly and were able to navigate faster.

A Game for the Blind

Individuals who are disabled are motivated to learn assistive technology devices and tend to use them over a long period of time. Building on this, we developed a computer game that is permanently installed at the Office of Disability Services at Wright State University in Dayton, Ohio.

This demonstrates the use of games as a means of motivation and engagement in training.

A short video on the game

Publications
  • F. Quek and F. Oliveira, "Enabling the Blind to See Gestures," ACM Transactions on Computer-Human Interaction (ToCHI), to appear.
  • F. Oliveira, F. Quek, H. Cowan, and B. Fang, "The Haptic Deictic System - HDS: Bringing Blind Students to Mainstream Classrooms," IEEE Transactions on Haptics, to appear.
  • M. E. Bargerhuff, H. Cowan, F. Oliveira, F. Quek, and B. Fang, "Developing Technology to Support Learning of Mathematics by Students Who Are Blind: The Gamers Have It!," Journal of Visual Impairment and Blindness, vol. 104, no. 11, November 2010.


This research has been supported by NSF grants Embodiment Awareness, Mathematics Discourse and the Blind (IIS-0451843) and CRI: Interfaces for the Embodied Mind (IIS-0551610).