Daniel Kish lost his sight in infancy, but taught himself to echolocate with dolphin-like clicks.
I am 6 years old and it's my first day at school. The bell rings for recess and all my classmates run gleefully away. But unlike them I cannot see. At least, not with my eyes. Instead, I click my tongue, listening for echoes from the wall to my left. I walk with my hands slightly outstretched to keep me from running into chairs that may have been left askew. I hear kids laughing and shouting through the open door, and by clicking I also hear the presence of the sides of the doorway in front of me. I go through it to the playground for the first time.

After a few steps, I stop to listen. I stand on a crack in the pavement that runs parallel to the building behind me. I click my tongue loudly and turn my head from side to side. The way is open, shot through with scurrying voices, balls bouncing and shoes scampering to and fro. What is around me? How do I get there? How do I get back?

Clicking my tongue quickly and scanning with my head, I move cautiously forward, catching fleeting images of bodies darting hither and thither. I follow spaces that are clear, avoiding clusters of bodies, keeping my distance from bouncing balls. I am not afraid - to me, this is a puzzle. I turn my head and click over my shoulder. I can still hear the wall of the building. As long as I can hear that, I can find my way back.

The ground slopes downward very gradually. As I continue, the sounds in front of me take on a softer hue, suggesting there's a big field of grass ahead. Eventually my feet find the grass. I speed up now I'm out of the press of darting bodies and flying projectiles. Suddenly, there is something in front of me. I stop. "Hi," I venture, thinking at first that someone is standing there quietly. But as I click and scan with my head, I work out that the something is too thin to be a person.

I realise it is a pole before I reach out to touch it. I click around me, and just barely hear something else. Leaving the pole, I move toward this next thing. I hear that it is also a pole just like the first. I detect yet another one, and another. There are nine poles all in a line. I later learn that this is a slalom course, and while I never attempted to run it, I practised my biking skills by slaloming among rows of trees, clicking madly.

At this point a buzzer goes off. I scan around me, clicking my tongue, but I can't hear the building. I clap my hands. I hear something, from the same direction that all the other kids are running in. Later, as I got to know the playground, I would run, too. As I move forward, clicking and clapping, I can hear a wall in the distance getting nearer, louder.

Kids are lined up in front of the wall, but I don't know which class is mine. I ask, and someone points me in the right direction. Clicking and scanning once more, I find the end of my line. As we enter the classroom, I click to make sure I don't run into anyone. Sensing I'm the right distance from the wall in front of me, I reach to my left and find a desk with a braille writer on it. I take my seat, wondering just how big the playground is and if it has a slide. I will find that out next recess.

At the time I went to school, blind kids like me either waited for people to take us around or taught ourselves to strike out on our own. My way was by clicking my tongue and listening for the patterns of reflections from objects around me. By doing this, I could get 3D images of my surroundings.

I can't remember when or how I first started using sonar, because it was when I was very young. I have a memory of climbing over the fence into the neighbour's yard and clicking to find out what was around me, when I was just 2½ years old.

As a child, while I was pleased to have a guide when someone was willing, I could do a lot by myself. I could ride a bicycle through my neighbourhood in the Los Angeles area, play tag with my friends, find trees to climb, and walk just about anywhere on my own.

For this, I thank my parents. I was presented with most of the opportunities that sighted kids enjoyed, and it was really just assumed that I would work things out.

I am certainly not the first person to teach myself echolocation. In fact, human sonar has probably been around for as long as humans. There are two types of sonar: passive and active. Passive sonar exploits sounds made for other purposes to get a sense of the environment, and everyone does it to some extent. We can hear our voice change depending on the kind of space we are in, for instance.

Humans probably used to rely on echolocation far more in the days before artificial lighting, when we had to find our way round in the dark. The readiness with which people learn sonar suggests to me it may be an inbuilt skill.

The first documented case of a blind person using sonar dates back to the mid-18th century. The French philosopher Denis Diderot wrote in 1749 of a blind friend so sensitive to his surroundings that he could distinguish an open street from a cul-de-sac. In the 19th century, the famous "Blind Traveller", James Holman, was reported to sense his surroundings by tapping his stick or listening to hoof beats.

At the time, no one understood the basis of this skill. Some thought it relied on the skin on the face and called it "facial vision". Only in the 1940s did a series of experiments prove this ability relies on hearing echoes.

Echoes can be used to perceive three characteristics of objects: where they are, their general size and shape and, to some extent, what they are like - solid versus sparse, sound-reflective versus sound-absorbent. This allows the brain to create an image of the environment.

For example, I perceive a parked car as a large object that starts out low at one end, rises in the middle and drops off again. The difference in height and slope at either end helps me tell the front from the back; typically, the front is lower, with a more gradual slope up to the roof.

Distinguishing between types of vehicles is also possible. A pickup truck, for instance, is usually taller than a car, with a hollow sound reflecting from its open bed; an SUV is similarly tall but sounds blockier.

A tree has narrow and solid characteristics at the bottom - the trunk - broadening and becoming more sparse towards the top. More specific characteristics, such as the size, leafiness or height of the branches, can also be determined.

Passive sonar that relies on incidental noises such as footsteps produces relatively vague images. Active sonar, in which a noise such as a tongue click is produced specifically to generate echoes, is much more precise. My colleagues and I use the term FlashSonar for active sonar, because for us each click is similar to the brief glimpse of the surroundings sighted people get when a camera flash goes off in the dark.
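
In signal terms, each "flash" carries distance information: the delay between a click and its echo encodes how far away the reflecting surface is, because sound travels out and back at a known speed. A minimal Python sketch of that arithmetic, with invented delays for illustration:

    # Each echo's round-trip delay encodes the distance to a reflector.
    # 343 m/s is the speed of sound in air at about 20 degrees Celsius.
    SPEED_OF_SOUND = 343.0  # metres per second

    def echo_distance(delay_seconds):
        # Sound travels out and back, so halve the round trip.
        return SPEED_OF_SOUND * delay_seconds / 2.0

    # Invented example: echoes arriving 6 ms and 58 ms after a click
    # would come from something about 1 m away (a pole, say) and
    # something about 10 m away (a wall).
    for delay in (0.006, 0.058):
        print(f"{delay * 1000:4.0f} ms -> {echo_distance(delay):5.2f} m")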

FlashSonar

Because an active signal can be produced very consistently, the brain can tune in to that specific signal, making the echoes it elicits easy to recognise even in complex or noisy environments. It's like recognising a familiar face or voice in a crowd.
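
In signal-processing terms, a consistent click works like a matched filter: cross-correlating a known template against the incoming sound makes even a faint echo stand out from background noise. A rough numpy sketch, with invented parameters:

    import numpy as np

    # Invented parameters: 48 kHz sampling, a 10 ms clicklike burst.
    RATE = 48_000
    t = np.arange(int(0.010 * RATE)) / RATE
    click = np.sin(2 * np.pi * 3_000 * t) * np.hanning(t.size)

    # Bury a faint copy of the click in 100 ms of noise, 30 ms in.
    rng = np.random.default_rng(0)
    recording = rng.normal(0.0, 0.2, int(0.100 * RATE))
    start = int(0.030 * RATE)
    recording[start:start + click.size] += 0.2 * click

    # Cross-correlate: the peak marks where the recording best matches
    # the known click - the "familiar voice in a crowd".
    scores = np.correlate(recording, click, mode="valid")
    print(f"echo found at {np.argmax(scores) / RATE * 1000:.1f} ms")  # ~30 ms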

The characteristics of an active signal can also be changed to fit the situation. For instance, I click more rapidly when moving fast, and more quietly in quieter environments so as not to get more information than needed. Large objects like buildings can be detected hundreds of yards away. Up close, objects about the size of a credit card can be detected.

FlashSonar struggles most with figure-ground distinction - distinguishing one object or feature from others around it. Elements tend to blur together, so it is not possible to recognise faces, for instance. Also, high noise levels or wind can mask echoes, requiring louder clicks and more scanning.

Nowadays, I spend part of my time teaching FlashSonar. I originally trained as a psychologist and later became the first fully blind individual in the US to qualify as an orientation and mobility specialist, a job that involves teaching blind people how to get around. After working in this area for several years, however, I felt that blind people were not generally trained to achieve as much as they could.

Instead, I developed my own approaches, helping blind students take part in a wide range of activities - solo bicycling, ball play, solo wilderness hiking, sometimes even competitively - or simply getting around their community with greater ease and comfort. In 2001 I quit my job to set up the non-profit organisation World Access for the Blind, to make this approach available to blind people round the world. We work closely with families, helping blind children and adults to achieve everything they could imagine in a manner of their choosing, and also run workshops for instructors.

Although our programme has many facets, we are best known for teaching FlashSonar. Its ability to give blind people a way to perceive their environment far beyond the reach of an arm or a cane is fast being recognised, both by those who work with blind people and by researchers in other disciplines. We are the first to develop a systematic, comprehensive way of teaching it.

We start by sensitising students to echoes, usually by having them detect and locate easy targets, such as large plastic panels or bowls. Once they can do this, we move on to learning to recognise more complex echoes by comparing them to familiar ones.

For example, when facing a hedge, a student might say, "It sounds solid?" I might reply, "As solid as the wall to your house?" "No, not that solid," she might say. "As sparse as the fence of your yard?" "No, more solid than that," she might answer. Now we have a range of relativity to work with. "Does it remind you of anything else near your house, maybe in the side yard?" "Bushes?" she might query. "But what seems different from those bushes?" "These are sort of flat like a fence." Ultimately, students verify what they hear by touching.

Besides training, we have also created prototypes of a device called SoundFlash. This is a head-mounted unit that produces high-frequency sounds modelled on bat chirps, but within the human audible spectrum. The intention is to produce a better sound for echolocation than we can make with our tongues. The results so far have been encouraging, with students able to detect three times as much detail at three times the distance as they can with a tongue click.
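
For a feel of what such a signal might be like, the sketch below synthesises a short downward sweep, high-pitched but still audible. The frequencies and duration are my own illustrative guesses, not the SoundFlash's actual design:

    import math, struct, wave

    # Illustrative guesses only - not the real SoundFlash parameters.
    # Bat chirps sweep downward in frequency; this one falls from
    # 16 kHz to 8 kHz over 5 ms, high-pitched but within human hearing.
    RATE = 44_100
    DURATION = 0.005
    F_START, F_END = 16_000.0, 8_000.0

    n = int(RATE * DURATION)
    samples = []
    for i in range(n):
        t = i / RATE
        # Phase is the integral of the linearly falling frequency.
        phase = 2 * math.pi * (F_START * t
                               + (F_END - F_START) * t * t / (2 * DURATION))
        envelope = math.sin(math.pi * i / n)  # smooth onset and offset
        samples.append(int(32767 * 0.8 * envelope * math.sin(phase)))

    with wave.open("chirp.wav", "wb") as out:
        out.setnchannels(1)
        out.setsampwidth(2)  # 16-bit samples
        out.setframerate(RATE)
        out.writeframes(struct.pack(f"<{n}h", *samples))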

We have also worked with biologists trying to understand animals that perceive the world through echolocation, as well as with robotics researchers teaching robots how to navigate using sonar and with neuroscientists studying the imaging systems of the brain.

Ultimately, we aim to make our approach available to blind people throughout the world, so they can access opportunities and have a quality of life of their own choosing - with freedom and self-direction. By showcasing successful students in the media, we are changing expectations of what a blind person can do. It is our hope that we can help blind people step away from the idea that their perception of the environment need be limited to the length of a stick, or to someone else's eyes.

Echolocation for beginners

Close your eyes and have someone hold a small bowl or open box in front of your face. Speak. Listen to how your voice sounds hollow. (You may not be able to tell with a tongue click unless you've practised a good deal.) Try it with and without the box.

Next, try this with a larger box or a pot. Hear how your voice sounds deeper. Try with a pillow or cushion, and notice how your voice sounds soft instead of hard. Try going to a corner of a room, and hear how your voice sounds hollow when you're facing the corner. How does the sound change when you face away from it?

Artificial echolocation

Human echolocation is limited by our hearing: the finest detail we can detect is set by the wavelength of the sound, and we can only hear relatively long wavelengths. Ultrasound, with wavelengths roughly 10 times shorter, can reveal 10 times as much detail, which is why many groups over the years have tried to develop echolocation systems for blind people based on ultrasound, such as the UltraCane and the K-sonar.
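
The arithmetic behind that factor of 10 is simply wavelength = speed of sound / frequency: features much smaller than a wavelength reflect little sound back. A quick Python check, where the 3.5 kHz figure for a tongue click is a representative guess:

    # Resolvable detail scales with wavelength. The 3.5 kHz figure for
    # a tongue click's dominant energy is a guess for illustration.
    SPEED_OF_SOUND = 343.0  # m/s in air

    def wavelength(freq_hz):
        return SPEED_OF_SOUND / freq_hz

    for label, freq in (("tongue click", 3_500.0), ("ultrasound", 35_000.0)):
        print(f"{label}: {wavelength(freq) * 100:.1f} cm")
    # tongue click: 9.8 cm
    # ultrasound: 1.0 cm - ten times finer detail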

Ultrasound devices can detect an object as small as a postage stamp from 5 metres away - but they can't detect the side of a barn from 10 metres, because the range of ultrasound is so limited. Another problem is conveying the ultrasound images to users, who must learn to interpret some form of coded aural or tactile feedback.

What's more, these devices tend to be designed by engineers who don't fully understand the problem they are trying to solve. For instance, blind people need to be able to move their cane around easily - you can't pack it full of heavy gadgetry. For these reasons and more, artificial echolocation devices have never really caught on; every one has fallen by the wayside.