Neurohacking: Rewiring Your Brain by Don Vaughn (Transcript)

Take out your earplugs.

Yeah, like, pretty terrible, actually. And look, I kind of set you all up to fail on that one, because lip reading is a really difficult problem. And lip reading babies, who, as you all just saw, can make vocalizations without their mouths moving at all, and who speak with toys in their mouths, is a really, really difficult problem. This is harder than neuroscience.

And so imagine, though, that Mimi is your child, and this is every day of your life. That’s the reality for deaf parents and for the over 90% of their children who are hearing. And it can lead to a real divide within families, because deaf parents are unable to engage in traditional baby talk.

And it’s now very clear that traditional baby talk is not the cute or annoying musings of parents, but instead, it is a tool specifically designed by nature to teach language and to foster connection.

And you can imagine that if that were gone, it would be a really difficult issue. So the question that Dr. Ariana Anderson and I asked at UCLA was: if you can’t get infant vocalizations in through the ears, is there another option?

Well, when you scan brains across the population, you see that there are very specific parts of your cortex devoted to processing one sensory modality or another.

So, for example, on this morning’s TEDx tour through UCLA’s Staglin Center, we saw that there’s a very particular part of Stephanie’s brain that lights up only to visual information.

And independently, there’s a completely separate part of the brain that responds only to touch. But here’s where it gets really interesting. When you scan the brains of blind people while they’re reading braille, you don’t just see the touch parts of their brain active; you see visual areas active as well.

And similarly, when you scan the brains of deaf people while they’re communicating in sign language, you don’t just see the visual areas active from seeing the gestures; you actually see auditory cortex activated. Somehow, your brain isn’t just plastic, and it’s not just random: it is intelligently plastic.

And somehow, it’s rewiring itself to extract and process as much information from the outside world as possible. Somehow your brain is learning to see braille and to hear sign language. This is called sensory substitution, and the idea, as Paul Bach-y-Rita and David Eagleman have demonstrated, is that you can take information from a sense that’s lost, translate it into a different sensory modality, and deliver it that way.

Surprisingly, your brain figures it out, because your brain is plastic. And this is the idea we’ve taken and moved forward with: with a small grant from UCLA, we’ve been developing an application called “Chatter Baby.”

And what Chatter Baby does is it turns auditory information into visual information — it’s a type of sensory substitution. So now Mimi’s chatter comes alive in visual form. You don’t miss anything now.
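
To make that concrete, here is a minimal sketch of one way sound can be translated into sight. Chatter Baby’s actual pipeline isn’t described in this talk, so the file name and the spectrogram-style mapping below are illustrative stand-ins, not the app’s real method.

```python
# A minimal audio-to-visual sketch (illustrative, NOT Chatter Baby's
# actual pipeline). Sound energy in each frequency band is rendered as
# brightness on a time-frequency image, so a vocalization becomes a
# visible pattern. "baby_clip.wav" is a hypothetical input file.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("baby_clip.wav")
if audio.ndim > 1:          # mix stereo down to mono
    audio = audio.mean(axis=1)

# Short-time Fourier transform: x-axis is time, y-axis is frequency,
# color is loudness in that band at that moment.
freqs, times, power = spectrogram(audio, fs=rate, nperseg=1024)

plt.pcolormesh(times, freqs, 10 * np.log10(power + 1e-10), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Infant vocalization rendered visually")
plt.show()
```

Any display that preserves the timing and pitch structure of the sound could play the same role; the spectrogram is simply the most familiar choice.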

Even though her lips aren’t moving, you can see what’s going on. And the idea is that deaf parents can use this tool to learn baby talk and to connect as deeply as possible with their child, and we believe that eventually, once they become fluent in using this tool, they will be able to hear their child through sight.

And to me, that is just such an important application of neuroplasticity. It shows that this isn’t just a fun tool for turning auditory information into visuals; through the brain’s sensory-processing plasticity, it has the ability to connect deaf parents and their babies. That’s the power of plasticity.

But that’s only half the battle. The other part of this disconnect is that when deaf parents aren’t in the same room as their children, they don’t know what mood their child is in. And the best baby monitors on the market only tell you: yes, there’s sound; no, there’s no sound.

But that doesn’t really tell me what I care about. It doesn’t tell me: is my child happy and content? Or are they hungry? Are they crying? Is something going on that means I need to be there and address it?

Instead, I’m constantly wondering, sound? No sound? I don’t know.

So what we’re doing with Chatter Baby is gathering the world’s largest database of infant sounds, and then using sophisticated mathematics to take that sound and predict what the child’s mood state is. So: he’s really hungry.
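
As a rough illustration of what that prediction step could look like, the sketch below trains a standard classifier on summary audio features. The file names, labels, feature choice (MFCCs), and model are all hypothetical stand-ins; the talk doesn’t specify the actual database format or mathematics behind Chatter Baby.

```python
# Hedged sketch of cry-mood prediction: NOT Chatter Baby's real model.
# Assumes a labeled collection of audio clips; the two entries below
# are hypothetical placeholders for a much larger dataset.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def cry_features(path):
    """Summarize a clip as the mean of its MFCCs, a common audio feature."""
    audio, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

labeled_clips = [("clip_001.wav", "hungry"), ("clip_002.wav", "fussy")]  # ...

X = np.array([cry_features(path) for path, _ in labeled_clips])
y = np.array([mood for _, mood in labeled_clips])

# Hold out a slice of clips to estimate how well moods generalize.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
model = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```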

And the idea is that we can use neuroplasticity to make a real difference in how deaf parents communicate with their children. I’ve now talked to you about how to treat depression, addiction, and sensory impairments using neuroplasticity, but that’s really only the beginning.

We’re starting to move into diseases that you wouldn’t expect would be treated by something like this, like Alzheimer’s and Parkinson’s and stroke. And that’s really just the beginning of what I’ve been calling neurohacking.

And I don’t mean “hacking” in the sense that these ideas aren’t thoroughly researched; they’re really well thought-out, and there’s a lot of literature to support why they work.

I mean it in the sense that we’re not directly trying to fix the nuts and bolts of these problems; we’re not trying to change every biochemical cascade going on in the brain, which is essentially the root of the problem.
