The quest for human language


Computers can do a lot of things now. They can recognize faces. They can even recognize your face: Facebook can now suggest whose face it is seeing in a picture.

Similarly, speech recognition has come a long way; most software has progressed past the stage where every time you tried to call anybody, the machine would call your mom instead.

Even in the realm of translation, giant steps are being taken. Google recently released a new version of its translation software that knocks the socks off of its previous version.
Even so, it still can’t beat humans.

And that’s the thing. Though computers are advancing in all fields (and threatening countless jobs in the process), they still can’t grasp human language. Still, they’re getting better.

And, like so many things in life, it’s down to something called deep learning.


Deep learning

Deep learning is a kind of algorithmic structure inspired by the brain. The idea is that you create a huge network of interlinked neurons which can then actively learn and be trained in certain tasks.

The idea is that as long as the networks are big enough and we continue to train them with ever bigger data sets, these algorithms can then continue to improve and get better at what they’re doing.
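As a rough sketch of that training loop (a hypothetical toy example, not any particular production system): below is a tiny two-layer network in plain NumPy, trained on the XOR problem by repeatedly nudging its weights toward lower error. Real deep learning systems are vastly larger, but the loop has the same shape.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1 = rng.normal(size=(2, 8))   # input -> hidden "neurons"
W2 = rng.normal(size=(8, 1))   # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

initial_mse = None
for step in range(5000):
    h = sigmoid(X @ W1)        # hidden activations
    out = sigmoid(h @ W2)      # the network's prediction
    mse = float(np.mean((out - y) ** 2))
    if initial_mse is None:
        initial_mse = mse
    # Backpropagation: push the error back through both layers
    # and nudge the weights in the direction that reduces it.
    grad_out = (out - y) * out * (1 - out)
    grad_h = (grad_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ grad_out
    W1 -= 0.5 * X.T @ grad_h

print(f"error: {initial_mse:.3f} -> {mse:.3f}")
```

The point of the sketch is the shape of the process: nothing is programmed in about XOR itself; the network simply sees examples and adjusts until its error shrinks.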

This makes it hugely different from other types of learning techniques, which often end up reaching a plateau after which there is no real progress in terms of learning.

It is this that makes deep learning so powerful – and it is also this that means it scares many of the big brains in technology.


This is down to the fact that when we train networks in this way, we don’t know what they’re actually doing.

And it has happened before that networks had to be shut down because the people who built them no longer knew what was going on.

In one recent case, for example, the network being trained invented its own language. At that point the people who built it could no longer tell what it was saying to its subcomponents, and they decided it was time to end that particular exercise.

Still, it isn’t really that surprising that we don’t understand what these machines are doing, since we don’t actually know what our own brains are doing either. And in many ways the opportunities are immense.

A lot of the recent advancements in technology have indeed come from deep learning, and if we want to continue down the road of ever better computers doing ever more complex things, it would seem that this is the road we must travel – even if it might potentially be a dangerous one.

Language acquisition

Certainly, for the learning of language, no better option has come along.

For example, using deep learning one group was able to train a computer to consider the following text:

Jane went to the hallway.

Mary walked to the bathroom.

Sandra went to the garden.

Daniel went back to the garden.

Sandra took the milk there.

And then, when asked “Where is the milk?” it was able to answer “garden.”
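The passage above resembles the story-based question-answering tasks (such as Facebook’s bAbI set) that these networks are often trained on. The real system learned its answer with a deep network; the hand-written Python sketch below is only a toy illustration of what the task requires, namely tracking people, objects, and locations across sentences.

```python
# Toy, rule-based sketch of the question-answering task.
# A deep network learns this behaviour from examples; here the
# rules are written by hand purely to show the underlying problem.

def answer_where(story, obj):
    person_loc = {}   # person -> last known location
    obj_loc = {}      # object -> location where it ended up
    for sentence in story:
        words = sentence.rstrip(".").split()
        person = words[0]
        if "went" in words or "walked" in words:
            person_loc[person] = words[-1]             # "... to the garden"
        elif "took" in words:
            item = words[words.index("took") + 2]      # "took the milk"
            obj_loc[item] = person_loc[person]         # "there" = person's location
    return obj_loc.get(obj)

story = [
    "Jane went to the hallway.",
    "Mary walked to the bathroom.",
    "Sandra went to the garden.",
    "Daniel went back to the garden.",
    "Sandra took the milk there.",
]
print(answer_where(story, "milk"))  # prints "garden"
```

The deep learning version is impressive precisely because no such rules are supplied: the network has to work out on its own that “there” refers to wherever Sandra last was.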

That’s a big deal, as it means the algorithm can actually understand human speech, at least up to a point. That opens the door to much better translation services.


Of course, it is only up to a certain level, as this is factual information. On another level, these kinds of algorithms continue to lag behind human understanding, as they can’t (and for some time yet won’t be able to) understand sentiment or emotions.

The limits of language lie in what we ourselves struggle to understand

And really, that’s not that strange. After all, we barely understand sentiment and emotions.

I mean, if you ask somebody ‘do you know what an emotion is’ almost everybody will say yes. Ask them to then explain an emotion, and they’re stumped.

And that’s because emotions are evolved responses to the world around us that give us an edge we’d otherwise not have.

Understanding that edge and the emotion that gave it to us does not give us a similar edge (or at least didn’t use to, in the environment we evolved in) and is therefore not included in the package that is our brain.

That makes things difficult, as it means it’s very hard for us to explain to computers what an emotion actually is.

After all, how can you explain something that you yourself don’t understand? And since sentiment and emotional understanding are most certainly a part of true language understanding, that leaves us at an impasse.

Do we wait for our understanding of the brain to advance far enough that we can offer an understandable model? Or do we instead go down the same deep learning route as before, where we give computers a lot of examples of an emotion and hope that they learn what that emotion is all by themselves?

That second option is certainly faster than the first one, but it does bring a danger with it. And that is that in effect we’re now asking computers to evolve emotions.


And I’m sure you’ll agree that that might not be a good idea. I mean, what happens when a computer becomes afraid of being switched off?

So that’s where we’re at

Language acquisition is easy for us because it takes place in the very place where language originally evolved, namely the brain.

That means that the brain has evolved for language acquisition as well. We’ve got both the hardware and the software to use it correctly.

Computers don’t have that ‘hardware’ yet. We can give it to them. But in doing so, we’re opening up a whole can of worms where we don’t know if they will simply be able to learn to understand emotions analytically or if instead, we’re now going to have to start worrying about AI systems that might get paranoid, or be given over to bouts of anger.

The problem is that, for companies, the attraction of creating an AI with true language understanding – and the profits to be made from such a device – might be so enticing that they decide not to play it safe and instead tear the lid off the Pandora’s box of emotional acquisition.

Then we’ll have AI that doesn’t just understand us and can speak like us but can feel like us as well. I just hope there is a third option.

Author Bio:

Dina Indelicato is a blogger enthusiast and freelance writer. She is always open to research about new topics and gain new experiences to share with her readers. You can find her on Twitter @DinaIndelicato and Facebook.
