
For many people, voice assistants have long been part of everyday life.

They answer questions, help you find information quickly, play music on command or remind you of appointments.

Friendly voices and programmed answers to funny or philosophical questions could almost give you the impression that you have some kind of relationship.

But only almost.

Because when interacting with technology, people like to humanize objects in order to explain processes that they would otherwise not understand, says Esther Görnemann from the Vienna University of Economics and Business.

“If Cortana doesn't do what I say, it's probably because ‘she doesn't want to’.”

Participants in studies report that Alexa is “offended”, “cheeky” or “charming”, or even “a small family member who sits at the breakfast table in the morning”.

The tendency towards humanization is particularly pronounced among children.


But there is also a social motive for humanizing objects, says Görnemann.

And this is where it becomes interesting with regard to the corona pandemic: "We are trying to compensate for a lack of social ties with other people." Those who are lonely tend to develop social ties to objects.

When children perceive voice assistants as beings

In general, however, you shouldn't worry if you notice that you talk a lot with a digital assistant, says Prof. Arvid Kappas from Jacobs University Bremen.

“We know that solitary confinement is one of the worst things you can demand of people.

If someone has no opportunity to talk to anyone else or to be with others, something like this can happen,” explains the psychologist.

Basically, however, you should try to replenish your stock of social interactions by other means, for example by preferring to phone real people.


Prof. Kappas is not surprised that children, for example, can perceive voice assistants as real beings: “You don't think twice when children talk to their teddy bear for a long time and believe that the teddy bear has a soul.” That children are capable of having complex interactions with inanimate objects is not a new development.

It is just that the latest generation of voice assistants understands language much better than was previously the case.

Nevertheless, at the moment you are still relatively far from being able to have an in-depth conversation with an assistant.

Esther Görnemann shares this view, but believes that this could soon change thanks to technical progress in the field of artificial intelligence (AI): “With GPT-3 we now have an AI that can formulate astonishingly good texts and is at once surprisingly creative and versatile.

Such a good language model is an essential component for a voice assistant with which we can establish a social connection.” It only becomes problematic when people start to replace their human social contacts with assistants.

Voice assistants are not good comforters for the soul


Basically, voice assistants are just another medium for communicating and speeding things up, says Prof. Andreas Dengel, Director of the German Research Center for Artificial Intelligence (DFKI).

As comforters of the soul, on the other hand, they are of little use.

Among other things because they can only feign empathy, and even that only to a limited extent, says Dengel.

“People also need negative conversations in order to be able to feel empathy.

Interpersonal communication is more complex and multi-dimensional than a conversation with a voice assistant could ever be.”

For all their fascination, children shouldn't play too much with voice assistants, as this could impair their ability to communicate, warns Dengel.

“Communication does not consist of language alone; various non-verbal forms of communication are involved, such as facial expressions, gestures or mirroring the other person.

And you just don't learn that with such devices.”

Voice assistant opportunities for seniors

In addition to the risks, Prof. Kappas also sees the opportunities offered by voice assistants.

For older people in particular, they could mean an increase in freedom.

A voice assistant can help as a companion with certain topics, for example by reminding you of appointments or taking medication, says the psychologist.

With tablets or smart displays, you can simply talk through a recipe or be given instructions.
Source: dpa-tmn

"A natural language interface is much more suitable for older people who may not be able to type as well or look at a screen," says Kappas.

You can also simply ask the voice assistant to call someone, without having to look up numbers or type.

For most people, however, dealing with voice assistants is simply playful in nature.

Surveillance and advertising


Voice assistants always carry the risk of surveillance, says Esther Görnemann.

And: “I see it as problematic that we reveal more personal information when we build up a social relationship with our voice assistant.

It happens quite involuntarily, and we may not even be aware of it.”

In the background, manufacturers have already filed patents for picking out advertising-relevant keywords from voice input, says the researcher.

Over a long period of time, the companies learn as much as possible about their customers and can deduce from this, for example, which advertising might work when.

Advertising could then be so individually adapted to situations that you would not even notice that your own behavior was being manipulated, warns Görnemann.

“As long as tech giants examine us down to the smallest detail and this process remains as opaque as it is now, there is a risk that we will behave as the manufacturer wishes without even noticing it.”

Voice assistants are not only at home in smart speakers; smartphones also obey every word.
Source: dpa-tmn