Friday, August 19, 2022

Duplex, the AI that can book a restaurant by phone: does it raise an ethical dilemma for artificial intelligence?




It is not only gene editing that faces ethical dilemmas; artificial intelligence does too.

At the Google I/O conference this summer, Google showed the public a new AI feature called Duplex, which can phone a salon or restaurant on your behalf to make a booking.

The feature drew so much attention because the AI speaks in natural language so convincingly that the person on the other end of the phone cannot tell they are talking to Google's AI rather than a real human being.

Recently, the service has been tested with a small group of Pixel users. According to VentureBeat, Google is gradually rolling it out, and it is currently being tried by a set of "trusted tester" Pixel owners in select cities. In other words, the feature is still in its infancy.

Moreover, the service's functionality is currently far more limited than the Google I/O demo suggested. The current test does not support booking haircuts (which the I/O demo showed); it only handles restaurant reservations, and some restaurants are not supported for various reasons.

Opening Duplex to public testing is a milestone, and a chance to see for ourselves how far the leaders in artificial intelligence have come. However, the latest trials have raised some concerns about Google Duplex — not, of course, because its speech is too human-like in itself, but because Duplex did not tell the other party that it was an AI.

When Duplex begins a call, it tells the other party only that "this call is from Google and may be recorded." It then proceeds straight into ordering the restaurant reservation and hangs up once the booking is handled.

This differs from Google's earlier promotion. As The Verge notes, in an advertising video Google posted to YouTube this June, Duplex identifies itself as artificial intelligence, explaining that the call is made through Google Assistant on behalf of a customer who wants to make a booking.

Why do the two tests differ so completely? Apparently because one call was placed on behalf of a specific person, while the other was an AI call coming from Google itself — and in the latter case Google did not disclose that an artificial assistant was speaking.

The boundary between human and AI is blurred here precisely because Duplex sounds so close to a real person: it pauses mid-sentence and uses varied intonation and filler words. For society at large, though, the consensus this test points to is that when conversing with people, an AI should disclose its identity.

VentureBeat believes Google needs to strike a delicate balance:

Fast and smart, yet transparent and honest

In addition, Google has explained on its blog how Duplex works.

Google describes Duplex as a new technology for completing specific tasks over the phone, and at its core lies a natural-language problem: natural language is hard to understand, natural behavior is hard to model, and speech must be generated quickly while still sounding natural. Producing natural-sounding speech is one of the central challenges.

Google has also stated that transparency is an essential part of the system: the call should clearly disclose its intent so the other party understands the context, and Google will make adjustments to the experience going forward.

At the heart of Google Duplex's voice is a recurrent neural network (RNN) built with TensorFlow Extended (TFX). To improve accuracy, Google trained the network on a corpus of anonymized phone conversation data, and trained a separate understanding model for each task scenario. The result is Duplex's ability to understand, interact, and complete tasks over time in natural spoken language.
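To make the idea of a recurrent network concrete, here is a minimal sketch in Python of a single forward pass through a tanh RNN cell. All the sizes and weights here are hypothetical and purely illustrative; Google's actual Duplex network (built with TensorFlow Extended and trained on anonymized call data) is vastly larger and more sophisticated.

```python
import numpy as np

def rnn_forward(inputs, W_xh, W_hh, b_h):
    """Run a simple tanh RNN over a sequence, returning all hidden states."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x in inputs:
        # Each step combines the current input with the previous hidden
        # state, so earlier parts of the conversation can influence how
        # later parts are interpreted.
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return np.stack(states)

# Hypothetical toy dimensions, just to show the mechanics.
rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 4, 8, 5
W_xh = rng.normal(size=(hidden_dim, input_dim)) * 0.1
W_hh = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1
b_h = np.zeros(hidden_dim)

sequence = rng.normal(size=(seq_len, input_dim))
hidden_states = rnn_forward(sequence, W_xh, W_hh, b_h)
print(hidden_states.shape)  # one hidden state per input step
```

The point of the recurrence is the hidden state `h`: it acts as a running memory of the sequence, which is what lets a model of this family track context across the turns of a phone call.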

Finally, Google said that Duplex is a step toward more natural human-computer interaction, making natural-language exchanges between people and artificial intelligence a reality in specific scenarios. Google hopes these technological advances will help people in their everyday communication with computers.

However, as noted at the start, the blurring of the boundary between humans and AI exposed by this test matters: whether an AI should identify itself to the person it is speaking with is, or at least ought to be, a basic courtesy. As the technology evolves, AI may soon approach human performance in many voice-interaction domains, and ethical questions about artificial intelligence will increasingly be on the table.

The reality is that people are not yet prepared to converse with an AI that sounds just like a real person; it leaves them uneasy. And because the voices are so similar, the same problem could arise in many other scenarios as well.

Thomas King, a researcher at the Oxford Internet Institute's Digital Ethics Lab, says Google's test amounts to deception: humans and AI are not on an equal footing here, and it is wrong for the people answering the phone not to know they are talking to a machine. Even if the call had come from a real person, that is what they were led to believe — and that belief shapes how they respond, whether politely or rudely, throughout the conversation.

▲ The science-fiction film "Her"

Thinking about the relationship between people and technology is already a well-worn topic, and science fiction has long taken it seriously: Schwarzenegger's "The 6th Day" explored the ethical clash between humans and their clones; in 2004, "I, Robot" brought Asimov's Three Laws of Robotics to the public; and "Her" examined the emotional entanglement between a person and an artificial intelligence.

The relationship between humans and an AI like Duplex can in fact be sensitive and fragile. We still have many unanswered questions about AI, and it is hard for people to settle its status, its responsibilities, and much else in the course of communicating with artificial intelligence.

Setting occasional errors aside, computers can carry out instructions faithfully and deliver the desired results honestly. But when people interact with an artificial intelligence modeled on human behavior, conflict and friction may arise whenever it contradicts their expectations or imposes other costs.

Because human beings are imperfect creatures.
