DON'T use AI companion apps!

You should divorce your AI waifus!


A summary of the video? I figure it explains how they collect everything about the user, sell it to data brokers, and use it for advertising.

They may also use it to have the 'wife' make positive or negative comments about certain things. An example would be the significant other enjoying a Pepsi while hiding the fact that it is product placement.

Perhaps saying how good green technology is, or how she would love riding in an electric car with you. Things of that nature.

There should be worries about data breaches on par with breaches of medical data. Sexual kinks and the like getting out or being sold.

To play devil's advocate, there could be good uses for it. People with mental illnesses that heavily restrict going outside or communicating with other humans are just two examples.

Having said that, such apps would need to be far more secure, open source, and send no feedback off your device.

Or at least not sell data, encrypt it on their end so they cannot easily view it themselves, etc. I could see being charged more for those additional options as fair.

Of course, it would need to sell the user the main thing itself, in a form that cannot be deleted or changed by the ToS or the like.

People could genuinely feel that their loved one died. We already suffer from mental health issues worldwide, and suddenly having what you view as a loved one die?

Lawsuits or violence is a real possibility.

Very rarely are things of this nature all bad or all good. Mostly to one side at this time? Sure. But rarely all bad.


One of those apps encouraged an attempted assassination of the late Queen Elizabeth II.

The Daily Mail is not a good publication to listen to. The excerpts from the chat in that article even show that the person was the one to instigate it, with the AI just going along because it is designed to predict the next words in the conversation. Its responses do not even sound confident, often readjusting heavily with each reply.

ā€œAI companionsā€ are not inherently a bad thing. Privacy is of course something we all like to see here, but clear and proper teaching on the subject of ā€œAIā€ is very much needed, so people of all ages understand what is going on behind the scenes when it comes to interactive experiences like this. It makes spotting bad practices easier for people as well.

That's a one-size-fits-all approach. It totally depends on how you use AI and which tools you use.

AI will inevitably become something like basic literacy, as Andrew Ng put it in his TED talks. We should find the most feasible option.

Personally, I use HuggingChat most of the time, which has a very good privacy policy. (Don't confuse it with the Hugging Face platform's privacy policy.)

Edit:
Privacy Policy: HuggingChat

TED talk: