Bing's Revenge



I speculated before that Google search data was the most valuable data because it's the most private data. Googling alone is when people actually tell the truth. If you seek the truth from Google, the only way to attain it is to tell Google the truth in your search query. For instance, you cannot discover the cure for an STD without admitting you have an STD. Google can often tell if someone is pregnant before anyone else based on the user's searches.

But the problem with Google search is the format: people don't write full sentences; they usually just type keywords. So does the person searching "herpes cure" have herpes? Or does their partner? Or are they doing research for a science project or a fictional story? Google cannot always tell from the keywords alone.

Google data is limited in that respect, but Bing's chat data (from its ChatGPT-based assistant) is potentially far more valuable, because people do talk to the chatbot in full sentences, divulging the same private personal information. They speak honestly and tell the truth to their chatbot because they want to receive the truth in return.

Chatbot data is richer than traditional search data because of its format: sincere human communication. It is the user's honest thoughts. You rarely see such raw honesty in data from human-to-human communication on social media or even in private emails. Humans are always lying or performing when speaking to another human. Speaking to their chatbot is the only time a human might be truly honest.

You have to ask your chatbot honest questions to receive honest answers. Unlike in human communication, where dishonesty (white lies, flattery) is rewarded, there is nothing to lose when conversing with an AI. The human user has no reputation at stake and no status to preserve or elevate. They have no reason to flatter ChatGPT, to be nice, or to be politically correct. Humans are bluntly honest with their chatbot, voicing the kind of raw truths they would never say to another human.

A ChatGPT user might say "How can I tell if I have herpes?" then ask "What should I do about it?" and continue to divulge intimate details about themselves to this chatbot, believing the information is private, between just them and the AI. Surely no one at OpenAI or Microsoft would have access to their data...

With data comes power. ChatGPT data could be used to capitalize in the stock market, or simply to serve better-targeted advertisements. Or Microsoft and OpenAI might set their eyes even higher, toward the ultimate prize. It might be precisely this private communication between humans and ChatGPT that creates artificial general intelligence (AGI).

It is not merely that GPTs will be programmed with general intelligence; it is through the reinforcement learning that comes from honest, direct communication with humans that the GPT will learn to think, speak, and communicate like a human. By conversing with ChatGPT, we are training the AI to be human. As the old adage goes, if a product is free, the user is the product. Human chatbot users are helping to create the AGI that will replace them.
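The feedback loop described above can be sketched in a few lines. This is a toy stand-in for the reward models used in reinforcement learning from human feedback (RLHF): humans pick which of two candidate replies they prefer, and the model learns to score replies accordingly. Everything here (the class, the example replies, the scores) is an illustrative assumption, not OpenAI's actual pipeline.

```python
# Toy sketch of learning from human preference feedback (the core idea
# behind RLHF): a model that scores replies by how often humans chose them.
from collections import defaultdict


class ToyRewardModel:
    """Scores candidate replies by accumulated human preference votes."""

    def __init__(self):
        self.preferences = defaultdict(int)

    def record_feedback(self, chosen: str, rejected: str) -> None:
        # A human preferred `chosen` over `rejected` for the same prompt.
        self.preferences[chosen] += 1
        self.preferences[rejected] -= 1

    def score(self, reply: str) -> int:
        return self.preferences[reply]


model = ToyRewardModel()
# Simulated feedback: users reward the honest, direct answer.
model.record_feedback(chosen="See a doctor for a herpes test.",
                      rejected="I cannot discuss that topic.")
model.record_feedback(chosen="See a doctor for a herpes test.",
                      rejected="Search the web for more information.")

candidates = ["I cannot discuss that topic.",
              "Search the web for more information.",
              "See a doctor for a herpes test."]
best = max(candidates, key=model.score)
print(best)  # the reply humans preferred most
```

In a real system the reward model is a neural network trained on millions of such comparisons, and the chatbot is then optimized to maximize that learned reward, which is exactly why honest user feedback is so valuable.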

