Tuesday, November 4, 2025
newshub

Is ChatGPT a robot con artist, and are we suckers for trusting it?

February 17, 2023, 08:23
in Lifestyle
Reading Time: 8 mins read

A few days after Google and Microsoft announced they'd be delivering search results generated by chatbots — artificially intelligent software capable of producing uncannily human-sounding prose — I fretted that our new AI helpers are not to be trusted. After all, Google's own AI researchers had warned the company that chatbots would be "stochastic parrots" (likely to squawk things that are wrong, stupid, or offensive) and "prone to hallucinating" (liable to just make stuff up). The bots, drawing on what are known as large language models, "are trained to predict the likelihood of utterances," a team from DeepMind, the Alphabet-owned AI company, wrote in a paper on the risks of LLMs. "Yet, whether or not a sentence is likely does not reliably indicate whether the sentence is also correct."
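The DeepMind point — likelihood is not truth — can be made concrete with a toy sketch. This is not how any production chatbot works; the tiny corpus and the bigram "model" below are invented purely for illustration. A model that only learns which words tend to follow which will fluently assert whatever was frequent in its training data, true or not.

```python
# Toy illustration only: a bigram "language model" that greedily emits the
# most probable next word. The corpus is invented; note it repeats a false
# claim ("the sun orbits the earth") more often than the true one.
from collections import Counter, defaultdict

corpus = (
    "the moon orbits the earth . "
    "the sun orbits the earth . "
    "the sun orbits the earth . "
).split()

# Count word -> next-word frequencies.
following = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    following[a][b] += 1

def most_likely_continuation(word, steps=4):
    """Greedily append the most probable next word at each step."""
    out = [word]
    for _ in range(steps):
        candidates = following[out[-1]].most_common(1)
        if not candidates:
            break
        out.append(candidates[0][0])
    return " ".join(out)

print(most_likely_continuation("sun"))  # -> sun orbits the earth .
```

The model completes "sun" with "orbits the earth ." with total confidence, because that sequence is the most *likely* one it has seen — which is exactly the failure mode the DeepMind researchers describe, just scaled down from billions of parameters to a few dictionary lookups.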

These chatbots, in other words, are not actually intelligent. They are lying dumbasses.

It didn’t take long for the chatbots themselves to prove the point. An ad last week for Google’s bot, Bard, showed it getting an answer to a question wrong; the company’s stock valuation took a multibillion-dollar hit. The answers that Bing’s bot, Sydney, gave in its open demonstration, meanwhile, have failed to stand up to even rudimentary fact checks.

That seems bad! Online search was already a nonstop battle against spam, search-engine-optimized gobbledygook, and the needs of advertisers. But search engines were nonetheless a societal mitzvah. They brought order to the chaotic dataspace of the internet and stood somewhere between information and knowledge, helping us transduce one into the other. We’ve learned to trust them.

And that’s how they got us. Chatbots are bullshit engines built to say things with incontrovertible certainty and a complete lack of expertise. No wonder the tech elite jokingly call them “mansplaining as a service.” And now they’re going to be driving the primary way humans acquire knowledge day to day.

So why do we buy their bullshit? Even though we know from the get-go that our new robot librarians are deeply flawed, we’re still going to use them millions and millions of times an hour and take action based on the answers they give us. What is it that makes human beings trust a machine we know is untrustworthy?

To be honest, nobody really knows why anyone believes anything. After millennia of debate, the world’s leading philosophers, psychologists, and neuroscientists haven’t even agreed on a mechanism for why people come to believe things, or what beliefs even are. So it’s hard to know how they work, or why one thing is more believable than another. But I have some speculation about why we’re going to fall for ChatGPT’s shtick. We humans love a slick grifter with impressive-sounding credentials. And the bots are only going to get more sophisticated at conning us.

Authority figures

Over the past couple of decades there’s been a lot of research into why people believe misinformation. Most of that work assumed we’d mostly encounter fiction posing as fact in the form of propaganda or social media. But that’s about to change. Misinformation will now be embedded in the search engines we use. And a Facebook post has a lot less credibility than an answer to a question you Googled.

Now, not all beliefs are strongly held, or even based on evidence. So maybe people will treat chatbot answers the way we treat any new information. Joe Vitriol, a political scientist at Lehigh University who studies disinformation, says he expects that people will “accept its output in a biased or self-serving way, like people do with any other source of information.” In other words, people will believe a chatbot if it tells them things that comport with their existing beliefs and opinions — just as they do with traditional Google results. Whether the chatbot is telling the truth or hallucinating won’t really matter. 

The packaging of those answers — in paragraphs not much different from the one you’re reading, but with the Google imprimatur — could tip the balance further toward credulity. We want Google results to be true, because we think of Google as a trusted arbiter, if not an authority. “Naïve users may assume the bot has credibility that human actors don’t have,” Vitriol says. “I wonder if people will be particularly likely to neglect or discount the possibility that the bot, especially in its current form, suffers from the same biases and errors of reasoning as humans.”

This is where I suspect a chatbot’s ability to generate prose, as opposed to a list of useful links, gets dangerous. People transmit beliefs socially, through language. And when a lot of us share a belief system, we form a more cohesive and harmonious group. But that’s a hackable system. Because ideas that are communicated well — using the right words and phrasing and tone — can seem more convincing. The bots use the first-person “I,” even though there’s no person. To a casual reader, Bard’s and Sydney’s answers will come across as human enough, and that means they’ll feel that much truer.

The power of story

Another possible explanation of why we’re suckers for chatbots is that we’re suckers for explanation. At some basic human level, it’s just really, really satisfying to swap befuddlement for certainty. It makes us feel smart, and in control of stuff we have no control over.

The problem is we don’t really know what makes people fall for one explanation over another. Some research suggests the explanations with the most power are those that are simplest and most broadly applicable. Other research has indicated that given a choice, people are more likely to believe stories that contain more detail. (Kieran Healy, a sociologist at Duke University, wrote a paper decrying our tendency to overcomplicate stuff; he titled it “Fuck Nuance.”) And a meta-analysis of 61 papers across five decades of research found that context is what matters most. In emotional areas, a dose of storytelling makes an explanation more believable. In less personal matters, like public policy, people prefer to have facts unadorned by narrative.

“I don’t believe there’s any consensus on what makes an explanation appealing,” says Duncan Watts, a sociologist at the University of Pennsylvania. And that, keep in mind, is from a guy who teaches a course called “Explaining Explanations.”

But whatever that certain je ne sais quoi is, AI chatbots seem to have it. Just a few days before Google and Microsoft announced their impending botification, a team of social scientists at Stanford published a fascinating preprint. They showed thousands of people short persuasive articles on hot-button subjects like assault-weapons bans and carbon taxes. Some versions were written by a GPT-3 chatbot; others by a human. Then the scientists measured how much people changed their opinions based on the articles.

The AI-generated messages, it turns out, were just as convincing as the human ones. But the wild part is why. When the researchers debriefed their human subjects, those who preferred the chatbot articles said the artisanal, human-made messages relied too much on anecdote and imagery. GPT-3 was more evidence-based and well reasoned. The very quality that made the robot less human made humans more likely to believe it. Just like their “Terminator” forebears, the chatbots didn’t feel pity or remorse or fear. And they absolutely did not stop, ever, until the human subjects were convinced. 

You lazy bastard

So the chatbots will lie and get things wrong. My biggest worry is that Google and Bing users will know this and simply won’t care. One theory of why disinformation and fake news spreads is that people are downright lazy. They buy whatever a trusted source is selling. If the chatbots get it mostly right most of the time, that’s good enough. Until, say, your flight doesn’t actually leave at that time from that airport. Or your house catches fire because you installed a light switch wrong.

A few weeks back, I asked Watts, the sociologist, for help on a story about why people believe kooky conspiracy theories. He suggested I read a 25-year-old paper by Alison Gopnik, a psychologist at the University of California at Berkeley, called “Explanation as Orgasm.”

Gopnik is best known for her work on the developmental psychology of children. She says toddlers create mental models of the world using observations to test hypotheses — the scientific method, essentially. But in her paper on explanations, Gopnik suggests that humans have two systems for figuring out how the world works. One is for wondering why things are the way they are — a “hmm system,” if you will. The other is for developing explanations — an “aha system.” Like our biological systems for reproduction and orgasm, Gopnik says, the two cognitive systems are related but separate. We can do one without doing the other. The second one feels good, and it’s a reward for doing the first.

But the aha system is trickable. Psychedelic experiences can induce a feeling that “everything makes sense,” even if they don’t produce an articulable explanation for how. Dreams can do it, too. That’s why when you snap awake at 3 a.m. and write yourself a note to remember some brilliant insight that came to you in your sleep, your scribbles make no sense the next morning.

In other words, the feeling of having something that looks like an explanation can feel so damn good, it can overwhelm the part of our mind that had the question in the first place. We mistake an answer for the answer. 

It’s not supposed to be that way. In 1877 a philosopher named William Clifford wrote an article called “The Ethics of Belief” in which he argues that belief has to come from patient investigation, not just the suppression of doubt. Our ideas are common property, he insists — an “heirloom” handed down to subsequent generations. It is “an awful privilege, and an awful responsibility, that we should help to create the world in which posterity will live.”

The temptation to dodge that responsibility is powerful. Clifford, like Gopnik, understood that explanations feel good even when they’re wrong. “It is the sense of power attached to a sense of knowledge that makes men desirous of believing, and afraid of doubting,” Clifford argues. Witness the race to explain all the unidentified objects being shot down over Saskatchewan. Better to believe in aliens than to live in fear of the unknown.

Clifford offers an antidote for this temptation. His response is basically: Not today, Satan. “The sacred tradition of humanity,” he says, “consists, not in propositions or statements which are to be accepted and believed on the authority of the tradition, but in questions rightly asked, in conceptions which enable us to ask further questions, and in methods of answering questions.” 

The bots will offer us easy answers. We just have to remember that’s not what we should be asking for.

Source: Insider

© 2023-2025 MSTRpay AB