Beijing did a test run in Taiwan using AI-generated content to influence voters away from a pro-sovereignty candidate

China will attempt to disrupt elections in the US, South Korea and India this year with artificial intelligence-generated content after making a dry run with the presidential poll in Taiwan, Microsoft has warned.

The US tech firm said it expected Chinese state-backed cyber groups to target high-profile elections in 2024, with North Korea also involved, according to a report by the company’s threat intelligence team published on Friday.

“As populations in India, South Korea and the United States head to the polls, we are likely to see Chinese cyber and influence actors, and to some extent North Korean cyber actors, work toward targeting these elections,” the report reads.

Microsoft said that “at a minimum” China will create, and distribute through social media, AI-generated content that “benefits their positions in these high-profile elections”.

  • rayyy@lemmy.world · 1 year ago

    TikTok

    TikTok seems so innocent, but people do not understand how nefariously TikTok can be used. Most folks are defenseless targets because they are not attuned to how psyops have already been used against them successfully.

    • jaschen@lemm.ee · 1 year ago

      It’s super targeted. I really wish they would ban it here like they are trying in the US.

      • TWeaK@lemm.ee · 1 year ago

        It’s super targeted.

        Exactly, and that’s also why it’s so effective. You can say whatever lies you like if you only say them to the people who will lap them up, and there’s no chance for anyone else to correct them because they never even hear the lie.

      • Bernie_Sandals@lemmy.world · 1 year ago

        It’s a lot harder to add context with only 60 seconds of video, and stripping context is one of the easiest ways to flip a narrative. There are aspects of the TikTok format that make it uniquely dangerous.