aircooled
aircooled MegaDork
4/11/24 5:02 p.m.

As a sort of add-on to the topic about modern media (and a bit of a side topic to the Ukraine war thread), I thought I would share something I ran across. It is not posted by a news outlet (as far as I could tell).

I certainly don't expect many here to be solidly in defense of the current state of social media (!), but many may not be aware of what is likely heavily influencing it.  

 

Of note: there are clearly some VERY contentious political / social issues / names mentioned in the post, which of course is exactly why these operations are so successful. Discussing those is not the point of this post, and if they upset you, that is an example of what is being pointed out, so PLEASE do not comment on them.

 

I present this as information I think many will find useful, that many should know, and that some might draw on when discussing such things with their kids, etc.

 

--------------------------

You're being targeted by disinformation networks that are vastly more effective than you realize. And they're making you more hateful and depressed.


You know that Russia and other governments try to manipulate people online.  But you almost certainly don't know just how effectively orchestrated influence networks are using social media platforms to make you -- individually -- angry, depressed, and hateful toward each other. Those networks' goal is simple: to cause Americans and other Westerners -- especially young ones -- to give up on social cohesion and to give up on learning the truth, so that Western countries lack the will to stand up to authoritarians and extremists.

And you probably don't realize how well it's working on you.

This is a long post, but I wrote it because this problem is real, and it's much scarier than you think.

How Russian networks fuel racial and gender wars to make Americans fight one another

In September 2018, a video went viral after being posted by In the Now, a social media news channel. It featured a feminist activist pouring bleach on a male subway passenger for manspreading. It got instant attention, racking up millions of views and widespread outrage across social media. Reddit users wrote that it had turned them against feminism.

There was one problem: The video was staged. And In the Now, which publicized it, is a subsidiary of RT, formerly Russia Today, the Kremlin TV channel aimed at foreign, English-speaking audiences.

As an MIT study found in 2019, Russia's online influence networks reached 140 million Americans every month -- the majority of U.S. social media users. 

Russia began using troll farms a decade ago to incite gender and racial divisions in the United States 

In 2013, Yevgeny Prigozhin, a confidant of Vladimir Putin, founded the Internet Research Agency (the IRA) in St. Petersburg. It was the Russian government's first coordinated facility to disrupt U.S. society and politics through social media.

Here's what Prigozhin had to say about the IRA's efforts to disrupt the 2022 U.S. midterm elections:

Gentlemen, we interfered, we interfere and we will interfere. Carefully, precisely, surgically and in our own way, as we know how. During our pinpoint operations, we will remove both kidneys and the liver at once.

In 2014, the IRA and other Russian networks began establishing fake U.S. activist groups on social media. By 2015, hundreds of English-speaking young Russians worked at the IRA.  Their assignment was to use those false social-media accounts, especially on Facebook and Twitter -- but also on Reddit, Tumblr, 9gag, and other platforms -- to aggressively spread conspiracy theories and mocking, ad hominem arguments that incite American users.

In 2017, U.S. intelligence found that Blacktivist, a Facebook and Twitter group with more followers than the official Black Lives Matter movement, was operated by Russia. Blacktivist regularly attacked America as racist and urged Black users to reject major candidates. On November 2, 2016, just before the 2016 election, Blacktivist's Twitter account urged Black Americans: "Choose peace and vote for Jill Stein. Trust me, it's not a wasted vote."

Russia plays both sides -- on gender, race, and religion

The brilliance of the Russian influence campaign is that it convinces Americans to attack each other, worsening misandry and misogyny, mutual racial hatred, and extreme antisemitism and Islamophobia alike. In short, it's not just an effort to boost the right wing; it's an effort to radicalize everybody.

Russia uses its trolling networks to aggressively attack men.  According to MIT, in 2019 the most popular Black-oriented Facebook page was the charmingly named "My Baby Daddy Aint E36 M3."  It regularly posted memes attacking Black men and government welfare workers.  It served two purposes: make poor Black women hate men, and goad Black men into flame wars.

MIT found that My Baby Daddy is run by a large troll network in Eastern Europe likely financed by Russia.

But Russian influence networks are also aggressively misogynistic and aggressively anti-LGBT.

On January 23, 2017, just after the first Women's March, the Internet Research Agency began a coordinated attack on the movement, as the New York Times later found.  Per the Times:

More than 4,000 miles away, organizations linked to the Russian government had assigned teams to the Women’s March. At desks in bland offices in St. Petersburg, using models derived from advertising and public relations, copywriters were testing out social media messages critical of the Women’s March movement, adopting the personas of fictional Americans.

They posted as Black women critical of white feminism, conservative women who felt excluded, and men who mocked participants as hairy-legged whiners.

But the Russian PR teams realized that one attack worked better than the rest: they accused the march's co-founder, Arab American Linda Sarsour, of being an antisemite.  Over the next 18 months, at least 152 Russian accounts regularly attacked Sarsour.  That may not seem like many accounts, but it worked: they drove the Women's March movement into disarray and eventually crippled the organization.

Russia doesn't need a million accounts, or even that many likes or upvotes.  It just needs to get enough attention that actual Western users begin amplifying its content.   

A former federal prosecutor who investigated the Russian disinformation effort summarized it like this:

It wasn’t exclusively about Trump and Clinton anymore.  It was deeper and more sinister and more diffuse in its focus on exploiting divisions within society on any number of different levels.

As the New York Times reported in 2022, 

There was a routine: Arriving for a shift, [Russian disinformation] workers would scan news outlets on the ideological fringes, far left and far right, mining for extreme content that they could publish and amplify on the platforms, feeding extreme views into mainstream conversations.

China is joining in with AI

Last month, the New York Times reported on a new disinformation campaign.  "Spamouflage" is an effort by China to divide Americans by combining AI with real images of the United States to exacerbate political and social tensions in the U.S.  The goal appears to be to cause Americans to lose hope, by promoting exaggerated stories with fabricated photos about homeless violence and the risk of civil war.

As Ladislav Bittman, a former Czechoslovakian secret police operative, explained about Soviet disinformation, the strategy is not to invent something totally fake.  Rather, it is to act like an evil doctor who expertly diagnoses the patient’s vulnerabilities and exploits them, “prolongs his illness and speeds him to an early grave instead of curing him.”

The influence networks are vastly more effective than platforms admit

Russia now runs its most sophisticated online influence efforts through a network called Fabrika.  Fabrika's operators have bragged that social media platforms catch only 1% of their fake accounts across YouTube, Twitter, TikTok, Telegram, and other platforms.

But how effective are these efforts?  By 2020, Facebook's most popular pages for Christian and Black American content were run by Eastern European troll farms tied to the Kremlin. And Russia doesn't just target angry Boomers on Facebook. Russian trolls are enormously active on Twitter. And even on Reddit.

It's not just false facts

The term "disinformation" undersells the problem.  Because much of Russia's social media activity is not trying to spread fake news.  Instead, the goal is to divide and conquer by making Western audiences depressed and extreme. 

Sometimes, through brigading and trolling.  Other times, by posting hyper-negative or extremist posts or opinions about the U.S. and the West over and over, until readers assume that's how most people feel.  And sometimes, by using trolls to disrupt social media threads that advance Western unity.  

As the RAND think tank explained, the Russian strategy is volume and repetition, from numerous accounts, to overwhelm real social media users and create the appearance that everyone disagrees with, or even hates, them.  And it's not just low-quality bots.  Per RAND,

Russian propaganda is produced in incredibly large volumes and is broadcast or otherwise distributed via a large number of channels. ... According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.

What this means for you

You are being targeted by a sophisticated PR campaign meant to make you more resentful, bitter, and depressed.  It's not just disinformation; it's also real-life human writers and advanced bot networks working hard to shift the conversation to the most negative and divisive topics and opinions. 

It's why some topics seem to go from non-issues to constant controversy and discussion, with no clear reason, across social media platforms.  And a lot of those trolls are actual, "professional" writers whose job is to sound real. 

So what can you do?  To quote WarGames:  The only winning move is not to play.  The reality is that you cannot distinguish disinformation accounts from real social media users.  Unless you know whom you're talking to, there is a genuine chance that the post, tweet, or comment you are reading is an attempt to manipulate you -- politically or emotionally.

Here are some thoughts:

  • Don't accept facts from social media accounts you don't know.  Russian, Chinese, and other manipulation efforts are not uniform.  Some will make deranged claims, but others will tell half-truths.  Or they'll spin facts about a complicated subject, be it the war in Ukraine or loneliness in young men, to give you a warped view of reality and spread division in the West.  
  • Resist groupthink.  A key element of manipulation networks is volume.  People are naturally inclined to believe statements that have broad support.  When a post gets 5,000 upvotes, it's easy to think the crowd is right.  But "the crowd" could be fake accounts, and even if they're not, the brilliance of government manipulation campaigns is that they say things people are already predisposed to think, exacerbating confirmation bias.  They'll tell conservative audiences something misleading about a Democrat, or make up a lie about Republicans that catches fire on a liberal server or subreddit.
  • Don't let social media warp your view of society.  This is harder than it seems, but you need to accept that the facts -- and the opinions -- you see across social media are not reliable.  If you want the news, do what everyone online says not to: look at serious, mainstream media.  It is not always right.  Sometimes, it screws up.  But social media narratives are heavily manipulated by networks whose job is to ensure you are deceived, angry, and divided.
Snowdoggie (Forum Supporter)
Snowdoggie (Forum Supporter) UltraDork
4/11/24 5:14 p.m.

Good post!

Knowing what social media is doing to you and is trying to do to you is the first step to stopping them.

I find myself spending less time on social media and being happier because of it. I haven't even logged into my Twitter/X account in months. It's become a very toxic place. Going to a social media site is like going to a bar. If the surroundings are nice, the whiskey is good and the company is agreeable, I stay. If it's the electronic equivalent of an outlaw biker bar where the booze is watered down, people are fighting and somebody just threatened me with a knife, it's time to leave and not look back.

RevRico
RevRico MegaDork
4/11/24 5:22 p.m.

Just remember, it's not just the Russians or any other foreign boogeymen, but the American agencies too. Everybody likes to forget the lengths our own government has gone to, and continues to go to, to leave us arguing with ourselves instead of building guillotines to go after them.

pointofdeparture
pointofdeparture UltimaDork
4/11/24 5:31 p.m.
RevRico said:

Just remember, it's not just the Russians or any other foreign boogeymen, but the American agencies too. Everybody likes to forget the lengths our own government has gone to, and continues to go to, to leave us arguing with ourselves instead of building guillotines to go after them.

This is something I think about often with regard to the push to ban TikTok because it supposedly represents some kind of Chinese disinformation machine. It might be exactly that, but a ban will do just about nothing to combat disinformation on a meaningful level. It's really just an economic protection move.

If you think that X/Twitter and Meta are innocent just because they're homegrown, you do not have your thinking cap on. They all want to suck you in however they can so that you keep coming back and using their platform to make them money.

If you're constantly on Facebook getting enraged and arguing with people until you're blue in the face, they have you hooked...why would they want to change anything so that you spend less time on their service?

SV reX
SV reX MegaDork
4/11/24 5:39 p.m.

In reply to pointofdeparture :

I don't know about that.  The one big difference I've heard is that TikTok can log keystrokes both inside and outside the app.  That would include things like passwords.  I don't think some of the other social media sites are set up quite the same way. 
 

(I am not an authority on this)

dean1484
dean1484 MegaDork
4/11/24 5:40 p.m.

Being about your age, Air, this is not news. It is a very well written and concise summary of a big problem that is boiling under the surface of society.

This is a highly important issue but I am betting this thread gets locked rather quickly. This is why I gave up paying any attention to social media of any kind quite some time ago. I am much happier and a better person now. 

pointofdeparture
pointofdeparture UltimaDork
4/11/24 6:01 p.m.
SV reX said:

In reply to pointofdeparture :

I don't know about that.  The one big difference I've heard is that TikTok can log keystrokes both inside and outside the app.  That would include things like passwords.  I don't think some of the other social media sites are set up quite the same way. 
 

(I am not an authority on this)

I work in tech. TikTok has access to the exact same information as every other app on your phone.

All of the social media apps do it. Google, Meta and X just do a lot of lobbying to make TikTok the boogeyman to distract people from the fact that they are just as bad.

Consumer Reports article on how TikTok was actually a latecomer to these methods

Article about how Facebook and Instagram inject extra code to track your activity when you use their in-app browsers

Heavy Facebook app users might have noticed that Facebook has repeatedly pushed you to turn on a "link history" feature, which enables enhanced tracking. It's sold to you as a convenience feature, but it's really a thinly veiled means for them to see what you're doing in greater detail.

If you care about privacy and security, you should not be using any social media app, period.

Snowdoggie (Forum Supporter)
Snowdoggie (Forum Supporter) UltraDork
4/11/24 6:02 p.m.

I stay away from TikTok. I just don't trust the place.

aircooled
aircooled MegaDork
4/11/24 6:07 p.m.
pointofdeparture said:
 

....If you think that X/Twitter and Meta are innocent just because they're homegrown, you do not have your thinking cap on. They all want to suck you in however they can so that you keep coming back and using their platform to make them money....

This is certainly not to discount that aspect, but I feel pretty confident that the manipulation China and Russia are doing is on a very different, far more dangerous level.  They are literally trying to tear the country apart and destroy it from within!

I find the social / mental manipulation by China (TikTok) far more of a threat than the information-gathering aspect (which most seem to focus on).  I am actually surprised there is no counter effort (not that I know of) by US intelligence agencies (within the US, that is).

Far easier to "win" when you can have your enemy do the fighting for you.  Russia has a lot of history and practice with this BTW.

pointofdeparture
pointofdeparture UltimaDork
4/11/24 6:15 p.m.

In reply to aircooled :

Oh, I agree with you, my perspective is ultimately just that the domestic companies' hands aren't clean either and you should have your guard up on every platform you choose to interact with. TikTok is factually a Chinese state apparatus to some degree. But X and Facebook also facilitate the same kind of manipulation because they profit off of the traffic it brings to their platforms, so they are just as dangerous IMO.

Snowdoggie (Forum Supporter)
Snowdoggie (Forum Supporter) UltraDork
4/11/24 6:24 p.m.

Amazon just wants to sell me crap. Sometimes I want to buy the crap they want to sell me and they can deliver it to my house in a day.

The Chinese have spies in this country, and they mess with their own citizens here as well.

Putin is just evil. I don't know how else to describe him. I want nothing to do with Putin or his supporters.

aircooled
aircooled MegaDork
4/11/24 6:24 p.m.

In reply to pointofdeparture :

I guess you could say:  China and Russia are trying to destroy the US from the inside on purpose.  US social media companies are doing it as an unintended consequence of trying to drive their product / advertising etc.

alfadriver
alfadriver MegaDork
4/11/24 6:26 p.m.

After all of these years of not being part of social media other than some very subject-specific message boards.....

alfadriver
alfadriver MegaDork
4/11/24 6:30 p.m.
aircooled said:

In reply to pointofdeparture :

I guess you could say:  China and Russia are trying to destroy the US from the inside on purpose.  US social media companies are doing it as an unintended consequence of trying to drive their product / advertising etc.

So the "samething" isn't actually the same thing.  

Kreb (Forum Supporter)
Kreb (Forum Supporter) PowerDork
4/11/24 7:05 p.m.

As Ladislav Bittman, a former Czechoslovakian secret police operative, explained about Soviet disinformation, the strategy is not to invent something totally fake.  Rather, it is to act like an evil doctor who expertly diagnoses the patient’s vulnerabilities and exploits them, “prolongs his illness and speeds him to an early grave instead of curing him.”

 

That reminds me of an article I read recently that chronicled a Russia-based journalist's long bout with an unknown illness. Finally, doctors in Berlin figured out that she'd been poisoned - not to kill, but to disable. They mentioned that this has happened repeatedly. When someone dies of a poisoning, there's bad publicity. But when your enemies have chronic, mysterious, debilitating illnesses, it can be even more effective.

GameboyRMH
GameboyRMH MegaDork
4/11/24 7:34 p.m.

In reply to Kreb (Forum Supporter) :

A very suspicious number of "troublemakers" under Soviet-aligned regimes died of specific forms of cancer, often with oddly similar timing. There have been pretty heavy rumors that this was inflicted by irradiation devices hidden in the walls of interrogation rooms and the like, but no hard evidence...although that was a technique they were known to use:

https://nsarchive.gwu.edu/briefing-book/intelligence-nuclear-vault-russia-programs/2022-09-22/moscow-signals-declassified

spitfirebill
spitfirebill MegaDork
4/11/24 8:21 p.m.

In reply to GameboyRMH :

That falls nicely into line with the new "Havana Syndrome".

OHSCrifle
OHSCrifle UberDork
4/11/24 8:44 p.m.

I think social media and news media are both trying to baffle us with bullE36 M3, make us fight.

I turned off notifications on all social media and even Teams at work. My mental health is much improved.

MrJoshua
MrJoshua UltimaDork
4/11/24 9:36 p.m.

orrrrr-are we being manipulated to hate each other by our own government and blame it on the Russians/Chinese?

nderwater
nderwater UltimaDork
4/11/24 10:39 p.m.

It's not just social media--I scoff at some half-truth or misinformation in major media stories almost every day. It's shocking how much spin is constantly going on around us.

aircooled
aircooled MegaDork
4/11/24 10:59 p.m.

Yeah, that was kind of covered in the NPR thread.  That is why I wanted to make sure to highlight what I see as a major issue in social media (certainly not the only one BTW).
 

In reply to MrJoshua :

Orrrr... we are being manipulated to hate each other by the Russians and Chinese so that people will think that the government is trying to make us think we are being manipulated by the Russians and Chinese...

... never go in against a Sicilian when death is on the line!....

alfadriver
alfadriver MegaDork
4/11/24 11:38 p.m.

Here's what's really sad....  we are all more alike than not.  And I'm talking about ordinary people all over the world.

We want a good home (and a secure one).  We want a job.  We want opportunities for our kids.  We want to live a decent life.  

The hard part is when we get into a good situation and want to hoard it to ourselves.  Or if we get in a bad situation, we want to blame others for it (and it may be true, but often enough, it's not).

And because of that, we are open to manipulation for the goals of others.

ShawnG
ShawnG MegaDork
4/11/24 11:57 p.m.

I got rid of cable over ten years ago.

I stopped watching the news 5 years ago.

I gutted my social media stuff shortly after.

Left the city two years ago.

My mental health is drastically better.

My wife still watches all the cop shows on television.

When you stop watching that garbage for a while, you realize that every time you turn on the TV, the show is usually: 1) a fictional show about people trying to kill each other, 2) news that only focuses on bad stuff, 3) "sports" in which two big apes beat the hell out of each other, or 4) a show that makes you feel "at least I'm not as fat/lazy/ugly/stupid as that person".

No wonder society is so damn messed up.

I'm done having my mind poisoned by it.

Boost_Crazy
Boost_Crazy Dork
4/12/24 1:56 a.m.

In reply to aircooled :

I touched on this in the other thread, mentioning that as bad as the media is, social media is way worse. We are allowing our enemies access, and they are playing us against each other like puppets. I just had another talk with my kids about it the other day. They are actively targeting our children, trying to sow discontent with our way of life. Kids get exposed to some crazy ideas, which are normalized to them.

I heard one commentator say it's like a Trojan Horse, except it's made of glass and you can see the enemy hiding inside. And we are bringing it inside the gates anyway, because free horse. 

Hungary Bill (Forum Supporter)
Hungary Bill (Forum Supporter) PowerDork
4/12/24 2:32 a.m.

I have this theory that I've posted on social media, but I'm not sure whether I've posted it here or not.  Here goes:

My theory is that every modern age has brought with it some form of "pollution".

The industrial age with its coal burning and steam engine use turned our cities into literal cesspits.

The transportation age that followed dug up and burned MASSIVE amounts of fuels that literally turned the air brown.

and so on, and so on...

We're now in the information age, and with it has come the pollution of "misinformation".

Much like how industries have largely switched to electricity (and in some cases, renewable energy), and much like how we created emissions controls and emission-controlling devices, we will eventually come up with ways to limit this new pollution.

The thing is, in the short term it's just so much more profitable to massively pollute.  It's cheaper, it moves you forward faster, and it makes you more money.  The problem is that in the long term, it's just not sustainable.  Eventually the Cuyahoga River catches fire and you end up with "Earth Day".  Eventually you get a hole in the ozone layer, and you lose your rad hair style.  Eventually Los Angeles turns brown and you get AdBlue...

This age is relatively new, we have a lot of work ahead of us, the bad guys are ahead of the game, but we'll get there.  We always do.

 
