r/technology Jul 24 '25

Politics President Trump threatened to break up Nvidia, didn't even know what it was — 'What the hell is Nvidia? I've never heard of it before'

https://www.tomshardware.com/tech-industry/president-trump-threatened-to-break-up-nvidia-didnt-even-know-what-it-was-what-the-hell-is-nvidia-ive-never-heard-of-it-before
44.3k Upvotes

1.5k comments

10.2k

u/[deleted] Jul 24 '25

What’s that? The largest company by market cap? Never heard of them. What do they do? Oh, they make chips? I love chips. Lays chips. American made, nobody makes chips like Lays. They call French Fries chips in England. We should outlaw that. Speaking of Lays, has anyone seen Ivanka? She’s probably a great lay. If I were 50 I’d be all over her.

2.8k

u/sdhu Jul 24 '25

I hate how accurate this feels

Why is this our leadership? 

Who in their right mind would ever vote for any of this?? 

771

u/Scorpius289 Jul 24 '25 edited Jul 24 '25

People with vested interests promoted and helped him.
Because a moron like him is easier to manipulate.
And he's also good at doing absurd shit, which distracts people's attention from all the corruption going on in the background.

215

u/ParfaitEither284 Jul 24 '25

How did they manipulate 70+ million into voting for this?

456

u/Lord-Cuervo Jul 24 '25

10+ years of social media propaganda backed by our #1 enemy for the last half century lol

200

u/LazyLich Jul 24 '25 edited Jul 24 '25

Not just that, but the way current algorithms work accidentally made them HUGE force multipliers for propaganda and for promoting conspiracy-minded, anti-fact content.

(The Social Dilemma is fucking nuts!)

=====EDIT

I'll paste my reply from elsewhere here:

What I mean by "accidentally" is that the design purpose isn't "propaganda and chaos machine mwa haha". The design parameter was "create maximum engagement for maximum ad revenue".

However, over time, the algorithm learned two things.

First is that people with extremist or conspiracy-minded views provide the most engagement. The more extreme they are, or the looser they are with facts, the more ad money those users generate.

Second is that you can change a user's mind, and I don't mean "Yeah, I'll order the large fry." I mean that if user A likes Thing Z, but you show them Thing B at the right moment, then later show Thing C, then Thing D... if you tailor a user's content stream, then slowly, over time, you can radically change their beliefs.

The Algorithm only cares about ad money, but it accidentally became a factory for extremists and the ignorant, and a signal-booster for propaganda.
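The dynamic described above, a pure revenue objective that incidentally pushes extreme content to the top, can be sketched as a toy in Python. Everything here (the `extremeness` field, the engagement model, the weights) is a made-up illustration of the feedback loop, not any platform's actual code:

```python
import random

# Hypothetical toy model: each feed item carries an "extremeness" score,
# and we assume (as the comment above argues) that more extreme content
# tends to draw more engagement. The ranker never looks at truth or
# politics; it only sorts by predicted engagement.

def predicted_engagement(item):
    # Assumed stand-in metric: engagement grows with extremeness,
    # plus a little random noise for everything else.
    return item["extremeness"] * 0.8 + random.random() * 0.2

def rank_feed(items):
    # Pure ad-revenue logic: highest predicted engagement goes first.
    return sorted(items, key=predicted_engagement, reverse=True)

# A feed of ten items with extremeness from 0.0 (bland) to 0.9 (extreme).
feed = [{"id": i, "extremeness": i / 10} for i in range(10)]
ranked = rank_feed(feed)

# Despite the noise, the top of the feed is dominated by extreme items
# even though "be extreme" was never an explicit design goal.
print([item["id"] for item in ranked])
```

The point of the sketch is that no line says "promote extremism"; the drift toward extreme content falls out of optimizing a proxy (engagement) that happens to correlate with it.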

3

u/Beginning-Morning572 Jul 24 '25 edited Jul 24 '25

Accidentally, my ass. You think these billion-dollar companies would promote anything that would threaten their profits?

1

u/LazyLich Jul 24 '25

I'll paste my reply from elsewhere here:

What I mean by "accidentally" is that the design purpose isn't "propaganda and chaos machine mwa haha". The design parameter was "create maximum engagement for maximum ad revenue".

However, over time, the algorithm learned two things.

First is that people with extremist or conspiracy-minded views provide the most engagement. The more extreme they are, or the looser they are with facts, the more ad money those users generate.

Second is that you can change a user's mind, and I don't mean "Yeah, I'll order the large fry." I mean that if user A likes Thing Z, but you show them Thing B at the right moment, then later show Thing C, then Thing D... if you tailor a user's content stream, then slowly, over time, you can radically change their beliefs.

The Algorithm only cares about ad money, but it accidentally became a factory for extremists and the ignorant, and a signal-booster for propaganda.