Reminds me of our President (SA) asking his colleagues to sign a pledge not to engage in corruption, when the oath of office already commits them to the same thing.
For those who live in the red-pilled real world, just don't trade on something where controlling insiders (also with a potential conflict-of-interest) can beat you at the game. This is different from bets with non-controlling insiders with no conflict-of-interest.
The key is avoiding the bets with controlling insiders, i.e. those that could have a potential conflict of interest. Even something as banal as weather data has some insider knowledge, but an insider has no practical control over it, i.e. the insider is non-controlling, with no conflict-of-interest.
Weather data in prediction markets can definitely be gamed. One example from real prediction markets: the contract specifies a single source as the source of truth, but that source rounds data twice during unit conversion (F -> C -> F). The result is an uneven probability distribution in which some numbers have a 0% chance of winning.
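A minimal sketch of that double-rounding effect, assuming (hypothetically) that the source rounds to whole degrees at each conversion step — certain Fahrenheit values become unreachable, so a contract on exactly those values can never pay out:

```python
def f_to_c_to_f(f: int) -> int:
    """Round-trip a Fahrenheit temperature through whole-degree Celsius."""
    c = round((f - 32) * 5 / 9)   # source rounds once: F -> whole C
    return round(c * 9 / 5 + 32)  # and rounds again: C -> whole F

# Which Fahrenheit values can ever come out of the round trip?
reachable = {f_to_c_to_f(f) for f in range(0, 101)}
unreachable = sorted(set(range(0, 101)) - reachable)
print(unreachable)  # e.g. 33 F is in this list: no whole Celsius value
                    # maps back to 33 F, so it can never be reported
```

Under these assumed rounding rules, a bet on exactly 33 °F is structurally unwinnable regardless of the actual weather — which is exactly the kind of edge a reader of the contract's fine print has over everyone else.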
Take a community with AI moderation like Reddit, where I've been a participant for years. With the recent push to AI autocorrect and moderation, you can see the changes in language: new words, new ways of speaking, unconsciously editing yourself because you don't want to draw the eye of the bot. It doesn't feel subtle. It feels Orwellian.
It's particularly egregious on YouTube, where people frequently use words like "unalived" or "self-deleted" instead of "murdered" or "committed suicide", lest they incur the wrath of the almighty algorithm.
That seems to me to be an example where the language is forced to change but the thoughts remain the same. Sure, people are using the "safe" terms, but they're using them to continue to subvert the rules, not to bow to them.
The problem is when that vernacular extends into regular life. I haven’t noticed it yet with unalive, but I’m sure there will come a day. Eventually if the censors continue suppressing the word suicide, we will end up with unalive taking suicide’s place both online and offline. Then, the censors will censor unalive, and a new word will be coined, and the cycle continues.
> On Friday, a social media user tweeted an image from the Nirvana exhibit at the Museum of Pop Culture in Seattle. A placard dedicated to the “27 Club” read, “Kurt Cobain un-alived himself at 27.”
I'm not fully comfortable with the shift in language either, but my point is that, even if the language is changed, the thoughts will remain. To use 1984 (is there a Godwin's law equivalent for this now?), the party taught that 2 + 2 = 5, which is changing thought. Social media is trying to do that, but failing. The danger is if it's one day effective, but to date it hasn't been.
YouTube comments are a separate genre unto themselves. Due to YouTube's moderation policy, music video comments are all the same: the same tired jokes, the same patterns. Not AI slop per se, but it feels the same.
I recently had a comment removed by Reddit. It wasn't even against the rules; it was just anti-establishment. I insulted the billionaire class in that comment. Class-division-style comments are now banned. Wouldn't want revolution on a for-profit forum now, would we?
I can hear the lawyers huddled around a conference table rolling the bones and chanting the sacred words to come up with that "get out of trouble free" card. It told your son he had terminal cancer and should kill himself... sorry, it clearly says for Entertainment Purposes only.
Considering that they aren't properly separating the two groups, I don't see this "response" as anything but a weak excuse to do what they wanted to do anyway.
Core to the problem is that Roblox’s social media features allow pedophiles to efficiently target hundreds of children, with no up-front screening to prevent them from joining the platform.
For example, in 2018, prior to Roblox going public, a 29-year-old was caught by police with 175 hours of video footage of him grooming and engaging in explicit behavior with 150 minors using online platforms, namely Roblox.
Media and non-profit exposés from 2020 to July 2024 revealed digital strip clubs, red light districts, sex parties and child predators lurking on Roblox. The National Center on Sexual Exploitation in 2024 labeled Roblox “a tool for sexual predators, a threat for children’s safety”.
Numerous criminal indictments from 2019-2024 allege that sexual predators groomed children in-game, ranging from 8-14 years old, then kidnapped, raped or traded sexual content with them.
Following years of scandals, we performed our own checks to see if the platform had cleaned up its act. As a test, we attempted to set up an account under the name ‘Jeffrey Epstein’…only to see the name was taken, along with 900+ variations.
Many were Jeffrey Epstein fan accounts, including “JeffEpsteinSupporter”, which had earned multiple badges for spending time in kids’ games. Other Jeff Epstein accounts had the usernames “@igruum_minors” [I groom minors] and “@RavpeTinyK1dsJE” [rape tiny kids].
We attempted to set up a Roblox account under the name of another notorious pedophile to see if Roblox had any up-front pedophile screening: Earl Brian Bradley was indicted on 471 charges of molesting, raping and exploiting 103 children. The username was taken, along with multiple variants like earlbrianbradley69.
After we found a username, we listed our age as “under 13” to see if children are being exposed to adult content. By merely plugging ‘adult’ into the Roblox search bar, we found a group called “Adult Studios” with 3,334 members openly trading child pornography and soliciting sexual acts from minors.
We tracked some of the members of “Adult Studios” and easily found 38 Roblox groups – one with 103,000 members – openly soliciting sexual favors and trading child pornography.
The chatrooms trading in child pornography had no age restrictions. Roblox reports that 21% of its users are under the age of 9, a number that is likely an underestimate given that Roblox has no age verification except for users seeking 17+ experiences.
Registered as a child, we were also able to access games like “Escape to Epstein Island” and “Diddy Party”. We found over 600 “Diddy” games, including “Survive Diddy” and “Run From Diddy Simulator”.
Since September 2nd, 2024, third-party monitor ‘Moderation For Dummies’ has reported ~12,400 erotic roleplay accounts on Roblox. These include everything from “rape/forceful sex fetishes” to underage users “willing to do anything for Robux”.
Users seeking sexual experiences on Roblox are so pervasive that there are thousands of Roblox sex videos on porn sites, inviting users of unknown ages to make explicit content on the platform.
We tested out Roblox’s experiences to see what else kids were being exposed to. We quickly encountered images of male genitalia and hate speech in Roblox’s “school simulator” game, which had registered 28.9 million visits with no age restrictions.
If this data ends up in the ad tech industry, then knowing how that industry operates, I wouldn't be surprised to see a foreign adversarial nation buying the Social Security data from ad tech firms (or from this DOGE person in general, either directly or through multiple layers), even covertly, at this point.
Either way this data is definitely going to spread behind closed doors.
I disagree - it's 100% a factor of how much money you have to pay in legal fees.
Zuck would be happy to take that data, and because he's worth a cool $350 billion, he'll do whatever the fuck he wants with that data, and we'll thank him by cutting his taxes.
Nobody wants to fuck with PII, platforms will blackball you in a second if they think you have sensitive data. If you haven't worked in adtech, be quiet and do even the most trivial research before spouting nonsense.
charitably, i think the choices one makes to enter that profession betray a lack of consideration for the broader good of humanity in order to profit a select few - choices that necessarily include misdirection and manipulation of actual people, and that lead me to regard behavioral advice from such folks as essentially worthless.