Discussion (51 Comments)
Is adding advertisements an algorithm?
Is including likes an algorithm?
Is automatically starting the next video after a previous one has finished an algorithm?
Is infinite scroll an algorithm?
Etc
I'm not saying there aren't infinite edge cases and second-order effects - but we tolerate those already for many things. I'm not pretending this is simple or even desirable - I'm merely stating it's possible if we want to do it.
My biggest fear is that (like the UK Online safety act) this acts to favour the huge corporations because they are the only ones that can afford a team of lawyers. Any legislation should aim to carve out exceptions to avoid indirectly helping monopolies.
Just look at the malicious compliance that Apple and Google have around the App Store stuff, they’ll find a way to comply with the law and implement different addictive dark patterns.
I’m not saying that I disagree that these companies need to be regulated, I absolutely do. I just think it’s going to be a complicated process, and not “oh just ban everything that’s an algorithm”.
And I have absolutely 0 faith in companies like Meta willfully complying.
Does anyone know where it’s coming from? I can certainly believe that incompetent jurisdictions have a ton of issues with people misapplying the law and using loopholes.
The moment you add other entities to the list (e.g. ads in between posts), they're also subject to the same restrictions.
And then we’ll end up with another cookie-banner style law which had good intentions but actually missed the point entirely.
I suppose the answer could be that only platforms that do indeed allow spam or worse are impartial, but that is a tricky position to be in.
A lot of adults need this too. The addictive apps are very well designed, while most blockers are either too easy to ignore or too annoying to keep using.
I built a small iOS blocker because I had the same problem. Making it strict enough to actually work without making people hate it is the main challenge.
HN having pages instead of a feed or endless list is one of the things I really like about it.
The other thing I really love about HN is that titles are all supposed to be boring and to the point. The guidelines[1] for titles are excellent, and I wish more of the web, and honestly legacy media too, would behave that way: things that are of no interest to me aren't trying to waste my time and attention.
[1] https://news.ycombinator.com/newsguidelines.html
The actual point is that they are designed to be addictive. "endless scrolling" is just an implementation detail. If you "ban endless scrolling", they'll still be using every other trick to make it addictive.
They are bad for everyone and if you’re willing to regulate them, make them illegal to be used on anyone.
It just says that platforms which use such methods often target kids.
Don’t get me wrong, if I had my way TikTok wouldn’t exist for anyone, adults included. It’s just so strange to me that so many parents hand their 7 year olds unrestricted access to TikTok and expect someone else to keep their kid safe.
I read a post from someone saying his wife worked for a snack company. They used MRI scans to see how much salt (or sugar) the snacks should have to maximize the response in the brain. Sounds disturbing, right?
Well engagement engines are the same thing. It's artificial intelligence optimized to get people to react and stay addicted. Basically AI doing harm. It's not what is best for the individual in terms of health. It's what generates most money to the owner of the platform.
Building a business around something that exploits human brains should not be allowed. It's basically biohacking our brains for profit.
In contrast, in Western Europe, my son is now in the sixth grade. More than half his class doesn't have phones, phones are absolutely forbidden on school grounds and at school activities, and they are now doing a class trip where they were told there's a pay phone at the hotel in case they want to call their parents. Our son promptly informed us that he'd rather buy a pack of Pokémon cards than call us, and that three days isn't that long anyway.
And it is not only at school: he travels for tournaments with his team every other week, and mobile phones are absolutely forbidden on the team bus. The children read, play games (including chess on a magnetic board), sing, and swap stories for hours at a time.
Personally, I think some parents are afraid of their children growing to resent them for infringing upon their "freedom" by keeping them away from the dangers that social media and other technologies present.
I agree with you, but only in theory, because that's where we are now and it does not seem to work that well.
Maybe through more education? But then again, I think reducing addictive tactics like endless scrolling could be part of a two-pronged attack.
With alcohol we have education on what happens, but we also have laws that regulate it.
Like adults spending their hours scrolling through infinite feed is somehow beneficial to the society?
I have a hard time understanding this.
We have plenty of adults with terrible social media addictions that are destroying their lives, and nothing is being done about it.
They have to restore interop with noscript/basic HTML web engines (past, present, and future).
Then, they have to be careful with their file formats: for instance, you never give carte blanche to a disgusting format like PDF; you very carefully define an as-simple-as-possible subset of it (with some internal software for validation).
I'm very happy they're taking a stance. I've seen too many messed up kids and there's no doubt the addictive design plays a big role in the problem.
Look at age verification: it's very easy and very safe to say "nobody sane would think it is a good idea to force people to show their ID to every website they want to access; it will obviously leak the IDs, and that is very bad!". While that is not wrong, it is manipulative: showing your ID to every website is not the only way to implement age verification. In fact, technology already exists that would allow age verification in a privacy-preserving manner: a service that already has access to your ID can give you a token that proves your age, and you can then use this token to access a website. The service cannot know where you use the token, the website cannot know your ID, and they cannot collude.
So the constructive debate around age verification is this: assuming we implement it properly (i.e. in a privacy-preserving manner), is that something that we want or not? Does it solve a problem, or at least does it help?
But we cannot ever reach that level of debate, because nobody can be arsed to get informed about it.
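One way to realize the unlinkable token the commenter describes is a blind signature: the ID-checking issuer signs a blinded token, so it can verify your age once but cannot link the resulting token to the site where you spend it. The sketch below is a toy illustration of that idea only; the RSA parameters are tiny, there is no padding or double-spend bookkeeping, and a real deployment would use a vetted cryptographic library and a standardized protocol.

```python
# Toy blind-signature age token (illustrative only, NOT real crypto).
import secrets
from math import gcd

# Issuer's toy RSA key. Real systems use vetted libraries and large keys.
p, q = 104729, 1299709            # small primes, far too small for real use
n, e = p * q, 65537
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)               # issuer's private signing exponent

# User: create a random age token and blind it before showing the issuer.
token = secrets.randbelow(n - 2) + 2
while True:
    r = secrets.randbelow(n - 2) + 2   # blinding factor, invertible mod n
    if gcd(r, n) == 1:
        break
blinded = (token * pow(r, e, n)) % n   # the issuer only ever sees this

# Issuer: checks the user's real ID, then signs the blinded value.
blind_sig = pow(blinded, d, n)

# User: unblind, yielding a valid issuer signature on the original token.
sig = (blind_sig * pow(r, -1, n)) % n

# Website: verify the issuer's signature. It learns nothing about the ID,
# and the issuer cannot tell which signed token was spent where.
assert pow(sig, e, n) == token
print("age token accepted")
```

The key property is that `blind_sig` is computed on a value the issuer cannot connect to `token`, yet unblinding produces a signature that verifies against the issuer's public key, which is exactly the "the service cannot know where you use the token" guarantee described above.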
Also, nobody voted for the Commission.
And some people see tech companies as worthy of worship, so trying to restrict them is a kind of blasphemy.