Discussion (55 Comments)

jzebedee (about 3 hours ago)
Project description:

  dav2d is the fastest AV2 decoder on all platforms :)
  Targeted to be small, portable and very fast.
If you're out of the loop like me:

  AV2 is the next-generation video coding specification from the Alliance for Open Media (AOMedia). Building on the foundation of AV1, AV2 is engineered to provide superior compression efficiency, enabling high-quality video delivery at significantly lower bitrates. It is optimized for the evolving demands of streaming, broadcasting, and real-time video conferencing. 
- from https://av2.aomedia.org/
Telaneo (about 2 hours ago)
They've done the same thing with AV1, and I can't see that having prevented adoption, nor can I imagine Sisvel wanting to poke the bear that is AOMedia unless they're certain their case is absolutely watertight.
walrus01 (about 2 hours ago)
I see zero public evidence that they've filed any lawsuits against the members of AOM in any jurisdiction. I'm sure there's been a lot of threatening letters sent...
ronsor (about 1 hour ago)
This is a thinly veiled extortion racket and any competent system would fine them into bankruptcy.
mort96 (13 minutes ago)
We need a more efficient way to eliminate bullshit patents or bullshit patent infringement claims than "violate them then spend millions on lawyers to fight them in court".
walrus01 (about 2 hours ago)
Sisvel is a patent troll. Take a look at the combined list of all the companies that make up the AOM and tell me with a straight face that all of their corporate in-house counsel specializing in intellectual property law are wrong.
asveikau (about 1 hour ago)
I don't know this stuff super well but I imagine it's not necessarily about the lawyers being right or wrong so much as what they can convince people of. The ideal scenario for the patent troll is they can intimidate you into licensing with them. Another good outcome for them (though more costly) is they can convince some non-expert in court. In either case the big players behind the codec can defend themselves but a small one just picking it up downstream as OSS can't.
tensor (about 3 hours ago)
Not on topic, but wow the internet has very quickly devolved into: click -> "making sure you're not a bot", click -> "making sure you're a human", click -> "COOKIES COOKIES COOKIES", click -> "cloudflare something something"
thresh (about 2 hours ago)
We had to set it up on the parts of VideoLAN infra so the service would remain usable.

Otherwise it was under a constant DDoS by the AI bots.

nijave (about 1 hour ago)
While I do sympathize with the AI DDoS situation, it'd be nice if there were a solution that allows them to work so they can pull official docs.

For instance: MCP, static sites that are easy to scale, or a cache in front of a dynamic site engine.

thresh (18 minutes ago)
Of course, static websites are the best solution to that problem.

Our documentation and main website are not fronted by this protection, so they're still accessible to the scrapers.

hectormalot (about 1 hour ago)
Maybe I’m naive about this, but I didn’t expect AI scrapers to be that big of a load? I mean, it’s not like they need to scrape the same site at 1000+ QPS, and even then I wouldn’t expect them to download all the media and images either?

What am I missing that explains the gap between this and “constant DDoS” of the site?

thresh (22 minutes ago)
You can't really cache the dynamic content produced by forges like GitLab and, say, web forums like phpBB. So every request goes through the slow path. Media/JS is of course cached on the edge, so it's not an issue.

Even when the amount of AI requests isn't that high - generally it's in the hundreds per second, tops, for our services combined - that's still a load that causes issues for legitimate users/developers. We've seen it grow from somewhat reasonable to pretty much being 99% of the responses we serve.

Can it be solved by throwing more hardware at the problem? Sure. But it's not sustainable, and the reasonable approach in our case is to filter off the parasitic traffic.

nijave (38 minutes ago)
I think there are a few things at play here:

- AI scrapers will pull a bunch of docs from many sites in parallel (so instead of a human request where someone picks a single Google result, it hits a bunch of sites)

- AI will crawl the site looking for the correct answer which may hit a handful of pages

- AI sends requests in quick succession (big bursts instead of small trickle over longer time)

- Personal assistants may crawl the site repeatedly scraping everything (we saw a fair bit of this at work, they announced themselves with user agents)

- At work (b2b SaaS webapp) we also found that the personal assistant variety tended to hammer really computationally expensive data export and reporting endpoints generally without filters. While our app technically supported it, it was very inorganic traffic

That said, I don't think the solution is blanket blocks. Really, it's exposing that sites are poorly optimized for emerging technology.

Y-bar (40 minutes ago)
They are a scourge: they never rate-limit themselves, there are a hundred of them, and a significant number don’t respect robots.txt. Many of them also end up on our meta:no-index,no-follow search pages, leading to cost overruns on our Algolia usage. We spend far more time adjusting WAF and other bot controls than we should.
nerdralph (about 1 hour ago)
I highly doubt there is no other technically feasible option to block the AI bots. You end up blocking not just bots, but many humans too. When I clicked on the link and the bot block came up, I just clicked back. I think HN posts should have warnings when the site blocks you from seeing it until you somehow, maybe, prove you are human.
goobatrooba (about 1 hour ago)
I'm sure there are many solutions for many problems, but expecting a small Foss development team to know or implement them all is rather unreasonable.

I think the world gains more if the VideoLAN team focuses on their amazing, free contribution to the world than if they spend the same time trying to figure out how to save you two clicks.

We all hate that this is happening, but you don't need to attack everyone that is unfortunately caught up in it.

overfeed (about 1 hour ago)
> I highly doubt there is no other technically feasible option to block the AI bots.

If you have discovered such an option, you could get very wealthy: minimizing friction for humans in e-commerce is valuable. If you're a drive-by critic not vested in the project, then yours is an instance of talk being cheap.

thresh (about 1 hour ago)
I'm all ears on how we can fix it otherwise.

Keep in mind that those kinds of services:

- should not be MITMed by CDNs

- are generally run by volunteers with zero budget, money- and time-wise

notenlish (about 1 hour ago)
Nearly every single website I'm not logged into these days wants me to "confirm I'm not a bot".

It is incredibly annoying, but what can you do? AI scrapers ruined the web.

port11 (about 3 hours ago)
The internet is such a Tragedy of the Commons… its citizens who act selfishly and in bad faith will slowly make it unusable.
codedokode (about 2 hours ago)
No, it is because citizens allow themselves to be treated like this.
esseph (about 2 hours ago)
> its citizens that act selfishly and in bad faith will slowly make it unusable

It's rarely been the citizens that have been the problem, but the governments and companies that seek to use the network connection for their overwhelming benefit.

Re (above):

> Not on topic, but wow the internet has very quickly devolved into: click -> "making sure you're not a bot", click -> "making sure you're a human", click -> "COOKIES COOKIES COOKIES", click -> "cloudflare something something"

fastball (about 1 hour ago)
wat. The protections in place that the OP is talking about are almost entirely due to (not government and company) bad actors.
honktime (about 2 hours ago)
It's pretty explicitly not a tragedy of the commons. It's a tragedy of the ruling class abusing the resources of the 'commons' to extract value. There is nothing 'commons' about trillion-dollar companies extracting all available value from the labor of the working class. That's just the tragedy that'll bring about the death of society, the same tragedy that brings all other tragedies.
throw-the-towel (about 2 hours ago)
The commons in question is the internet itself.
amusingimpala75 (about 2 hours ago)
Thank you for describing the tragedy of the commons
dyauspitr (about 2 hours ago)
There’s definitely lots of problems with the ruling class and wealth disparity. Perhaps the defining problems of our current age.

That being said, so many of the plebs suck. Like 2% will ruin everything for everyone.

pixelpoet (44 minutes ago)
No one's even clicking anymore, everything implores me to tap or swipe these days, and everything is optimised for humans with one eye above the other.

Then I press the X to close the all-caps banner commanding me to install the app, upon which I get sent to the app store. Users of the website refer to it as an app.

tomwheeler (25 minutes ago)
At least this one was significantly faster than Cloudflare and required no action on my part.
rayiner (about 2 hours ago)
Wow I’m glad it’s not just me. I thought my IP block had gotten caught up in some known spamming or something.
tosti (about 3 hours ago)
I get exactly none of that. Is your adblocker still working?
oybng (about 2 hours ago)
renders your gigabit connection pointless
kylec (about 1 hour ago)
I wonder if the author is a Dave2D fan?

https://www.youtube.com/@Dave2D

7bees (about 1 hour ago)
I think it's an increment on this: https://www.videolan.org/projects/dav1d.html
foo-bar-baz529 (about 1 hour ago)
They’re more of a D4vd fan
Telaneo (about 2 hours ago)
Glorious. Really looking forward to seeing how much better than AV1 it actually turns out to be. It's a shame it'll take a while before we'll have a decent encoder (it took an annoyingly long time until SVT-AV1 was usable).
Zopieux (about 3 hours ago)
>video decoder implementation

>look inside

>it's C

tux3 (about 2 hours ago)
Not just C, dav1d and dav2d are actually mostly written in ASM! Then there's a bit of C as the glue or for functions that don't have optimized ASM yet.

Since dav2d is newer it has a higher fraction of C, but not enough for it to be the main language in the codebase :)

Almondsetat (about 2 hours ago)
What are you even implying?
notenlish (about 1 hour ago)
I think they mean that video decoders and encoders tend to have custom assembly code for speedup.
Almondsetat (19 minutes ago)
And? It's common knowledge that the "reference" or "research" version of any codec is always quite high-level, to get development going and actually produce a working bitstream.
IshKebab (about 1 hour ago)
That codecs should be written in safer languages given that they usually process untrusted files. There have been a number of serious hacks from file parsing bugs due to them being written in unsafe languages.

There's literally a DSL designed for this purpose (Wuffs) so it would be interesting to hear why they didn't use it.

brigade (about 1 hour ago)
There's an order of magnitude difference in speed requirements between file format parsing and image decoding, then another order of magnitude difference to video decoding. Even rav1d reuses dav1d's assembly (most of the actual runtime) to approach its speed.
sergiotapia (about 1 hour ago)
muh rust muh safety muh Safe Code
sylware (about 2 hours ago)
I would even remove the C code and lower the usage of the assembler pre-processor to a basic C pre-processor.

Happy that AV2 decoding is already here.

:)

dcsommer (about 2 hours ago)
We must not continue to develop media codecs in memory unsafe languages. Small, auditable sections can opt-out perhaps, but choosing default-unsafe for this type of software is close to professional negligence.
fguerraz (about 2 hours ago)
Cryptography and video codecs are notable exceptions; they put a lot of effort into making the code provably memory safe: no recursion, limited use of stack variables, no dynamic allocations, etc. As a result, memory-safe languages bring nothing but trouble by making it non-deterministic; that's especially true for crypto, where compiler “optimisations” guarantee you side-channel attacks.
WhatIsDukkha (14 minutes ago)
Thank you for mentioning this.

I wonder, if Rust had an effects system, whether a Jasmin MIR transform (i.e. like SPIR-V is for shaders) would be useful?

https://github.com/jasmin-lang/jasmin

kllrnohj (34 minutes ago)
For the codec itself, the majority of it is performance sensitive and often has a significant amount of assembly even, so a memory safe language doesn't change much.

However, for the container/extractor... those should absolutely be in a memory-safe language, and that's where a lot of the exploits/crashes are, too, as metadata is fuzzier.

As a practical example of this see something like CrabbyAVIF. All the parser code is rust, but it delegates to dav1d for the actual codec portion

fishgoesblub (about 2 hours ago)
Of the 3 software AV1 encoders, the only one that is fully dead is the Rust encoder (rav1e). If people truly wanted memory safe encoders/decoders, they would fund and develop them.
vlovich123 (about 1 hour ago)
Fully dead in what sense? Seems like it still has active development to me.
fishgoesblub (about 1 hour ago)
It hasn't had any proper quality/speed improvements in years. Only thing that has changed is updating deps and some bug fixes.
esseph (about 2 hours ago)
> If people truly wanted memory safe encoders/decoders

Really? How many codecs have your neighbors contributed money for the development of, just curious.

computerbuster (27 minutes ago)
I think these conversations are directed by the parties funding the efforts. Example: "we (large company) want a fast AV2 decoder" -> they pay a specialized team to do it -> this team works in C for the most part, so it is done in C. If there were financial incentives to do it in Rust, they'd pay more for a Rust decoder.
Telaneo (about 1 hour ago)
Given Netflix's involvement with SVT-AV1, not even that indirectly, at least one.
maxloh (22 minutes ago)
Decoders written in Rust will be a lot slower than the equivalents in assembly.