Discussion (56 Comments)
This is what XHTML was, and it was a complete disaster. There's a reason almost nobody serves XHTML with the application/xhtml+xml MIME type, and that reason is that getting a “parser error” (this is what browsers still do! try it!) is always worse than getting a page that 99% works.[0] I strongly believe that rejecting the robustness principle is a fatal mistake for a web-replacement project. The fact that horribly broken old sites can stay online and stay readable is a huge part of the web's value. Without that, it's not really “the web”, spiritually or otherwise.
[0] It's particularly “cool” how they simply do not work in the Internet Archive's Wayback machine. The page can be retrieved, but nobody can read it.
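To make the contrast concrete, here is a quick Python sketch; the markup string is made up, and the point is the difference in parser behavior rather than the snippet itself:

    import xml.etree.ElementTree as ET
    from html.parser import HTMLParser

    broken = "<html><body><p>Unclosed paragraph</body></html>"

    # XML parsing, i.e. what application/xhtml+xml demands: hard failure.
    try:
        ET.fromstring(broken)
    except ET.ParseError as err:
        print("XML parser error:", err)   # "mismatched tag: ..."

    # HTML parsing, i.e. what text/html gets: recover and carry on.
    class Collector(HTMLParser):
        def handle_data(self, data):
            print("HTML parser recovered text:", data)

    Collector().feed(broken)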
And it's still unambiguous. You can cringe at what some people do, but it would be strictly a taste issue rather than a technical one, as the parse would still be unambiguous. And if you think you can fix taste issues with technical specification, well, you've already lost anyhow.
If it did somehow happen that a good deal of interesting content was published using the standard, the most popular client would probably be nonconforming, ignoring the rule to not render ambiguous content.
Protocols used to be limited by technology, now they're defined by ideology.
However, I don't see it as clear-cut that this couldn't be done from the start, so that expectations are set correctly from the beginning. For example, I don't see the same problem in other formats like JPEG or PNG, where you expect the image either to work perfectly or to fail with a decoding error.
Other than implementing it and seeing how it goes, can you propose a feasible experiment to show how a new strict spec will measurably fail?
Tried it just now: took a PNG and a JPEG, opened them in a text editor, literally deleted the second half of each file, saved, and dragged them into both Firefox and Chrome. They are displayed instead of erroring out.
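For anyone who wants to repeat the experiment without a text editor, here's a rough Python equivalent; the file names are placeholders, so run it on a local copy you don't mind damaging:

    from pathlib import Path

    src = Path("photo.jpg")                  # any local PNG or JPEG
    dst = Path("photo_truncated.jpg")

    data = src.read_bytes()
    dst.write_bytes(data[: len(data) // 2])  # keep only the first half
    print(f"wrote {dst}: {len(data) // 2} of {len(data)} bytes")

    # Drag photo_truncated.jpg into Firefox or Chrome: the decodable top
    # portion typically renders rather than producing an error page.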
There is a classic article on why a minimal version of the web with features removed will fail: you removed the 80% of features that YOU think are not important. That's a classic fatal mistake.
Search the web for the different proposals for a minimal web and you will understand: each will have removed some feature its author considers bloat but which you kept in your proposal because you consider it critical. Which is why you created a new proposal in the first place; their minimal proposal is not the right one for you.
https://www.joelonsoftware.com/2001/03/23/strategy-letter-iv...
Today, when writers are using visual editors (or Markdown), few are writing their own HTML any more. A web standard requiring compliance would work differently today.
I'd say it was a minority of writers that were handcrafting XHTML. And everyone, whether handcrafting or using tools, could validate their compliance using a browser, which made it very easy to adjust your tools or your handcrafted code. We are now in a situation where there is no schema for HTML.
I, for one, am very much in favor of forking the web with a document format with a schema. It really seems like a small and simple change to me.
Now, they enable applications to exist without going through app store gateways.
A new document-only protocol aligned with the Web's original intention would be very useful simply for security reasons. I liked Gemini because, by design, a Gemini document is not executable in any way; there are no popups, plugins, or even cookies; all this is out of the box without having to manage settings, and Gemini documents are very readable without an app at all.
But replacing the modern browser, rather than being another option, will actually lock people in further than they already are: open protocols require apps, and on the primary computing device of users (phones), apps are all behind a gateway now.
It probably won't matter in a few years as the Web will likely be equally locked down soon, though.
What? You could deploy software without dealing with Microsoft back then and you still can today. Unless you meant avoiding building for Windows natively.
Most of this document reads to me like that's the problem they're trying to solve, not just Chrome's huge market share, so simply not targeting it doesn't serve their purpose.
In real democracies the populists (Facebook, TikTok, Chrome) always win, because that's what the masses want.
Is Friedrich Merz a populist? Was Angela Merkel a populist? This theory seems to have considerable limits.
What you want is to have scripting with capabilities -- preferably on top of WebAssembly (JS is a sin).
The best part is that this improves the experience of noscript users -- rather than nice graphical widgets being broken, they can just run scripts without any "network" capability -- which should not only forbid the scripts from accessing the network, but also make anything they modify "tainted" and not allowed to show up in a network call (so e.g. if they encode some data in a form, trying to later submit that form somewhere else in the app will give a warning).
Now -- most people don't care and don't want this. And that's a good thing -- capabilities put the power in the hands of the user agent, where it belongs.
More interestingly -- capabilities can be shimmed! Rather than "you are not allowed to access my GPS", it should be a first-class feature to feed the WASM a GPS stream of your choice.
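Here's a toy Python sketch of that shimming idea; all the names are made up, and a real system would enforce this at the WASM sandbox boundary rather than in-language:

    from typing import Iterator, Protocol

    class GpsCapability(Protocol):
        def readings(self) -> Iterator[tuple[float, float]]: ...

    class ShimGps:
        """Feed the app a location stream of the user's choosing."""
        def __init__(self, fixed: tuple[float, float]):
            self.fixed = fixed
        def readings(self):
            while True:
                yield self.fixed

    def untrusted_app(gps: GpsCapability) -> None:
        # The app can only use what it was handed; it has no ambient
        # authority to reach the real sensor or the network.
        lat, lon = next(gps.readings())
        print(f"app sees position: {lat}, {lon}")

    # The user agent, not the site, picks the capability:
    untrusted_app(ShimGps((51.4779, -0.0015)))  # pretend we're at Greenwich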
it's as if nothing was learned from the XHTML debacle
Then HTML5 came along, providing all kinds of shiny goodies and saying not to bother with the tags. In the end, a more rigid standard would have been nice... (Though this is mostly about the skin-deep part of the standards.)
It failed because the smallest error, caught by the client after the fact, was like a server crash. Plus it would have created a mild barrier to entry when learning HTML at all.
and what new capabilities does this new proposal provide?
You can certainly make something with it, but I can't imagine most people finding a use for it.
The modern Internet is 45% appearances and 50% search-traffic optimization. For better or worse, we lost all usable registries of websites; we lost websites unconcerned with appearance and traffic. The information-focused Web is pretty much dead.
Maybe these ideas did not scale and did not monetize that well, but we will never really know what an information-focused version of the Internet would have looked like, because evolution took it elsewhere. Unless we try building another one with different principles and limitations at the core.
Perhaps what's needed is an alternative search engine. Assert that you will only index a site that meets some strict set of limits. If that's what people want, they will use that engine. If it's popular, sites will have to find ways to get listed, e.g. a "simple.amazon.com" which supports that standard.
For me, the information-sharing part of the internet now is the shadow libraries. I can get access to all (well, still not quite all) journals and university-press publications from the last century? Awesome. Vastly more informative than some blogger who nowadays is probably trying to monetize my attention.
Even so, those who want to share and access information can already do that via the Web. Nobody has to use scripting. Nobody has to use The Google as their search. Nobody has to rely on an LLM. If there is demand for simple webpages that are free of scripting, they can be built and shared today. Because of this, the proposal comes off as very out of touch and deep within the HN bubble. Strict grammar for declaring documents is merely a fetish: if there's no scripting, there's no reason for a document to break anyway.
> A published version of the standard NEVER, EVER, EVER, EVER changes.
WHATWG does have per-commit snapshots of the standard. They're just not semantically versioned, because it is a living standard.
I think what the author wants is something like Gemini instead of HTML, but that has its own set of problems. My plea for Dillo would be to instead just support a text/markdown mime-type natively and we can try for adoption in more browsers.
> The objective is not to create a feature-by-feature clone of the Web, but to create an specification that allows humans to exchange knowledge, notes, and other forms of information without the imposed requirement of having to run a full blown VM to read it.
Markdown in browsers fits your objective! The only gotcha is CommonMark extensions, and those can work with sub-type declarations in the MIME type.
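As a minimal sketch of the server side of that idea (standard library only; the port and the extension mapping are arbitrary choices, and it assumes a browser that actually renders text/markdown):

    from http.server import HTTPServer, SimpleHTTPRequestHandler

    class MarkdownHandler(SimpleHTTPRequestHandler):
        # Map .md so files are served as text/markdown instead of a
        # generic octet-stream download.
        extensions_map = {**SimpleHTTPRequestHandler.extensions_map,
                          ".md": "text/markdown"}

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), MarkdownHandler).serve_forever()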
Maybe I'm just stupid, but I don't really know what the author is talking about here. What parts of the standard? HTTP? HTML? DOM APIs? What?
"as soon as a monopolistic entity can build a mechanism to extract revenue from it, there will be an incentive to capture the standard and change it for their own benefit"
Personally I'd love a simple, semantically versioned subset of the web. The required traction and buy-in from existing key players (browser vendors, web hosting platforms, etc.) makes it largely a non-starter, though. I'd love to be wrong.
Instead of "forking", it may be more prudent to extend or revive something more like Gopher, so you don't constantly get barraged by incompatible sites (like you would in a forked web).
While HTML serves its purpose, especially for documents, the modern web is a giant mess of that legacy, combined with unfriendly ergonomics and glue/hacks bolted on just so we as developers can have better DX for creating complex software on top of it.
Building a browser means having to deal with all that legacy, whether we like it or not, so most of the browser market got captured by the big players who have enough manpower to cover all those edge cases. That also means we have to deal with whatever technical choices or bloat they make, causing an infinite stream of issues, from memory usage, to size, to limitations that don't make sense in 2026 but are still there because someone 20 years ago decided to write them like that. As I deal with mobile webviews a lot in my daily work, I have unfortunately had to get familiar with quite a few gotchas and edge cases, and some are just... absurd in this day and age.
I believe we need a separation between an application layer and the document layer, and especially between the UI language and the actual application code - script tags serve their purpose, but again, they are a hacky solution with its own bag of tricks, and those tricks impact all of the software built upon it.
Now, a bit of a shameless plug: I've been working on something to fill that gap, at least for myself and hopefully for others who encounter the same issue. It's called Hypen (https://hypen.space), a DSL for building apps that work natively on all platforms, with strict separation of code/UI/state, and support for as many languages and platforms as I can maintain, not "just javascript". While it's currently focused on streaming UI, it's built with Rust and WASM at its core and will soon allow fully compilable apps.
While it may not be the future of software, once you get into building something like that, it becomes obvious that the way we are building now is at best wrong, and at worst kafkaesque.
Edit: actually it looks like w3m was ‘95 and Dillo was ‘99.
> If you are under 13 years of age, you are not authorized to register to use the Site.
(By the way, are you aware that the largest bakery company in the US is named “Bimbo”? Tee hee! You should tell them to change their name!)
Gemini protocol?
> No scripting
How will it be possible to go back? The average e-commerce presence usually relies heavily on JS; I haven't checked in a long time whether any relevant sites work without it. I think going back to more basic approaches could even improve user experience, as many usage patterns would probably converge and simply look and function as intended. But the whole web world is so fixated on solving everything with JS that this seems like targeting the highest-resistance target you can find. Don't get me wrong: I hate this situation, and we must not have a single language that dominates everything.
I also don't believe that enthusiasts will create a significant shift. They can surely provide the fundamentals, but if there isn't a huge mainstream impact, it will not change anything.
The standards that make my life miserable at times are the secondary standards like GDPR and WCAG as well as the de facto "standard" systems we are forced to participate in such as Cloudflare, the advertising economy, etc.
It's easy to say "WebUSB is bloat" and I'd certainly say PWA is something that could only come out of the mind that brought us Kubernetes, but lately I've been building biosignals applications and what should my choice be: write fragile GUI applications for the desktop that look like they came out of a lab and crash from memory leaks or spend 1/5 the time to make web applications that look like they belong in the cockpit of a Gundam and "just work"?
How so? PWAs are awesome! Democratizing for users. Democratizing for developers. They work well for the right class of apps. They would go much further if there weren't forces actively resisting them. Think of all the Electron-type apps out there. Now imagine if the average Joe could just install them from the web with 2 clicks.
(Regular ole bookmarks get you a decent percent of the way but clearly something extra than that was needed.)
It would be great to differentiate between "static" and "dynamic" pages based upon scripting, IMO.
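A crude sketch of how a client or index could make that distinction automatically; this is a heuristic only (it just looks for script elements and ignores inline event handlers), and the URL is a placeholder:

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class ScriptDetector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.has_script = False
        def handle_starttag(self, tag, attrs):
            if tag == "script":
                self.has_script = True

    def classify(url: str) -> str:
        page = urlopen(url).read().decode("utf-8", errors="replace")
        det = ScriptDetector()
        det.feed(page)
        return "dynamic" if det.has_script else "static"

    print(classify("https://example.com"))  # no scripts there -> "static"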
oh and also https://xkcd.com/927/