Discussion (35 Comments)
There are probably good reasons for all of that, but it is just both bad DX and bad UX. It feels like you need to be a hardcore LaTeX expert or consult with one, in order to accomplish the most mundane things. Especially in a reliable way, that won’t break upon making seemingly unrelated changes, or won’t break other things itself.
I used Typst for a few weeks. It already feels much more understandable, consistent, hackable, and customizable. I guess that is the difference between an ad hoc macro system and an actually thought through programming language.
The only drawback I can see is the ecosystem being smaller and less mature. That is, however, counteracted by being able to do things on your own, without immersing yourself deeply in LaTeX for years. Also, it will improve with time.
LaTeX is great, don’t get me wrong. But its heritage and historical baggage is really dragging it down.
Posts/discussion I found interesting:
- http://www.goodmath.org/blog/2008/01/10/the-genius-of-donald...
- https://tex.stackexchange.com/q/24671
- https://news.ycombinator.com/item?id=15733381
In particular, it's interesting how people seem to think TeX itself is actually quite nice to use, but that its popularity and the LaTeX package ecosystem created a huge mess of a system.
Another part is that many people built their own solution to their own corner of this domain, and not all of them had a deep appreciation for how the rest of the TeX system works.
I hear similar complaints about "Make web page look good", which is popular but also a huge mess of a system.
It's worth noting that TeX was developed in the same time period that the details of lexical scope were being nailed down by Guy Steele in the Rabbit compiler for Scheme. It's not that TeX is an ad hoc system; it's more the case that people didn't actually know how to implement a better system at the time.
What is Tony Hoare famous for? Well, lots of things, including his very important comparison sort algorithm, Quicksort, but in this context, how about the Billion Dollar Mistake? That's a pretty nasty booboo in many programming languages, for which Tony blames himself because it was his idea.
Like your parent said, TeX shipped a long time ago and we've learned a lot since then. It is not a surprise that we know how to do better today; in fact, it would be a serious black mark for Computer Science if we couldn't.
But I think the main things it has going for it are that it produces nice output and that all the journals accept it. Does there exist a tool that renders Typst to LaTeX? That could play nicely with the existing ecosystem.
That's why people take the math subset of LaTeX and use it in other contexts - exactly like this product.
This seems like the _perfect_ use for an LLM: systematically porting over as much of the "ecosystem" to Typst as possible. Is anyone doing that?
It's like the JSX of LaTeX: markup in a programming language, not a programming language pretending to be markup.
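The analogy can be made concrete with a small sketch (the function name and styling values here are made up for illustration): in Typst, markup blocks are first-class values that ordinary functions can take and return, much like JSX elements in JavaScript.

```typst
// Hypothetical example: a reusable "badge" component.
// `box`, `fill`, `inset`, and `radius` are standard Typst,
// but the specific styling choices are arbitrary.
#let badge(label) = box(
  fill: silver,
  inset: 4pt,
  radius: 2pt,
)[#label]

// Content values compose like any other value:
#badge("Typst") and #badge("LaTeX") side by side.
```

The point is that the bracketed content is data flowing through a real language, rather than macros expanding text.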
> The only drawback I can see is the ecosystem being smaller and less mature. That is, however, counteracted by being able to do things on your own, without immersing yourself deeply in LaTeX for years. Also, it will improve with time.
Most "matches KaTeX" claims I've seen in the wild rely on screenshot eyeballing, which collapses on edge cases like spacing primes, integral subscripts, and matrix delimiters that scale.
One thing I'd be curious about: how are font fallbacks handled when the same Rust core ships to platforms with different system font availability?
KaTeX bundles fonts and assumes they load cleanly; CoreGraphics and Skia bring their own glyph caches and metrics.
Does the display list carry metric snapshots from the host text shaper, or does the core compute layout from a bundled metric file independent of the backend?
The webpage also does read like it was at least heavily LLM assisted, which makes it a bit hard to trust it.
That all said, this is definitely something I'd be interested in using for Zulip if it is indeed going to be a well-maintained open source project.
(We currently have a Node server component that the Zulip server runs only to render LaTeX.)
After a bit of tinkering and understanding the idiosyncrasies of Typst, the joy of having reliable, consistent, beautiful, data-driven resumes and cover letters is immeasurable. It basically lifted any barrier to applications, whereas whatever I had before I had always considered a burden.
On top of that, I can add hiring process data directly to the yaml file to run further analysis.
Can LaTeX do this? Most probably, but the learning curve is the difference.
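A minimal sketch of the workflow described above, assuming a hypothetical `applications.yaml` with `company`, `role`, and `date` fields (the file name and field names are my assumptions, not the commenter's actual setup):

```typst
// Load structured data with Typst's built-in YAML reader,
// then render one section per application entry.
#let jobs = yaml("applications.yaml")

#for job in jobs [
  == #job.company
  #job.role, applied #job.date
]
```

Because the same YAML file drives both the documents and any later analysis, hiring-process data stays in one place.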
I guess you should mention how much of it is WASM, right?