Discussion (85 Comments)
I think that is the biggest factor of all.
Given the sophistication of the language and the compiler technology of the day, there was no way Ada was going to run well on 1980s microcomputers. Intel built the iAPX 432, a "mainframe on a chip" with a bunch of Ada concepts baked into the hardware for performance, and it was still as slow as a dog.
And as we now know, microcomputers later ate the world, carrying along their C and assembly legacy for the better part of two decades, until they got fast enough and compiler technology got good enough that richer languages were plausible.
The third validated compiler ran on the Western Digital “Pascal MicroEngine” running the UCSD p-system with 64K memory. The MicroEngine executed the byte code from the p-system natively, which was an interesting approach.
I think you need to do more research on this subject.
Ada includes a number of critical abstractions that require either dynamic runtime code (slow runtime) or the proverbial sufficiently smart compiler (slow compile-time).
These were for good reasons, like safety and the need to define concurrent systems within the language. But they were too heavyweight for the commodity hardware of the era.
Nowadays, languages like Go, C++, Java, Rust, … have no trouble with similar abstractions because optimizers have gotten really good (particularly with inlining) and the hardware has cycles to spare.
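For illustration (a sketch of my own, with hypothetical names, not taken from the comment above): an Ada generic is exactly this kind of abstraction. An instantiation must either share one compiled body driven by run-time descriptors (slow runtime) or be specialized and inlined per instance (the slow, "sufficiently smart" compile).

    generic
       type Element is private;   -- any definite, copyable type can be plugged in
    package Stacks is
       type Stack is limited private;
       procedure Push (S : in out Stack; E : in Element);
    private
       type Element_Array is array (1 .. 100) of Element;
       type Stack is record
          Data : Element_Array;
          Top  : Natural := 0;
       end record;
    end Stacks;

    package body Stacks is
       procedure Push (S : in out Stack; E : in Element) is
       begin
          S.Top := S.Top + 1;        -- no overflow handling in this sketch
          S.Data (S.Top) := E;
       end Push;
    end Stacks;

Each "package Int_Stacks is new Stacks (Integer);" forces the compiler to make that shared-code-versus-macro-expansion tradeoff.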
In that era, the largest blocker for Ada was that it was viewed as having a lot of overhead for things that weren't generally seen as useful (safety guarantees). The reputation was that it only mattered if you were working on military stuff, etc.
It had other warts: the string handling wasn't great, which was a huge problem. It was slow, too, at a time when that mattered more (we had C and Ada in our code base). I remember the concurrency not using the OS's primitives, so the one place we used it was a pain. HP-UX had amazing quasi-real-time extensions, so we just ran a bunch of processes.
I also wish there were concrete code examples. Show me what you are talking about rather than just telling me how great it is. Put in some side-by-side comparisons!
So there was considerable borrowing from PASCAL, CLU, MODULA(-2), CSP. It's possible that the elaborate system for specifying machine representations of numbers was truly novel, but I'm not sure how much of a success that was.
There are features common to Ada and Modula, but those have been taken by both languages from Xerox Mesa.
The first version of Modula was designed with the explicit goal of making a simple small language that provided a part of the features of Xerox Mesa (including modules), after Wirth had spent a sabbatical year at Xerox.
Nowadays Modula and its descendants are better known than Mesa, because Wirth and others have written some good books about it, and because Modula-2 was briefly widely available for some microcomputers. Many decades ago, I had a pair of UV-EPROMs (i.e. for a 16-bit data bus) that contained a Modula-2 compiler for Motorola MC68000 CPUs, so I could use a computer with such a CPU for programming in Modula-2, in the same way that many early PCs could be used with their built-in BASIC interpreter. However, after switching to an IBM PC/AT-compatible PC, I never used the language again.
However, Xerox Mesa was a much superior language and its importance in the history of programming languages is much greater than that of Modula and its derivatives.
Ada has taken a few features from Pascal, but while those features were first implemented in Pascal, they had been proposed much earlier by others, e.g. the enumerated types of Pascal and Ada had been first proposed by Hoare in 1965.
When CLU is mentioned, Alphard must usually also be mentioned, as they were two quasi-simultaneous projects at different universities with the purpose of developing programming languages with abstract data types. Many features appeared first in one of these languages and were then introduced in the other after a short delay. Among the features of modern programming languages that come from CLU and Alphard are for-each loops and iterators.
I worked with JOVIAL as part of my first project as a programmer in 1981, even though we didn't even have a full JOVIAL compiler there yet (it existed elsewhere). I remember all the talk about the future being Ada, but it was only an incomplete specification at the time.
JOVIAL had been derived from IAL (December 1958), the predecessor of ALGOL 60. However JOVIAL was defined before the final version of ALGOL 60 (May 1960), so it did not incorporate a part of the changes that had occurred between IAL and ALGOL 60.
The timeline of Ada's development was marked by a series of increasingly specific documents, produced by anonymous Department of Defense employees, containing the requirements that the competing programming language designs had to satisfy:
1975-04: the STRAWMAN requirements
1975-08: the WOODENMAN requirements
1976-01: the TINMAN requirements
1977-01: the IRONMAN requirements
1977-07: the IRONMAN requirements (revised)
1978-06: the STEELMAN requirements
1979-06: "Preliminary Ada Reference Manual" (after winning the competition)
Already the STRAWMAN requirements from 1975 contained some features taken from JOVIAL, which the US Air Force used and liked, so they wanted the replacement language to keep them.
However, starting with the IRONMAN requirements, some features originally taken verbatim from JOVIAL were replaced by greatly improved original features. For example, the JOVIAL-style specification of function parameters was replaced by the requirement to specify the behavior of the parameters regardless of their implementation by the compiler: the programmer specifies modes like "in", "out" and "in out", and the compiler chooses freely how to pass the parameters, e.g. by value or by reference, depending on which method is more efficient.
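A minimal Ada sketch of the three modes (the procedure and its names are hypothetical, purely for illustration):

    procedure Update
      (Counter : in out Integer;   -- read and updated; the caller sees the change
       Step    : in     Integer;   -- read-only inside the procedure
       Status  :    out Boolean)   -- written only; no initial value is read
    is
    begin
       Counter := Counter + Step;
       Status  := Counter >= 0;
    end Update;

Whether Counter travels by value or by reference is the compiler's choice; the source states only the intended behavior.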
This is a huge improvement over how parameters are specified in languages like C or C++ and in all their descendants. The most important defects of C++, which caused low performance for several decades and which are responsible for much of the current complexity of C++, stem from the inability of C++ to distinguish between "out" parameters and "in/out" parameters. This misfeature is the reason for a lot of unnecessary things in C++: constructors as something different from normal functions, which cannot signal errors other than by exceptions; copy constructors distinct from assignment; the "move" semantics introduced in C++11 to solve the performance problems that plagued C++ previously; etc.
https://xcancel.com/Iqiipi_Essays
There is no named public author. A truly amazing productivity for such a short time period, and, generously, the author does not take any credit.
Most longform readers will assume an author has deep expertise and spent a lot of time organizing their thoughts, which lends their ideas some legitimacy and trust. For a small blog, an 8,000 word essay is a passion project.
But if AI is detected in the phrasing and not disclosed, it raises a lot of questions. Did AI write the whole thing, or did it just do light edits? Are the facts AI-generated, too, and not from personal experience? What motivated someone to produce this content if they were going to automate parts of its creation; why would they value the output more than the process?
I really enjoyed the essay; I only checked afterwards, when I started reading the comments.
I hate that I'm starting to develop a media literacy immune system for blog posts of all things.
I've written almost 50 blog posts in the last 3 years. All in draft, never published, mostly because of crippling imposter syndrome and fear of criticism. But every now and then I wake up full of confidence and think "this is it. today I'll click publish. I don't give a fuck. All in". Never happens. Maybe this author was in the same boat until a month ago. I know there's a high chance it's just a bot, but I can understand if it's not, and how devastating it has to be to overcome the fear of showing your thoughts to the world, only to be labeled a bot. If it's not already obvious, English is not my first language, and I've used LLMs to check my grammar and improve the style. Maybe all my posts smell like ChatGPT now, and that just adds to the fear of being dismissed as slop.
The main problem with this article is that it appears to have been written basically out of whole cloth by the LLM; there's no novel insight here about Ada beyond what you could fit in a short prompt plus the Wikipedia article.
"These are not positions. They are proposals — structures through which a subject might be examined rather than verdicts about it."
The entire site is AI written.
While true, that doesn't mean that other languages' sum types originated in Ada. As [1] states,
> NPL and Hope are notable for being the first languages with call-by-pattern evaluation and algebraic data types
and a modern language like Haskell has origins in Hope (from 1980) through Miranda.
[1] https://en.wikipedia.org/wiki/Hope_(programming_language)
John McCarthy, the creator of LISP, also made many major contributions to ALGOL 60 and to its successors (e.g. he introduced recursive functions in ALGOL 60, which was a major difference between ALGOL 60 and most existing languages at the time, requiring the use of a stack for the local variables, while most previous languages used only statically-allocated variables).
The "union" of McCarthy and of the languages derived from his proposal is not the "union" of the C language, which kept McCarthy's keyword but gave it the behavior of FORTRAN's "EQUIVALENCE".
The concept of "union" as proposed by McCarthy was first implemented in the language ALGOL 68; then, as you mention, some functional languages, like Hope and Miranda, used it extensively, with different syntactic variations.
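A small sketch of the discriminated-union idea as it survives in Ada's variant records (type and component names are mine, purely illustrative):

    type Shape_Kind is (Circle, Rectangle);

    --  The discriminant Kind acts as the tag; a run-time check prevents
    --  reading Width from a value whose Kind is Circle.
    type Shape (Kind : Shape_Kind := Circle) is record
       case Kind is
          when Circle =>
             Radius : Float;
          when Rectangle =>
             Width, Height : Float;
       end case;
    end record;

Unlike a C union, the tag is part of the object, and the language checks it on every component access.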
If Rust didn't have (C-style) unions, then its enum should be named union instead. But it does, so they needed a different name. As we work our way through the rough edges of Rust, maybe this will stick up more and annoy me, but given that Rust 1.95 just finally stabilized core::range::RangeInclusive, the fix for the wonky wheel that is core::ops::RangeInclusive, we're not going to get there any time soon.
Yes, it's verbose. I like verbosity; it forces clarity. Once you adjust, the code becomes easier to read, not harder. You spend less time guessing intent and more time verifying it. Or you verify it, ignore what you verified, then go back and remind yourself you're an idiot when you realize the code you ignored was right. That might just be me.
In small, purpose-built applications, it's been pleasant to code with. The type system is strict but doesn't yell at you a lot. The language encourages you to be explicit about what the program is actually doing, especially when you're working close to the hardware, which is a nice feature.
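Here is roughly what that explicitness looks like close to the hardware (a hypothetical memory-mapped status register; the layout is invented for illustration):

    type Status_Register is record
       Ready : Boolean;
       Error : Boolean;
       Count : Natural range 0 .. 63;
    end record;

    --  Record representation clause: each component is pinned to exact bits,
    --  so the type matches the hardware register layout by construction.
    for Status_Register use record
       Ready at 0 range 0 .. 0;
       Error at 0 range 1 .. 1;
       Count at 0 range 2 .. 7;
    end record;
    for Status_Register'Size use 8;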
It has quirks, like anything else. But most of them feel like the cost of writing better, safer code.
Ada doesn't try to be clever. It tries to be clear, even if it is as clear as mud.
And every time I fail.
https://ada-lang.io/ https://alire.ada.dev/
IDE: https://github.com/AdaCore/gnatstudio
Nowadays, you should be able to use anything from Linux under WSL.
In the past using Ada was more painful, because you had to use some old version of gcc, which could clash with the modern gcc used for C/C++/Fortran etc.
However, during the last few years these problems have disappeared. When you build any current GCC version, you just select Ada among the enabled languages (via the --enable-languages configure option) and everything works smoothly.
In the beginning, Ada was criticized mainly for two reasons: it was claimed to be too complex, and it was said to be too verbose.
Today, the criticism about complexity seems naive, because many later languages have become much more complex than Ada. In many cases they started as simpler languages to which extra features were added later; because the need for those features had not been anticipated during the initial language design, adding them afterwards was difficult and increased the complexity of the updated language.
The criticism about verbosity is correct, but it could easily be addressed by preserving the abstract Ada syntax and just replacing many verbose tokens with terser symbols. This can easily be done with a source preprocessor, but it is avoided in most places, because the source programs then have a non-standard appearance.
It would have been good if the Ada standard had been updated to specify a standardized abbreviated syntax besides the classic syntax. This would not have been unusual, because several old languages have specified abbreviated and non-abbreviated syntactic alternatives, including languages like IBM PL/I or ALGOL 68. Even the language C had a more verbose syntactic alternative (with trigraphs), which has almost never been used, but nonetheless all C compilers had to support both the standard syntax and its trigraph alternative.
However, the real defect of Ada has been neither complexity nor verbosity, but expensive compilers and software tools, which have ensured its replacement by the free C/C++.
The so-called complexity of Ada has always been mitigated by the fact that, besides the reference specification, Ada has always had an accompanying design rationale document. The rationale explained the reasons for the choices made when designing the language.
Such a rationale document would have been extremely useful for many other programming languages, which frequently include some obscure features whose purpose is not obvious, or which look like mistakes, even if sometimes there are good reasons for their existence.
When Ada was introduced, it was marketed as a language similar to Pascal. The reason is that at that time Pascal had become the language most frequently used for teaching programming in universities.
Fortunately the resemblances between Ada and Pascal are only superficial. In reality the Ada syntax and semantics are much more similar to earlier languages like ALGOL 68 and Xerox Mesa, which were languages far superior to Pascal.
The parent article mentions that Ada includes in the language specification the handling of concurrent tasks, instead of delegating such things to a system library (task = term used by IBM since 1964 for what now is normally called "thread", a term first used in 1966 in some Multics documents and popularized much later by the Mach operating system).
However, I do not believe that this is a valuable feature of Ada. You can indeed build any concurrent application around the Ada mechanism of task "rendezvous", but I think that this concept is a little too high-level.
It incorporates two lower-level actions (synchronization and data transfer), and for the highest efficiency it may sometimes be necessary to access those lower-level actions separately. This means that sometimes using a system library for the communication between concurrent threads may provide higher performance than the built-in Ada concurrency primitives.
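For reference, a minimal rendezvous sketch (the Demo/Buffer names are hypothetical): the entry call and the accept body together perform both the synchronization and the data transfer, which is exactly the coupling the comment above objects to.

    procedure Demo is
       task Buffer is
          entry Put (Item : in Integer);
       end Buffer;

       task body Buffer is
          Value : Integer := 0;
       begin
          loop
             select
                accept Put (Item : in Integer) do
                   Value := Item;   -- caller and task are synchronized here
                end Put;
             or
                terminate;          -- lets the task end when Demo finishes
             end select;
          end loop;
       end Buffer;
    begin
       Buffer.Put (42);  -- blocks until the task accepts the call
    end Demo;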
There are two camps regarding verbosity, and a reconciliation between them appears impossible. Therefore I think that the ideal programming language should admit two equivalent representations, to satisfy both kinds of people.
The pro-verbose camp argues that they cannot remember many different symbols, so they prefer long texts using keywords resembling a natural language.
The anti-verbose camp, to which I belong, argues that they can remember mathematical symbols and other such notation, and that for them it is much more important to see as much of the program as possible on one screen, to avoid moving back and forth through the source text.
Both camps claim that what they support is the way to make source programs easiest to read, and this must indeed be true for themselves.
So it seems that it is impossible to choose rules that can ensure the best readability for all program readers or maintainers.
My opinion is that source programs must not be stored and edited as text, but as abstract syntax trees. The program source editors and viewers should implement multiple kinds of views for the same source program, according to the taste of the user.
To be clear, the Ada Reference Manual specifically addresses all of this in its Introduction. The language was deliberately designed for readers as opposed to writers, for very good reasons, and the manual explains why. It's exactly one of the features other languages will eventually learn they need and will independently "discover" some years in the future.
I don't think you really understand what you're saying here. I have worked on an Ada compiler for the best part of a decade. It's one of the most complex languages there is, up there with C++ and C#, and probably Rust.
As a matter of general interest, what features or elements of Ada make it particularly hard to compile, or compile well? (And are there parts which look like they might be difficult to manage but aren't?)
The subset required for hardware synthesis/design cannot be unified completely with a programming language, because it needs different semantics, though the syntax can be made somewhat similar, as with VHDL, which was derived from Ada, while Verilog was derived from C. However, the subset used for simulation/verification, outside the proper hardware blocks, can be pretty much identical to a programming language.
So in principle one could have a pair of harmonized languages, one a more or less typical programming language used for verification and a dedicated hardware description language used only for synthesis.
The current state is not too far from this, because many simulators have interfaces between HDLs and some programming languages, so you can do much verification work in something like C++, instead of SystemVerilog or VHDL. For instance, using C++ for all verification tasks is possible when using Verilator to simulate the hardware blocks.
I am not aware of any simulator that would allow synthesis in VHDL coupled with writing test benches in Ada, which would be a better fit for VHDL than C++ is, but it could be done.
https://soft.vub.ac.be/amop/
CTM: https://en.wikipedia.org/wiki/Concepts,_Techniques,_and_Mode...
1: SPARK is a formally verifiable subset of Ada: https://en.wikipedia.org/wiki/SPARK_(programming_language)
2: https://arxiv.org/pdf/1805.05576
https://learn.adacore.com/courses/advanced-ada/parts/resourc...
The way Ada generally solves the same problem is by allowing much more in terms of what you can give a stack lifetime to, return from a function, and pass as parameters to functions.
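A short sketch of what that buys you (Int_Array and Make_Squares are illustrative names): the result's size is known only at run time, yet the caller can bind it to a stack-lifetime constant with no explicit heap allocation.

    type Int_Array is array (Positive range <>) of Integer;

    function Make_Squares (N : Natural) return Int_Array is
       Result : Int_Array (1 .. N);   -- size chosen at run time
    begin
       for I in Result'Range loop
          Result (I) := I * I;
       end loop;
       return Result;
    end Make_Squares;

    --  At the call site:  Squares : constant Int_Array := Make_Squares (10);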
It also has the usual "smart pointer" mechanisms that C++ and Rust have, likewise with relatively crappy ergonomics.
It highlights the often perplexing human tendency to reinvent rather than reuse. Why do we, as a species, ignore hard-won experience and instead restart, often making mistakes that could have been avoided if we'd taken the time, or had the curiosity and humility, to learn from others? This seems particularly prevalent in software: "standing on the feet of giants" is the default rather than the exception.
That aside, the article was thoroughly educational and enjoyable. I came away with much-deepened insight and admiration for those involved in researching, designing and building the language. Resolved to find and read the referenced “steelman” and language design rationale papers.
Humanity moves from individual to society, not the reverse.
Some knowledge moves from the plural to the singular, top to bottom, but the regular existential mode is bottom-up, a point TFA makes in the context of programming languages.
Children and ideas grow from babe to adult. They do not spring full grown from the brow of Zeus other than in myth.
1. I would never work on "missile tech" or other "kills people" tech.
2. I would never work on (civilian) aircraft tech, as I would probably burn out from the stress of messing something up and having an airplane crash.
That said, I'm sure it's also used for stuff that does not kill people, or does not have a high stress level.
What?
#1 JavaScript doesn't have formal types. What does it even mean by "representation"?
#2 You can just define a variable and not export it. You can't import a variable that isn't exported.
There are several little LLM hallucinations like this throughout the article. It's distracting and annoying.