Discussion (53 Comments)
My new uni completely rejected my previous uni's Intro CS curriculum. They forced me to retake their version: A Lisp-then-Java cycle that was a theoretical whiplash, which initially I resented. Who wants to re-take a class for topics they "already" learned?
By my second year, I saw the light. I understood the approach.
While my highschool friends at other unis had courses to learn {{topic}} and separate courses to learn {{language}}, the uni I was at had a policy of "languages are incidental" for most courses. Professors had a list of the languages that they would accept for projects. "Anything but Perl" was common, although "systems" courses required C. For every class with a required language, it was not taught during class. There would be an optional lab that you could attend to learn at least one of the languages allowed for the course. The expectation was that you could learn enough in two or three weeks before the first project kicked off, and the TAs really were there to help you figure it out (most courses had 20-30 students for 2-3 TAs).
The "bookend" intro courses that taught extreme FP and extreme OOP served as a forcing function that made me realize "Oh, languages really don't matter. Just pick up the details for the paradigm, and get on with the work."
As much as I resented having to "redo" an intro course cycle at first, I am forever grateful for the fearlessness that the overall experience instilled in me.
We did have the obligatory "learn language X" course, which used to be C and had just switched to Java at our uni. But it was just to learn a language.
The Operating System class used Minix in a VM to teach concepts. You were expected to just use Minix's vi (that was fun!) or somehow figure out yourself how to use something outside the VM while still keeping a fast enough feedback loop. Nobody taught us C (we were already a Java-era year); you were just expected to figure it out. Quite a bunch of people couldn't and literally quit CS entirely over that course. I loved it! We had to add a memory compactification algorithm to Minix to pass the course. That was a fun learning experience! And it made me appreciate modern (well, I guess this was like 25 years ago now, lol!) operating systems.
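A memory compactification pass like the one in that assignment can be sketched in miniature. The real assignment was kernel C inside Minix; this toy Python model (the block names, sizes, and heap layout are all invented) only shows the core idea: slide live blocks toward the start and merge all the holes into one.

```python
# Toy model of memory compaction: the "heap" is a list of (owner, size)
# blocks, where owner=None marks a free hole. Compaction slides every
# allocated block toward the start and coalesces the holes into a single
# one at the end.

def compact(heap):
    allocated = [block for block in heap if block[0] is not None]
    free = sum(size for owner, size in heap if owner is None)
    # All live blocks first, then one merged hole (if any space was free).
    return allocated + ([(None, free)] if free else [])

heap = [("A", 4), (None, 2), ("B", 3), (None, 5), ("C", 1)]
print(compact(heap))  # [('A', 4), ('B', 3), ('C', 1), (None, 7)]
```

The real kernel version additionally has to fix up every pointer into the regions it moves, which is what makes the assignment genuinely hard.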
Scripting languages course: pick one, do a project, and turn it in. That was it. (I turned in something I had written in Perl for a paid side job and got an A without doing any coursework at all, except for the two initial attendance-checked classes.)
I would probably be the one to choose x86 Asm or APL... or even a mix of the two.
True. The best programming books also reflect this ethos. Some people think Introduction to Functional Programming (1989) by Bird & Wadler is the best programming book ever.
The language-specific details (Miranda) are limited to a three-page appendix! The rest of the book is language agnostic, provided that your target language is functional.
Because of the era, I was able to be the lead engineer (as a freelance consultant) at a number of startup companies which tragically no longer exist (for reasons of product-market-fit or product-channel-fit). Eventually I started my own, which is a B2B2C marketplace, a blend of ecommerce and event bookings. So the largest project is "a whole darn company that has survived a decade", including a near-death experience from COVID. From a technical perspective, it is not a very interesting project ;) People at Shopify or Eventbrite have seen more technical complexity in both spaces than I have.
While we use AI for many internal processes, we aren't doing anything particularly cutting-edge in the product itself. It's mostly about brand strategy, market alignment, and customer experience. My day-to-day is sometimes technical, but mostly management. And that's the trap, yeah? You get good at something you enjoy, and then you find out that there's more money in doing something different, helping other people do the thing you used to enjoy.
I apologize if that is not a satisfying answer. I am grateful for the very unique university experience I lucked into, but I am no savant.
It always struck me as strange that universities never had a course built around open-source code. As in: grab the repo of a popular open-source project, read part of it, and do your best to make a contribution to it.
The lectures should be about different open source projects and their design choices.
Like, literally: the error message contains the string "abdc-1234-something-whatever". They can barely figure out that they should search the codebase for that error message. Unfortunately they can't find the full string. Now they're stuck and can't think of anything else to try.
So, effing, amazing. How do these people ever get through (coding) life? Ever heard of substring search? Error messages frequently have parts that are concatenated or have variables inserted, so search for parts of the string until you find something. It's not that hard: yes, the "1234" is probably some dynamic id, so search for just "something-whatever" and you'll instantly find the relevant code and can debug further.
But no, this "Senior" can't think of anything when not finding the full string anywhere in the codebase and would rather throw up their hands and let others figure it out.
Either a really dumb "Senior" that somehow got through so far at previous companies or they're silent quitting during probation period already.
If this continues it's not gonna be silent.
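The substring tactic described above is a few lines in any language. As a hedged sketch in Python (the directory name and error message are hypothetical): strip the dynamic parts of the error and grep the source tree for the static fragment.

```python
import os

def find_fragment(root, fragment):
    """Return (path, line_no, line) for every source line containing fragment."""
    hits = []
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    for lineno, line in enumerate(f, 1):
                        if fragment in line:
                            hits.append((path, lineno, line.rstrip()))
            except OSError:
                continue  # unreadable file; skip it
    return hits

# The full message "abdc-1234-something-whatever" won't match, because the
# "1234" part is a dynamic id, but the static tail will:
# find_fragment("src/", "something-whatever")
```

In practice `grep -rF "something-whatever" src/` does the same job; the point is to search for the stable fragment, not the full rendered message.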
I could see LLMs affecting that, though. Their ability to output shitty yet somewhat functional skeletons for you to refine manually is just spot on.
There are problems like: do you teach Python or C? It sounds silly, but the difference isn't really about languages; it's about how much you teach about systems. Teaching Python gets people going and lets them produce faster, which does help students get less discouraged. But teaching C forces learning about the computer system and enables students to dive deeper and teach themselves the many subtopics that no four-year program can cover.
What I think is generally missing and would be good to implement is code review and teaching how to understand a large existing codebase (all that grep, find, profilers, traces, tags, and all that jazz). This often gets taught in parallel (e.g. have students review each others code) but it's hit or miss, a lot of extra work, and not everyone does it.
Here's the shitty part: I was often told by peers and people higher up, "don't look at students' code, just look at the output and run tests." I always ignored that advice, because it is why we're failing so many students. But I also understand it, because professors are overburdened. There's too much work to do, and teaching isn't even half the job. And with every new administrator or "office assistant" they hire, you get more work (seriously, it'll take days to book a flight because you have to use some code, but it takes two days for someone to tell you the code and five more to tell you it was the wrong code, and it's clearly your fault because you clicked on "book flight" and not trips > booking > flights > schedules > trips > access code > flights > search available flights). Honestly, I think all this LLM agent stuff would sound silly if people actually knew how to design things...
Learning to program without knowing the language is useless and counter-productive.
Of course, this doesn't mean you have to learn 10+ languages first... but you have to learn a real programming language (not a toy one) before you can learn to program.
Edit: * a language
Which language is the language? A competent programmer can think about programming and reason about programs written in most languages without having to know that particular language intimately (with some exceptions that push outside the normal algorithmic language notation of the Fortran, C, Java, JS, Common Lisp, Rust, Go, etc. family of languages; but those are minority languages and a competent programmer shouldn't need more than a short period of time to become literate, if not expressive, in it).
> A competent programmer can think about programming and reason about programs written in most languages without having to know that particular language intimately
That's because the programmer already learned how to program.
But when they started, they certainly didn't spend months or years writing only pseudocode that couldn't be run to see the results.
GT started students that way and it worked well for years. A full semester (number varied, but was the CS 101 course, 1301/1311/1501 or something like that), taught with only pseudocode. They got rid of it because of appearances, trying to be like every other school out there. Eventually settling on Python, I think, after a brief stint with Scheme (which ended after a major cheating scandal).
You need to learn to leetcode in pseudocode first.
I never see anyone learning to program using pseudocode (which isn't runnable to get feedback).
If they used pseudocode, did they just run the program in their heads?
In Software Tools, Brian Kernighan and P.J. Plauger describe a pseudo-language called RATFOR (Rational Fortran), and then throughout the book implement RATFOR in itself.
Getting feedback while learning to program has a lot of value, but so does learning to think through code in your head. People old enough to remember when you had to wait a day to run your program and get results back (very slow turnaround) know the value of that skill, we used to call it "desk checking" -- reading through your code and running it in your head and on paper.
This is itself a skill people need to learn, that I'm not sure is possible with pseudocode and no prior experience. Too easy to gloss over details without actually running it to learn where your blind spots are.
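Desk checking can be shown in miniature: trace a loop by hand first, then run it to confirm the trace. (The loop below is just an invented example.)

```python
# Hand trace, done on paper before running:
#   step | x before | x after
#     1  |    1     |    2
#     2  |    2     |    4
#     3  |    4     |    8
#     4  |    8     |   16   <- 16 >= 10, so the loop exits
x = 1
while x < 10:
    x *= 2
print(x)  # 16, matching the trace
```

The value of the exercise is exactly the blind spots it exposes: if your paper trace had predicted 8 or 10, running the code tells you where your mental model of the loop condition was wrong.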
I did this workshop a decade or so ago where I learned my co-workers don't do this, and never did learn how they understand code otherwise. One of them mentioned he didn't even realize this was a thing.
I haven’t seen this pedagogical practice in any other introductory course I’ve seen since. I believe it’s a holdover from the early days of computing, when programmers didn’t have access to personal computers or even interactive computing, which meant that programmers needed to spend more up-front time on design. Think of the punchcard era, for example.
I teach introductory programming in C++ at Ohlone College in Fremont, and I have my students write C++ on Day 1, starting with “Hello World” and going from there without flowcharts.
?
This isn't a plug/whatever - just good content from an old friend.
https://www.youtube.com/@ProfessorHankStalica
Data, data, data :)))
Some basic notions to know: Input → Computation → Output
Information is omnipresent (this is just an intuition, not a claim). It serves as both input and output.
Computation—also known as a procedure, function, set of instructions, transformation, method, algorithm, or calculation.
In my early days, I ignored the fundamental notion of data and procedures. But eventually, it clicked: Programs = Data + Instructions
Watch Feynman on computing—he even starts with the same concept of data, introducing computers as information processing systems. And processing requires algorithms (i.e., instructions or procedures).
Programming is simply writing instructions for the computer to perform computations.
A computer is just a machine for computing.
Computation is a general idea: a transformation of one form of information into another.
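The Input → Computation → Output view fits in a few lines; this minimal Python sketch (the data and the sum are arbitrary choices, just for illustration) makes the three roles explicit.

```python
inputs = [1, 2, 3, 4]        # input: information fed to the program

def compute(data):           # computation: instructions transforming data
    return sum(data)

output = compute(inputs)     # output: information produced
print(output)                # 10
```

Everything else in programming elaborates one of these three roles: richer inputs, more involved instructions, or more structured outputs.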
https://en.wikipedia.org/wiki/IPO_model
Richard Feynman Computer Science Lecture: https://www.youtube.com/watch?v=EKWGGDXe5MA
Old documentary on programming: https://www.youtube.com/watch?v=dFZecokdHLo
George Hotz video: what is programming? https://www.youtube.com/watch?v=N2bXEUSAiTI
https://denninginstitute.com/pjd/GP/gp_overview.html
https://htdp.org/2003-09-26/Book/curriculum-Z-H-5.html#node_...
https://www3.nhk.or.jp/nhkworld/en/shows/texico/
Now they teach the language, but you just ask agents to check the accuracy of the code and rarely read it.
Only a few devs ever wrote new algorithms, and only a few will now write the actual new code. These few don't need courses, but all the other devs need to pretend they are part of these "few," so they need all the courses, just in case...
I can see why many people wouldn’t like it but I like it because of this. It’s not my style and I wouldn’t do a site like this for myself but it represents a time when the web was an extension of personal expression.
This appears to be the author's personal style and taste. I'd wager the software he writes for others doesn't look like this, so this is probably a creative outlet. I don't need to look at yet another shadcn/Tailwind cookie-cutter SaaS site when reading someone's blog.
Look on my works, ye Chrome Heathens, and despair!
Just "hey nobody can understand why that line is the way it is, what should we do about that" is probably one of the basic building-block skills of developing on a team, and you teach it wholly by abusing prima donna cowboys until they write something legible or quit.
If you want to be a well-paid developer, learn leetcode.
The more reasonable coding interviews I've had, on the other hand, I could have solved before even going to college. Those companies were (rightfully) just trying to get a sense for whether you can write and debug code in the real world, not memorize algorithms.
You can get all these fundamentals for free and probably better from an LLM.