
This is pretty ridiculous, man. I don't think I know a beginner programmer who would be so stuck on "every character matters." (Which isn't even entirely true in many languages - semicolons in JavaScript and Python? Whitespace in languages besides Python?)

The way I would explain it is to have them imagine writing a code tokenizer and interpreter of a simple language themselves. That's what the intro CS class I took at Berkeley, 61A, had us build with a subset of Lisp, with a lot of help, of course. I don't think we needed to know how to use anything but strings, functions, and arrays, although it did involve recursion. This problem will never be broached again once they realize there's code reading their code. Of course it's arbitrary.

(project, in case you're curious: http://www-inst.eecs.berkeley.edu/~cs61a/fa14/proj/scheme/)
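To give a sense of how small that "code reading your code" can be, here's a rough Python sketch of a Lisp-style tokenizer and reader (my own illustration, not code from the 61A project): pad the parentheses with spaces, split on whitespace, then read the tokens recursively.

```python
def tokenize(source):
    # Pad parens with spaces so split() isolates them as tokens.
    return source.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    # Recursively build nested lists from the flat token stream.
    token = tokens.pop(0)
    if token == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)  # discard the closing ")"
        return expr
    # Treat anything numeric as a number, everything else as a symbol.
    if token.lstrip("-").replace(".", "", 1).isdigit():
        return float(token)
    return token

print(parse(tokenize("(+ 1 (* 2 3))")))
# → ['+', 1.0, ['*', 2.0, 3.0]]
```

Once a beginner sees that their program is just a string some other program splits apart character by character, the "why does one character matter" question tends to answer itself.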



You're exhibiting exactly the phenomenon the OP was talking about.

> The way I would explain it is to have them imagine writing a code tokenizer and interpreter of a simple language themselves.

For instance, the OP had a part about how a beginner doesn't know that `|` is called "pipe". It follows pretty logically that they are significantly less likely to know what a tokenizer or interpreter is, let alone be able to imagine writing one. Your intro CS class at Berkeley where you did this stuff in Lisp was catering to (in the OP's parlance) early programmers, not beginner programmers.

The whole point is that there's a difference between people who know nothing and people who know things but need practice applying them. There are lots and lots of people who don't know anything about programming, whether you think it's ridiculous or not.


Indeed, CS 61A had a prerequisite placement exam which tested students' ability to write a recursive program. Everyone in the class had already passed that, otherwise they were sent to CS 3. So those students were definitely "early programmers" in this sense.


> This is pretty ridiculous, man. I don't think I know a beginner programmer who would be so stuck on "every character matters." (Which isn't even entirely true in many languages - semicolons in JavaScript and Python? Whitespace in languages besides Python?)

I guess YMMV...but most beginners I've worked with are confounded by why code interpreters are so literal. The double-equals sign versus a single equals sign is one prominent example...it's not that they can't understand why rules exist...but they seem to think the negative consequences (complete program failure) outweigh the tininess of the error.

After dealing with `=` vs `==` errors in beginners' code...something that I almost never screw up on my own as a coder...I've begun to respect the convention in R to use `<-` as the assignment operator...


Notably, Pascal uses `:=` and is one of the better first languages in my opinion for many more reasons (easy-to-grasp language core, simple non-null-terminated strings, no actual need to learn pointers until the very advanced stages). Today I mostly advise other people to start with Python though, because Pascal feels somewhat dated and undertooled.


I think Python's behavior of disallowing assignment in expressions is good enough to avoid those mistakes; it's an anti-pattern anyway in the vast majority of cases.
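A quick sketch of that point (`is_valid_python` is just a hypothetical helper for illustration): Python rejects the classic `=`-in-a-condition slip at compile time, so the bug never silently runs.

```python
def is_valid_python(snippet):
    # compile() parses the snippet without running it;
    # a SyntaxError means the parser rejected it outright.
    try:
        compile(snippet, "<example>", "exec")
        return True
    except SyntaxError:
        return False

print(is_valid_python("if x = 5:\n    pass"))   # assignment in a condition: False
print(is_valid_python("if x == 5:\n    pass"))  # comparison: True
```

Contrast this with C, where `if (x = 5)` compiles fine (at best with a warning) and then always takes the branch.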


> I guess YMMV...but most beginners I've worked with are confounded by why code interpreters are so literal.

I can relate. I remember when I first started programming (a little less than a year ago) I spent 2 hours trying to debug a simple function which was working fine in my IDE as I stepped through it but was failing a unit test provided by my class on R. It turns out that the instructor never explained the concept of ending functions with return statements before giving the assignment. I vividly remember screaming something along the lines of "why the fuck can't it just understand what I'm trying to do".


"what's a tokenizer? what's an interpreter?", said the beginner.


I think what's of even greater note is:

> I don't think we needed to know how to use anything but strings, functions, and arrays

Strings, functions and arrays comprise a huge amount of information. Many C programs of rather frightening complexity and functionality could be written with just those primitives.

When we think of a beginner, we have to imagine that they have the computer science knowledge of a child. Would you ask a child about strings and functions?


When I was new to programming, and almost 100% self taught, I wanted to make a game.

It was Q-BASIC, in about 1998. I made a top-down space shooter where you fly a ship, and meteors and other objects come down the screen and you shoot them.

Somehow, I had missed or glossed over the part of my self teaching that included arrays. I didn't know what that was. My program was written with variables like $ax1, $ax2, etc (asteroid X position 1), and collision detection was a big pyramid of "if $ax1 > $ax2 AND $ax1 < $ax2 + $awidth ...".

I was always wondering how someone would write a program where there was some configurable number of things on the screen? What if I wanted to crank up the difficulty and have 20 simultaneous things moving!?
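For anyone who hasn't made that leap yet, here's an illustrative sketch (in Python, not the original Q-BASIC) of what the array version buys you: the count of on-screen asteroids becomes a single configurable number instead of one hand-named variable per asteroid, and one overlap check replaces the pyramid of per-variable ifs.

```python
# Hypothetical example, not the original game's code.
NUM_ASTEROIDS = 20   # crank the difficulty by changing one number
ASTEROID_WIDTH = 8

# One list replaces $ax1, $ax2, $ax3, ...
asteroid_xs = [12 * i for i in range(NUM_ASTEROIDS)]

def hit(shot_x, asteroid_x, width=ASTEROID_WIDTH):
    # A single reusable overlap check.
    return asteroid_x <= shot_x < asteroid_x + width

shot = 30
print([i for i, ax in enumerate(asteroid_xs) if hit(shot, ax)])
# → [2]
```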

So yeah, you can get a lot done with some really basic stuff.


The way I would explain it to them is to imagine the compiler as being Dustin Hoffman's Rain Man. He'll do exactly what you tell him to do, provided you say it in exactly the right way, because he takes everything 100% literally.


I can't edit my original comment but I'm surprised it was unpopular and the repeated concerns are so petty. I can respond to most of the things you guys brought up:

@sanderjd, you may think I'm exhibiting what the article was talking about, but I'd like to know if you have any other qualms besides using the terms tokenizer and interpreter. +@kevinschumacher. Obviously I'm not going to just tell them "ok now let's write a tokenizer and interpreter." The beginners I know know what (non-programming) tokens are and I'm sure they would be 100% comfortable with an explanation. Same thing with an interpreter - they're using the god-damned thing from the start, if they start with anything other than C or Java.

@danso, @Thriptic - I think I know what's happening. You're conflating frustration with confusion. I guess I shouldn't have wasted my time. Every beginner programmer, especially, but hell, even an experienced programmer (! think about this - of course we do), gets frustrated with the character-by-character exactness of programming sometimes. That doesn't mean they don't understand it or think it's totally unreasonable. Very different.

Thanks guys for explaining the downvotes. Let me end by reiterating something I've said a few times now since the beginning - this is based on beginners I know. I.e., my bio and econ-major friends taking the same 61A class I did. I'd like to know, @sanderjd, whether you have some extra information about the class that I don't? Because I took the class 3 years ago and I know at least half the students are beginners. This is the first class people learning how to program seriously take. I don't know what else I need to say to convince you.

@danso the reason why I'm spending so much time on this is because I'm a student teacher thinking about going into programming education as well. I think it's very important to make the distinction between confusion and frustration. Otherwise, your efforts may be futilely spent on explanation when all they want is to get things working. If they're getting confused on things like "=" vs "==" or why that missing semicolon annoys the compiler, don't go back and explain that the interpreter is this thing with very strict constraints and everything you type matters. That's not the point. Explain what's wrong, why, and how to fix it!!! "Oh, in this language, we terminate lines with semicolons except blocks like ifs and loops," "= is for assignment and == is for comparison." They'll get better with practice.


bwy, I am writing this to you, from a current teacher to maybe a future teacher. I have a big issue with this line of your response:

> Otherwise, your efforts may be futilely spent on explanation when all they want is to get things working.

Now this attitude is fine in a work environment, or many other places. But this is death for learning. Learning is not about getting things to work, it is about understanding why things work, so you can apply that understanding elsewhere, to unrelated fields even.

So, for example, I do agree with you when you say: "don't go back and explain that the interpreter is this thing with very strict constraints and everything you type matters." But I disagree with what you say next: "That's not the point. Explain what's wrong, why, and how to fix it!!!"

What would be better, in my experience, is to lead the student to find out, for themselves, what is wrong, you can supply the why, and get them to figure out how to fix it. These are what we in teaching call teachable moments, random events which present an opportunity to give the student a deep learning experience, one which will stick with them for a long time.

Your "explain what's wrong, why, and how to fix it!!!" can be done via Google, doesn't add to a real learning experience, and can turn people into cargo cultists.


Sure. Of course explaining the why is most important.



