
I may tutor a student who has just started learning "school C", by which I mean that strange kind of C seen only in textbooks (and that, in my opinion, "cripples the mind", so to speak; but I digress). I need to quickly understand her misconceptions, since she says she has many doubts: she talks about "the char function" and "the for function", has "no idea what ++ is", and the like.

She seems bright nonetheless, despite great confusion about CS concepts (and, of course, decreasing engagement), so I'd prefer a very direct and practical way of getting at her errors, not spending a lot of time repeating concepts that might bore her. I'm quite good at reconstructing my pupils' mental maps from their answers and fixing them, but not so good at creating questions/problems that pose the right challenges and let the misconceptions provoke errors in their solutions, mostly because it all seems too simple to me, whereas many students apparently find this topic complex. The only thing that comes to my mind is generating a lot of situations and hoping to catch her being wrong somewhere, which is clearly not a systematic approach.

Let's say, in our terms, that I don't know how to identify the corner cases I ought to be testing, and I feel I risk following only the happy paths of her thought processes.

For example, I read somewhere that a common misconception about something like `while (relation) { ... }` is thinking that, whenever the relation becomes false, something happens (a break? I don't know) anywhere in the middle of the block. (I have a compelling feeling about why this specific case might arise.)

What are, from your experience (hoping not to make the question too subjective) or from the literature, the most common misconceptions and false beliefs of beginning students? What problems can I give her to extract the most errors?

I'm thinking of both reading and writing problems, from tracing execution to fizzbuzz- or rainfall-style exercises. Are there other important exercise types I shouldn't forget? Anything without code?

Sorry if the question seems boring. I looked for examples a lot and found only a few articles, none really satisfying; it seems I couldn't find a good catalog of misconceptions and the corresponding inquiry methods.

user9137
  • Have you established that there is a sufficiently operable mental model of how the computer itself works? Need not know about registers and low-level concepts, per se, but do need something to explain the command:effect relation. [Computer Science without a computer](https://csunplugged.org/) might be a good resource. – Gypsy Spellweaver Jan 24 '20 at 00:50
  • No, I haven't established that. I am wondering about an efficient method to make her wrong assumptions surface, *whichever they might be*. I could easily just explain how things are, but I would like her to see something herself, not just hear it from me. – user9137 Jan 24 '20 at 22:42
  • At the beginning you say that your student says "char function", "for function". Ask her why not: for can be a function or procedure. Most experienced programmers incorrectly call procedures functions. – ctrl-alt-delor Jan 25 '20 at 21:42
  • A common misconception is that mutation is a good idea (see `++`, `i=i+1`, `i=2; ... ; i=3`, etc.). It is not. It is confusing for novices; they just don't get it. And it remains one of the biggest sources of bugs for experienced programmers (who continue to use it a lot). – ctrl-alt-delor Jan 25 '20 at 21:43
  • Common problem is scope, in `if (a – Michel Billaud Jan 27 '20 at 06:54

1 Answer


I may add to this as ideas occur to me, but here is a starter version.

You don't state the general academic level of the student, so I'll assume secondary school or beginning university: the student has studied at least algebra and has some background in math.

The second really big misconception is that a program is just like math. In particular, in C, it is the belief that a statement like

x = y + 1;

is an equation to be solved rather than an operation to be performed. But this rests on the first misconception: that a "variable" in algebra is the same sort of thing as a "variable" in programming (in C-like languages). In fact, in algebra a variable doesn't vary; it is a fixed but possibly undetermined value. In C, the value can vary. In algebra, the line above expresses a relationship; in programming it is an operation. So, first, resolve those two possible misconceptions with examples, exercises, and explanations as necessary. Once you do that, ++y makes some sense as an operation.

The third big misconception concerns the fact that a computer executes an algorithm: a sequential execution of defined steps. The program counter (PC) keeps track of the "current state" of the algorithm. In a while loop, the test only gets executed when the PC reaches it in an unrolling of the loop into a sequence of operations.

My personal preference is to teach "variables" as "references to values", not as contents of memory locations. So

x = 5;

means that x refers to 5, not that x is 5. You then avoid the problem that, if the next statement is

x = x + 1;

the student thinks that five has magically become six. It is just that x now refers to a different value.


Caveats.

Note that I'm assuming C-like languages. In Lisp-like languages, algebra variables and Lisp "variables" are actually the same: undetermined/unspecified but immutable values. Also, in Fortran IV you could actually make five magically turn into six (constants were passed by reference, so a subroutine could overwrite one). A history lesson there, though.

And, of course, modern high-performance computers aren't strictly sequential anymore, with speculative execution at branch points.

In fact, with a bit of finesse you can discuss a C program with no reference to any actual machine at all. It defines an algorithm, full stop. The finesse is just the PC mentioned earlier.

Buffy
  • Your last comment about "without reference to an actual [target] machine" is how I generally prefer to explain things; I'd go as far as believing that the commonly used reference to a (usually unspecified) target machine is responsible for a big part of students' difficulties. I hadn't explicitly specified the student's level: beginning of high school, algebra knowledge still in development, not fully formed. – user9137 Jan 24 '20 at 22:14
  • Yes, I must admit that "assignment looks like an algebraic equation" was known to me as one of the common ones; anyway, I don't really know what to do with it, since it simply doesn't make any sense. What do students effectively deduce from such a meaningless reading? – user9137 Jan 24 '20 at 22:25
  • You need to show them what it means. The right hand side is evaluated first, then (and only then) the left hand side is given that value (or made to reference that value, if you like). Show how this makes sense out of x = x + 1 which is nonsense in algebra. – Buffy Jan 24 '20 at 22:28
  • Yes, of course, I show it and explain it at length if needed. What I am wondering is how they would put a wrong understanding of assignment as an algebraic equation to use. How could they use it to solve problems? What problems would immediately make them see that they believe a crazy thing? How would they interpret a piece of code if they think it is an equation instead of an assignment? – user9137 Jan 24 '20 at 22:37
  • _How would they interpret... instead of assignment?_ Wrongly. That's one of the fundamentals which must be established, cleared-up, or corrected, if writing, or reading, code is to be successful. If `x=x+1` doesn't make sense, they do not yet have the correct concept to work with, and you know one issue to "debug". – Gypsy Spellweaver Jan 25 '20 at 01:21
  • 1
    I know they are out of style, at the moment anyway, but the magic breaking of a while loop is probably cleared up with a simple flow chart. Of course, the flowchart makes a handy "student debugger" as well. Teach the creation of a flowchart, then have the student "chart" some code. The picture they create should clearly show where they "don't get it" from reading the code. – Gypsy Spellweaver Jan 25 '20 at 01:26
  • For the unfortunate tradition of "=" for assignment, just insist on the fact that it is a **usual notation** in programming languages, with very little connection to any mathematical notion of equality. Same for variables: names for parts of the context whose value/content may vary during execution. Make the students follow sequences of assignments until they no longer mix up a=b and b=a. – Michel Billaud Jan 25 '20 at 16:39