On programming

My bugbear with this, though, is that it seems reasonable to suppose that someone who has delved into highly abstract notions such as Turing machines, Kleene’s theorem, the Halting Problem, P- and NP problems, non-computable functions, and so on, would also have the savvy to apply a similar kind of high-level abstract analysis to practical programming constructs in order to streamline algorithm designs. The situation is somewhat akin to an electronics engineer designing a control circuit for a manufacturing process using thermionic valves instead of semiconductor components. He can readily be expected to know better.

But clearly that’s asking too much in some cases.

It’s probably true that universities and academic institutions are partly to blame for not sufficiently emphasising such practical aspects. I think the greater part of the problem actually lies with the individual students who do only the bare essentials required to pass. It should be within a department head’s powers to tell a student to study something else if the student is not suited to the course. Unfortunately, education has been turned into a consumer product for which the student (usually) pays, and this disallows the exercise of such powers. University staff can rationalise it away by saying that the student is wasting his or her money. But it gets worse when industry experiences a shortage of specific skills, because such slipshod individuals will still find favour based purely on their qualifications. It’s a pervasive problem, especially here in SA.

'Luthon64

For me it’s more like having the engineer draw up a schematic of the circuit without requiring him to build it at all, and then never teaching him the manufacturing process or the steps needed to get the circuit into production: production streamlining, component cost analysis, etc. I have a feeling engineering departments are way better at this, witnessed by the fact that a lot of awesome developers I know actually came from the engineering sciences (usually electronic).

In the past I may have felt averse to calling software engineering … “engineering”, but I lost that reservation a long time ago. There are processes, procedures, guidelines, safeguards, ancient lessons lost in tomes: things that exist outside of “coding” and even programming theory, and that don’t get instilled in Computer Science courses. Or maybe they do get presented (back in the day we had some highly theoretical lectures on methodologies), but they’re not demonstrated as a practical implementation or required for completion.

There’s also a general feeling I get that computer “people” tend to think they are the hottest shit since JC himself, and take it upon themselves to “revolutionize” something that is perfectly stable and well engineered, with their own half-assed hack that throws away a lot of established wisdom. Then they go “IT’S REVOLUTIONARY! EVANGELISE!”, when in reality it’s the same, just a bit different.

I’m always in awe of the computing pioneers, who foresaw and prototyped things that a lot of average software people today think are “new” but are in fact old hat.

I give a programming course to bridge the gap between graduates and business needs. Students are required to write 4 programs in 4 days. They’re relatively easy; program 1, for example, is to write an algorithm that calculates the standard deviation of a set of numbers read from a file. They must write their own algorithm, not just call something in the standard library like java.lang.Math.

I fail between 30% and 80% of the class every time. Worse, programmers who are already employed and earning a shitload of money are failing too. Interestingly, one employer has tried to pressure me into being more lenient. You would think business would be happy with an opportunity to dodge a bullet.

My results suggest that universities are pushing through people who have no aptitude for programming.

Er sorry, but that’s scary. All of it. Do you give your students a mathematical formula that they then need to implement as an algorithm? Or do you take them through a theoretical example which they then essentially have to mimic in slightly altered form?

'Luthon64

I give them the mathematical formula that they have to implement as an algorithm, along with an example taking them through the steps of the standard deviation formula. They can’t do it and yet they are “programmers”!

They complained to management because the course is not about the algorithms; it’s supposed to be about how they plan their work, give estimates to management, do designs, follow best practices like automated unit testing, report on progress, and so on. The students can’t even do the algorithm, so they lose the ability to focus on the course content. The minimum requirement is that they submit a working program; only then can I evaluate all the other parts they must submit.

I stood my ground and told management that if they can’t write simple programs I will not pass them.

A few years ago I was chatting to a student who had just been hired by the CSIR. This chap had done a BSc degree at Wits and his main subject was industrial maths (I think that’s what it was called). So I asked him what language he used to solve the industrial-maths-related problems. In utter amazement he looked at me and said: “English”.

Gah, one can but hope that anecdotes like these are the exception rather than the rule.

benguela, do you prepare your own course materials or do you use a curriculum compiled by an institute/authority you work for? The reason I ask is because within the scope of four programming assignments, one could come up with less highbrow but similarly instructive tasks than implementing a standard deviation calculation. It is a law of the universe that 98% of people’s eyes will glaze over when they see something like σ² = (Σxᵢ²)/N − (Σxᵢ/N)².
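(For anyone in that 98%: it’s just the shortcut form of the variance, equivalent to the definitional one. Writing μ = Σxᵢ/N for the mean,

σ² = Σ(xᵢ − μ)²/N = (Σxᵢ²)/N − 2μ(Σxᵢ/N) + μ² = (Σxᵢ²)/N − μ²,

and that last μ² is exactly (Σxᵢ/N)². Handy, because both sums can be accumulated in a single pass over the data.)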

'Luthon64

Count me in the 98%, you horrid superior beings. :wink:

The course is from Carnegie Mellon University, so I can’t change it.

What I do is run through the exercise using Excel to demonstrate the formula. I type numbers into a column, then the students watch me entering the formula into an Excel cell. I do not know how to make it simpler than that.

In another assignment they have to write their own linked list, i.e. they can’t use java.util.LinkedList. I go into detail about what a linked list is and show how it differs from an array: you can insert in the middle without having to shuffle everything along. This has been even worse than the standard deviation; it seems to be a particularly difficult assignment. The programs do get “harder” :slight_smile:
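For reference, a minimal sketch of the kind of thing this assignment seems to ask for; the class and method names are my own, not from the course materials:

```java
// A minimal singly linked list of ints, written from scratch
// (i.e. without java.util.LinkedList).
public class IntList {
    private static class Node {
        int value;
        Node next;
        Node(int value) { this.value = value; }
    }

    private Node head;

    // Append a value at the end of the list.
    public void add(int value) {
        Node node = new Node(value);
        if (head == null) { head = node; return; }
        Node cur = head;
        while (cur.next != null) cur = cur.next;
        cur.next = node;
    }

    // Insert after the index-th node: no shuffling of elements,
    // just two reference updates, which is the whole point.
    public void insertAfter(int index, int value) {
        Node cur = head;
        for (int i = 0; i < index && cur != null; i++) cur = cur.next;
        if (cur == null) throw new IndexOutOfBoundsException();
        Node node = new Node(value);
        node.next = cur.next;
        cur.next = node;
    }

    public static void main(String[] args) {
        IntList list = new IntList();
        list.add(1);
        list.add(3);
        list.insertAfter(0, 2); // insert 2 after element 0
        for (Node cur = list.head; cur != null; cur = cur.next)
            System.out.print(cur.value + " "); // prints: 1 2 3
    }
}
```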

’T weren’t no criticism. Just an observation, Sahib. Hosses fo’ cosses an’ all dat. :smiley:

Bugger… :wink:

Actually, this one’s probably more useful. Hmm, the only way I can think of to create a linked list using Java is if each node is itself an instance of the same object class that references the node after it (and the node before if it’s doubly-linked). Being strictly OO, Java doesn’t recognise pointers, which is how I’d do this normally. Then again, everything in Java except operators is a class — and thank Sun Microsystems/the JRE for garbage collection.

'Luthon64

See, not that difficult. But wow do they struggle with this.

Forgive my rude assumption about your knowledge here, but Java actually has non-object types: there’s both a primitive int type and an Integer class. A lot of the primitives you’d find in C/C++ are available as types in Java too: short, long, char, etc.

Also (memory, don’t fail me now…), unless specifically coded otherwise, object variables are just references, so you can get your “pointer-like” linked list by accident… A fact many Java novices don’t grok at first sight, until it’s “pointed” out.
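To make the primitive/wrapper distinction concrete, a throwaway example (the class name is mine; autoboxing, standard since Java 5, handles the int-to-Integer conversion):

```java
public class PrimitiveDemo {
    public static void main(String[] args) {
        int a = 42;    // a primitive: just a value, it has no methods
        Integer b = a; // the wrapper class: an object, autoboxed from a

        System.out.println(b.toString());        // Integer has methods...
        System.out.println(Integer.toString(a)); // ...an int needs a static helper
    }
}
```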

Are you saying that Java actually supports pointer types, even if only inadvertently? Surely you wouldn’t be able to dereference such a pointer explicitly in Java! And wouldn’t all of that fly directly in the face of the abstractions that OOP is all about!?

I’m hardly a Java boffin, not having used the language much at all. I could be wrong here, but it is my impression that in Java even “primitive” data types like int come bundled with some methods, e.g., “toString()”. If so, they’re hardly primitives in the sense of C, Pascal, BASIC, etc. Or is that the “Integer” class I’m thinking of?

As for object variables being references, that is true for all OO languages I’m aware of. It’s just that the languages’ syntax rules hide the pointer nature of object variables from the programmer. A disassembly of Delphi or C++ code (or Java to bytecode, or C# to MSIL) makes it quite plain that objects are always passed by reference in the compiled code. If the programmer actually intends passing an object by value to a function, the function transparently creates a local copy (not on the stack, though!) of the original object, which it then operates on and later destroys; but the original is still passed by reference, i.e. as the object’s base pointer.

What I was driving at is that Java, like C#, adheres very strictly to OO standards, whereas other OO languages are less strict. Although it is not good practice, one can mix OOP with straight procedural code in C/C++ and Delphi. As far as I know, you cannot do this in Java or C#, nor do other wild things such as directly accessing an object’s VMT; but you can in Delphi or C++ if you know how to use pointers and how objects are represented in memory. (You might actually want to do the latter for detailed on-the-fly debugging purposes in certain circumstances.)
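To pin down what that means in Java specifically: it is the reference itself that gets copied on a method call, so a mutation made through the parameter is visible to the caller, while a reassignment of the parameter is not. A quick sketch, with names of my own choosing:

```java
import java.util.ArrayList;
import java.util.List;

public class PassingDemo {
    // The caller sees this: both references point to the same list object.
    static void mutate(List<String> list) {
        list.add("added");
    }

    // The caller never sees this: only the local copy of the reference
    // is redirected to a new object.
    static void reassign(List<String> list) {
        list = new ArrayList<>();
        list.add("lost");
    }

    public static void main(String[] args) {
        List<String> items = new ArrayList<>();
        mutate(items);
        reassign(items);
        System.out.println(items); // prints [added]
    }
}
```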

'Luthon64

Nope.

Or is that the “Integer” class I’m thinking of?

Yup.

It’s just that the languages’ syntax rules hide the pointer nature of object variables from the programmer.

That’s my point: to make a new object you need to invoke “new” explicitly (IIRC). My comment was about novices who fail to realise that “MyObject b = a;” is not a value copy, but results in two references to one object (if it wasn’t a primitive).
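In code, the trap looks something like this (MyObject is just a stand-in name):

```java
public class MyObject {
    int value;

    public static void main(String[] args) {
        MyObject a = new MyObject();
        a.value = 1;

        MyObject b = a; // NOT a copy: a and b now reference the same object
        b.value = 2;
        System.out.println(a.value); // prints 2, not the 1 a novice expects

        MyObject c = new MyObject(); // a genuine second object needs "new"
        c.value = a.value;
        c.value = 3;
        System.out.println(a.value); // still 2
    }
}
```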

omg … reading this thread just made me realise again how glad I am I got out of programming a looong time ago.

Thanx guys!

P.S. I used to program in COBOL in the late ’90s and dabbled a bit with Java and C# while studying in the mid ’00s (nothing hectic though … reading the two posts above was like (wtf!!))

you programmers talk funny.

Okay, thanks for clearing that up. As said, I’ve not used Java much, and that was long ago.

Yes, I now see what you mean. When using pointers, the novice easily confuses the pointer (i.e. the reference) with the thing it points at. Assigning one pointer to another doesn’t magically create another copy of the thing the first pointer references; rather, one now has two pointers referencing the same thing. Just goes to show how easily familiarity lets one forget these teething problems. ;D

No, we do ~ :stuck_out_tongue:

'Luthon64

I’m busy giving my course at one of the four major banks. As I mentioned before, the first assignment is to read in a set of n numbers from a file and implement the formulas for calculating the mean and standard deviation.

After 3 hours of “coding”, this is what I got today from a “programmer” earning a shitload of money.

```
mean = (160+591+114+229+230+270+128+1657+624+1503)/10;

std_dev = (sqrt(((160-mean)*(160-mean)) +
    ((591-mean)*(591-mean)) +
    ((114-mean)*(114-mean)) +
    ((229-mean)*(229-mean)) +
    ((230-mean)*(230-mean)) +
    ((270-mean)*(270-mean)) +
    ((128-mean)*(128-mean)) +
    ((1657-mean)*(1657-mean)) +
    ((624-mean)*(624-mean)) +
    ((1503-mean)))/10);
```

He didn’t bother reading from a file, didn’t cater for n numbers, just hardcoded 10 of them, and didn’t loop through a data structure like, say, an array (wtf!!)

Seriously, this is the calibre of paid professionals that attend my classes. >:(
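For contrast, here is a minimal sketch of what the assignment actually asks for: read n numbers from a file, then one loop for the mean and one for the standard deviation. The file name and the whitespace-separated number format are my assumptions, not part of the course spec:

```java
import java.io.File;
import java.io.FileNotFoundException;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;

public class StdDev {
    public static void main(String[] args) throws FileNotFoundException {
        // Read n whitespace-separated numbers from the file.
        List<Double> numbers = new ArrayList<>();
        Scanner in = new Scanner(new File("numbers.txt"));
        while (in.hasNextDouble()) {
            numbers.add(in.nextDouble());
        }
        in.close();

        int n = numbers.size();

        // Mean: sum of the values divided by n.
        double sum = 0;
        for (double x : numbers) sum += x;
        double mean = sum / n;

        // Standard deviation: square root of the mean squared deviation.
        double sumSq = 0;
        for (double x : numbers) sumSq += (x - mean) * (x - mean);
        double stdDev = Math.sqrt(sumSq / n);

        System.out.println("mean = " + mean + ", std dev = " + stdDev);
    }
}
```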

For a supposed professional, that is truly shocking*. Does the offender have any other formal programming qualifications that you know of? I hope you have sufficient grounds to fail this pathetic attempt for not reading the instructions (read: “software specification”) attentively enough, if nothing else.

Then again, I look at the anorexic capabilities of some individuals in certain IT departments and I am forced to conclude that, to get and keep their jobs, they must be hellspawn in league with Lucifer himself, come to wreak havoc and disruption… :slight_smile:

'Luthon64


* When carrying code over several lines, an operator (like a plus sign) should always lead each new line. :wink:

Why?