Hey, I'll be appearing for job interviews and was wondering whether any of you have sat a Python interview, or been on the other end as an interviewer.
Can you please share the questions asked?
That would be a great help :)
I've been an interviewer many times, but the approach
we use is a technique called, amongst other things,
"situational experience" and involves questions such as:
"Tell me about a time when your project was running late."
"So what did you do to rectify things?"
"And what was the result?"
"So what did you learn that you would do differently?"
And so on.
A more technical example might be:
"When you are programming in <language>, what is the most
common error you make?"
"So what do you do to avoid it?"
"And what is the result?"
"So what did you learn that you now do differently?"
The key thing we are looking for is the individual's own contribution.
We don't want to hear "as a team we solved world hunger", we want
to hear what/how that individual directly contributed - they
planted 3 tonnes of miracle rice, or whatever.
Not sure if that helps?
Author of the Learn to Program web site http://www.alan-g.me.uk/
: Hey I'll be appearing for Job Interviews and wondering if anybody
: of you appeared for a Python Interview Or if on the other end as
: an interviewer. Can you please share the questions asked?
: That will be of great help :)
I would point out that there are many types of interviews. There's
the technical screen, which is what it sounds like you are asking
about, but there are other types of interviews that tend to focus on
drawing out your approach to problems or your mindset. With the
latter type of interview, I would only suggest that you know
yourself and your own experience well.
If however, you are worried about the technical content of an
interview, it is possible that having my list of questions may help
you. It may also hinder you, because the set of questions that I
ask may differ dramatically from another technical interviewer. We
are a fickle lot, prone to ask questions we (think we) know the
answers to, which may differ from what you know.
That said, here are some of the questions I have asked
when interviewing candidates for a technical screen--I would rarely
ask all of these.
* What's your favorite stdlib module? (And, why?)
* Distinguish a dict() and a set(). When would I use which?
* Distinguish a tuple() and a list(). When would I use which?
* What's the risk of 't = sys.stdin.readlines()'?
* What's an iterator? Why would I care?
* When should I use 'with'? Is there any advantage?
* What's a regex? Why not just use string matching?
* What does os.stat() return? For what is this useful?
* What's WSGI? Why would I use it?
* What are ElementTree and lxml?
* What's a decorator?
* What (unit) testing tools exist and how would I use them?
* What does 'raise' do? What does 'pass' do?
* Describe Python's inheritance model.
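To give a feel for the kind of answers some of those questions are fishing for, here is a small illustrative sketch (my own examples, not a marking scheme):

```python
# dict vs set: both are hash-based; a dict maps keys to values,
# while a set stores only unique keys.
counts = {"spam": 3, "eggs": 1}   # dict: when you need a value per key
seen = {"spam", "eggs"}           # set: when membership is all you need
assert "spam" in counts and "spam" in seen

# tuple vs list: a tuple is immutable (and hashable, so usable as a
# dict key or set member); a list is mutable and growable.
point = (3, 4)                    # fixed-size record
names = ["alice", "bob"]          # growable collection
names.append("carol")

# The readlines() risk: sys.stdin.readlines() slurps the whole stream
# into memory at once; iterating over the file line by line does not.

# 'with' guarantees cleanup (closing the file here) even if the body
# raises an exception -- the advantage over a bare open()/close() pair.
with open("demo.txt", "w") as f:
    f.write("hello")
```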
And, some others that are geared more toward those who have written
network (or SQL) applications:
* What's a file descriptor?
* What's a socket?
* How do I create a listening socket in Python?
* What's a signal?
* How do I talk to a SQL DB from Python? Any other DBs?
* What tools are available for calling an external process?
* What's a queue?
* What's a thread? Are there any (special) concerns about
threads I should have as a Python programmer?
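For the socket question, a minimal sketch (Python 3, binding to a throwaway local port) of roughly the answer I'd hope to hear:

```python
import socket

# A minimal listening TCP socket -- the sort of answer
# "How do I create a listening socket in Python?" is after.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", 0))    # port 0: let the OS pick a free port
server.listen(5)                 # allow up to 5 pending connections
host, port = server.getsockname()
print("listening on %s:%d" % (host, port))
server.close()
```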
If you have some familiarity with Python (particularly in a
Unix-like environment) many of these questions would be familiar to
you. I would get some idea of your facility with the language and
the underlying operating system from the accuracy and comfort with
which you answered. You might also find one or two of these
(mis)leading and might want to tell me about corner cases that you
as a developer have faced. That would also be interesting to me as
a technical interviewer.
Martin A. Brown http://linux-ip.net/
To expand on Martin's questions, when I've interviewed in the past, I've
asked (or been asked as an interviewee) questions that investigate
critical thinking, like those silly-sounding questions of "how many golf
balls would fit in a bus" or "how many windows are there in Seattle".
Those kinds of questions are meant to gauge how you think through a
problem, not necessarily coming up with the correct answer.
When giving an interview, I ask interviewees to write samples of code. A
popular one is to write the Fibonacci algorithm in code, or to write
a quick-sort or merge-sort algorithm. Being familiar with some of these
computer science principles will show your familiarity with lower level
understandings of how software works. It also helps distinguish people who
are self-taught and only know higher-level operations from those who
have gone to college/university for computer science or formal software
engineering.
Another favorite of mine was asking a candidate to write a piece of code
that took a paragraph of text as a parameter, and while maintaining the
order of the sentences, reverse the order of the words in each sentence.
So "The sky was blue. The grass was green." would become "blue was sky
The. green was grass The."
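For what it's worth, that exercise is short in Python. A sketch -- punctuation handling is kept deliberately naive: each full stop is stripped, the words reversed, and the stop re-attached:

```python
def reverse_words(paragraph):
    """Reverse the words of each sentence, keeping sentence order."""
    out = []
    for sentence in paragraph.split("."):
        words = sentence.split()
        if words:
            out.append(" ".join(reversed(words)) + ".")
    return " ".join(out)

print(reverse_words("The sky was blue. The grass was green."))
# -> blue was sky The. green was grass The.
```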
Good luck on your interviewing.
For your code samples for a Python (or XXX language) position, would you have candidates code them in C, or actually code them in Python (or XXX language)?
I ask because this would be fairly easy in Python (admittedly my interview solution would have problems handling punctuation), but probably a lot more complex in something like C.
Is C still the standard interviewing basis for the computer science basics (e.g. data structures, algorithms, etc.)?
Ramit Prasad | JPMorgan Chase Investment Bank | Currencies Technology
712 Main Street | Houston, TX 77002
work phone: 713 - 216 - 5423
If I were interviewing for a Perl or PHP position, then yes. However, if
I just wanted to see if they knew the algorithm, I'd let them use
whatever language they were most comfortable in, provided those of us
interviewing also knew the language.
I think C++ is more common now for data structures and algorithms.
In the UK at least it is almost universally Java nowadays.
C (and C++) are still used in industrial settings, especially
in embedded systems, but Java has come to dominate
academia and business applications.
On that note, I'd also recommend Joel Spolsky's "The Guerrilla
Guide to Interviewing." I'd hate to give the impression that all earthly
wisdom flows from Joel, but he makes some excellent points; even if you're
the job applicant rather than the person doing the hiring, I definitely
recommend reading it: http://www.joelonsoftware.com/articles/GuerrillaInterviewing[..]
Indeed, I personally dislike Java, I think it encourages
some very bad programming design habits, especially
in the OOP area, but sadly it is the de facto standard...
(And increasingly, so are the bad habits! :-( )
I despise it root and branch... but his point is a little different: Java
just isn't a hard enough language to separate great programmers from
plodders (neither is Python, for that matter) because pointers and memory
allocation are taken care of automagically. When you're hiring programmers,
(Joel says) you want people who understand what the computer is actually
doing under all the chrome, and you want people who are smart enough to have
actually passed classes where they had to do that stuff.
I don't want to sound elitist - I wish everybody would learn to program,
and I think Python is both a great learner's language AND a great language
for Getting Stuff Done - but when you spend your hard-earned money for
commercial software, or trust your computing life to an operating system,
you want to know that it was written by people who knew what the hell they
were doing, rather than people who scraped by in a Java School 'cause the
classes weren't too hard. We've all used software that was written by
non-programmers - I'm struggling with just such a system at the moment - and
life is just too damn short.
I fundamentally disagree with his stand on this.
I have employed quantum physicists and geologists who had no
idea of how computers worked, but they were plenty smart and
made good programmers.
A good programmer is someone who can think logically,
analyse and abstract a problem and express it clearly in
an implementation language (regardless of what language
it is, they will probably use several in their career!)
There is a very small set of programming tasks where you
need to understand the machine - developing an OS and
device drivers etc - but they are such a small part of the
industry that mostly we can be thankful that modern
languages hide the machine and let us focus on the
really hard stuff - understanding the customers world
and translating their requirements into code. Fred Brooks
identified this as far back as the 80's with his famous
"No Silver Bullet" article.
I served my own apprenticeship down at that level:
machine code, then assembler, and finally Pascal.
Then I found C and so on. But I am profoundly grateful
that I no longer have to worry about which register to
store the result of an addition or which memory mode
I need to use in a subroutine call. There are equally
complex challenges in higher order programming than
there are in programming the machine. There is a role
for both, but the macho "I can do it faster in assembler"
attitude that sometimes arises is no more than ignorance
of the challenges elsewhere. I've worked with highly
technical programmers who couldn't understand how a
Corporate General Ledger accounting system worked
and so couldn't program solutions for it. But they
could explain in detail how the multi-threaded kernel
in the computer operated.
So language preferences are fine if they are based on
the language features. Computing and Programming are
something else again.
And this is another matter again. But if you are talking about
share dealing systems or traffic control systems or factory
automation I'd rather the programmer understood algorithms
and the business functions than the difference between
page switching and banked memory access.
But I definitely want him/her to understand computing, and programming
in depth. I want them to have studied the subject deeply and have
a wealth of experience. Studying computing because it's an easy option
is not an option, because it's never easy. And anyone who starts down
that road will be weeded out very quickly.
Software can only be written by programmers, it's the definition
of the term. The issue is about whether the programmer was
trained in computing/engineering or whether it was someone
who just knew a programming language. Comp Sci was originally
a branch of math, and many of the best programmers I've worked
with came straight into the industry from math - but they had
to learn about defensive programming etc. But their algorithm
design often meant they had less to defend!
Seems to be my week for ranting... :-)
And I meant to add that this includes learning about the
virtual machine - the execution environment, the differences
between interpreted and compiled code etc. Also
understanding OS concepts like scheduling and file
systems etc. are necessary. It's just the low level memory
management/register access type stuff that I don't believe is essential.
And I do agree that we are seeing "programmers" who
don't understand the basics of computing even at a user level
and that is not good in an industry context. It's fine for
hobbyists but not for industrial grade programming.
Author of the Learn to Program web site http://www.alan-g.me.uk/
May I direct the interview question in another direction?
Are there some tests (I mean, just like examination tests) of
Python for beginners?
Ones which ask some basic but important questions (I do not want to
learn all of Python, it might be too huge for me),
and, most importantly, practice questions that come with answers.
I have learned it on and off and found it hard to achieve something
(I mean practice in daily work). But I have a wish to learn, so any
advice will be appreciated.
Not sure what you're saying here Alan -- are you saying you consider Java
"hard enough language to separate great programmers from plodders" or are
you saying that you don't agree with the distinction drawn (that some
programmers are plodders and others are not) or what?
Well, FWIW I would tend to side with Joel/Marc on this one (depending on
interpretation of what you're both saying -- maybe I fall somewhere in the
middle... whatever...) To try and clarify, I'd perhaps rephrase/paraphrase
their view as, "you want people who have some understanding of how the code
you've written actually ends up being executed by the computer in terms
of the effective time and space complexity (as well as other potentially
problematic aspects) of the code."
To me that seems to be a largely isomorphic expression of what they're
saying and perhaps closer to what they're actually trying to get at. Either
way, the point I'm trying to make is that even if you have for example some
awareness of the /apparent/ time/space complexity of *your* code, it's still
very easy to fall into a "Schlemiel painter's" type algorithm by not really
having an understanding of how (for example) some of the libraries or basic
functionality of the language you're using are implemented. To belabor the
point: The same pseudocode/algorithm realised in 2 different languages can
have wildly different performance characteristics depending on the nature of
the underlying languages and the exact form of the realisation. And you as
programmer will never even know unless you have a more than superficial
understanding of the language you program in and have some awareness that
these types of issues exist and what the different performance
characteristics of various algorithm classes are. And yet, many programmers
don't apparently have even a superficial awareness of attributes of the code
that they write, never mind how such a superficial analysis (e.g. ignoring
the platform/language) may differ from what really happens when executed,
and why such a difference exists.
Anyway, best regards and have a good weekend all,
I've been learning Python on and off for the past 3 years, as a hobby.
I am over 50 years old, so will never be a programmer. However:
1/ I've done a bit in Project Euler and have found many algorithms to get
prime numbers. There is one that is 10 times faster than any other that I
have found, it uses numpy. Unfortunately, I don't understand it at all.
However, neither would I understand Python's sort method, but I still use it.
2/ I used to be able to take a car to pieces and put it back together. Today, I
wouldn't stand a chance.
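On point 1/, the fast numpy prime finders are typically vectorised versions of the sieve of Eratosthenes; a plain-Python sketch of the same underlying idea (illustrative, not the numpy code itself):

```python
def primes_up_to(n):
    """Sieve of Eratosthenes: repeatedly cross off multiples of each prime."""
    is_prime = [True] * (n + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # Every multiple of p from p*p upward is composite.
            is_prime[p * p :: p] = [False] * len(range(p * p, n + 1, p))
    return [i for i, prime in enumerate(is_prime) if prime]

print(primes_up_to(30))  # -> [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```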
I suppose that what I'm trying to say is that there will always be a need
for "experts" that know different OS's and how a computer works inside, and
there will also be a need for coders who code a programme that is needed at
a certain time.
Is there really a time that knowing that "list" is interpreted as
"10010001000100010010000100010010" is important these days?
Please flame me
Well, just because you're 50 years old doesn't mean you will never be a
programmer.
It's this awareness (that you clearly already have) that I submit is often
(sadly) lacking in many programmers. So the mere fact that you're
pointing this out tells me that you already have, at least, a feel/awareness
that not all algorithms are equal, regardless of whether you understand them.
No, but that wasn't IMHO the point being made. It's more about how (for
example) lists in general behave (e.g. having at least a feel for the cost
of various operations etc) and (by contrast) also how **Python's** list
implementation behaves (which is not the same as a classical linked list.
Python lists (CPython at least) are IIRC actually implemented as dynamic
arrays of pointers, which means that some operations don't cost as much as
they would do with a "true" nodular linked list implementation, while other
operations cost more etc. etc.)
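A small sketch of the practical consequence (collections.deque is the stdlib alternative when you need cheap insertion at the front):

```python
from collections import deque

# CPython lists are dynamic arrays of pointers: appending at the end
# is amortised O(1), but inserting at the front is O(n), because every
# existing pointer has to shift along. A deque offers O(1) insertion
# at both ends instead.
items = []
for i in range(5):
    items.insert(0, i)       # O(n) per call on a list

front = deque()
for i in range(5):
    front.appendleft(i)      # O(1) per call on a deque

assert items == list(front) == [4, 3, 2, 1, 0]
```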
Best wishes (hoping this was not perceived as a flame, as it wasn't intended as one).
That was the point that Joel and I were making. The CS programs that have
become Java schools now make the curriculum as easy as possible because they
used to flunk lots of students, or lose them to other majors - they
obviously saw that as a Bad Thing, but it actually wasn't. A degree from a
school that flunks a lot of students actually means something; a degree from
a school where everybody passes is about as meaningful as a Participation
Trophy.
Not quickly enough! They should be weeded out IN SCHOOL, or before they
even commit to a computer-science track. It's cruel to students,
inefficient for business, and disastrous for consumers if they don't get
weeded out until they're already employed as programmers.
You knew what I meant; don't be coy. Anybody with a wrench and some pipe is
a plumber. Doesn't mean I'm letting him work on my dishwasher.
The point I was trying to make, which apparently I didn't state clearly
enough, was: Professional programmers - I really supported the effort, years
back, to protect the term "software engineer" - should be familiar with the
ins and outs of computers, not just with the quirks of the language they are
employed to use. To use my dishwasher analogy from a moment ago, I'm sure
we've all been visited by the appliance repairman (or auto mechanic, or
whatever) who only knows how to replace a single component, and who
therefore sees every malfunction as requiring a new control board. I don't
want him either! I want the guy who's worked on lots of appliances - not
just dishwashers, not just my model - because he's going to have a better
idea of how it all works when it's working, and what can go wrong when it's not.
At the same time - coming back to the theme of this group - I'm enthusiastic
about the idea of people learning to fix their own dishwashers, and - if
they love it, and get really good at it - becoming employed as appliance
repair professionals. I have now officially over-worked this analogy.
There were a couple of other points I wanted to answer, but I'm out of
time. It does seem that we mostly agree - certainly we agree that Java
isn't the heart of the matter.
The concept that knowledge/ability to use a language doesn't indicate
quality is one I agree with.
Here I disagree. A certain level of base knowledge beyond the
requirements of your language is required, true, but
a) I think that can be taken too far. I suspect a ton of truly great
programmers have never had to call malloc() and they are still good.
b) I think this is placing the cart before the horse.
To expand on that second point, I see a good programmer as someone
that thinks abstractly, that can bounce between big picture and
details, that considers concepts like reuse and flexibility without
extra effort. They are lazy enough to want to take advantage of
existing libraries and diligent enough to fix things the first time.
They have curiosity and insight.
A person like that will, in time, learn enough about the environment
and foundations of their tools to reap all possible benefit from it.
Studying those foundations will not make you one of those people, nor
will testing for knowledge of those foundations necessarily find you
one of those people.
And, frankly, I suspect a great many of those people will never
wrestle with when exactly their compiler performs tail call
elimination. But that's just my suspicion.
One issue I've not seen discussed is some of the specific habits the
language encourages. I've never been one to trash a language,
believing it's a poor workman that blames his tools, and that almost
all tools have their strengths, but having worked with Java (and Java
developers) for a while now I've really come to dislike some of the
practices that are becoming common: Stacked patterns without
understanding the purpose, premature and excessive abstraction,
elimination of verbs, and horrendous naming habits. I'm curious to
see if any of these habits change if/when Java adds functions as
Brett Ritter / SwiftOne
Yes, I'm saying the language just isn't that significant.
No, there are plodders and greats, but I'm disagreeing about
what constitutes great.
That's where I disagree: you might occasionally need a few
of those, but not often and not many.
And I do want them to have studied the subject and be
qualified - either by exam or by experience.
The efficiency of an algorithm is one thing. The way a computer
executes it? That's much harder to guess since much depends
on the compiler/interpreter. A good algorithm is usually independant
of those things. (So an algorithm that creates zillions of objects in
memory is a bad algorithm and you need to be aware of the impact,
but you don't usually need to be aware of how the computer is creating
those in memory.)
I would say that, slight differences depending on language aside, the same
algorithm will generally have the same *relative* performance
regardless of language. The quality of the optimiser is likely
to be far more important. And in most real world scenarios
the quality of data structure design and database choice and
network usage are far more likely to cause performance issues
than the code. I'd rather have someone who can design a good
code structure than someone who can write "tight" code any day.
I'd have agreed with that 10-15 years ago. Nowadays that's
rarely an issue. I haven't had to deal with those kinds of issues
in a project for at least 10 years. I've had lots of performance
issues to resolve, but the code is the last place I look.
Here we agree. Redesigning the algorithm (and the data
structures) are far more likely to solve a performance
issue than tightening the code to suit the CPU
characteristics. Tightening code can save a few seconds;
tightening the algorithm/data will save minutes or even hours.
I'd like to point out a blog post by Jeff Atwood asking why programmers
can't program. Not program well, but program *at all*.
Actually, the blog post title is provocatively wrong, since Jeff isn't
*actually* saying that 99.5% of working programmers can't program, but
that 99.5% of applicants for programming jobs can't write a simple
beginner program. Or sometimes even a single line of code.
(The correct figure for working programmers is more like 80% *wink*)
Hilariously, about half the people writing in to say they easily solved
the FizzBuzz problem actually *got it wrong*.
One commenter defended the can't-program-programmers by saying:
"The people who couldn't solve the fizzbuzz test you describe in your
article, might be great at solving well defined problems."
Presumably he meant well-defined problems *except* the fizzbuzz problem.
Follow the links in Jeff's post for more controversy, hilarity and/or
analysis of the problem of finding good programmers.
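For reference, since about half the commenters apparently got it wrong, a minimal FizzBuzz:

```python
def fizzbuzz(n):
    # Multiples of 3 -> "Fizz", of 5 -> "Buzz", of both -> "FizzBuzz",
    # anything else -> the number itself.
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

print(", ".join(fizzbuzz(n) for n in range(1, 16)))
```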
Not here in Australia you're not. You can only call yourself a plumber
if you are properly licensed and certified. That usually means having
gone through *at least* a five(?) year apprenticeship and that they at
least know that water flows downhill.
Well, that depends on what you mean by "familiar". I like to think that
any good programmer should understand that there *are* hardware issues
to consider, even if they don't actually know how to consider them. They
should know about Big Oh notation, and be able to explain in general
terms why bubblesort is so slow and quicksort is usually fast but
sometimes becomes slow. If they can actually calculate the best/worst/
average Big Oh running times for a function, that's a bonus.
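To illustrate the kind of explanation I'd hope for, minimal sketches of both sorts with their costs noted in comments (illustrative only, not production code):

```python
def bubblesort(xs):
    # O(n**2): every pass compares adjacent pairs over the whole list.
    xs = list(xs)
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs

def quicksort(xs):
    # O(n log n) on average, but O(n**2) when the pivot choice is
    # consistently bad -- e.g. first element of already-sorted input.
    if len(xs) <= 1:
        return list(xs)
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))

assert bubblesort([3, 1, 2]) == quicksort([3, 1, 2]) == [1, 2, 3]
```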
A good programmer should be aware that hardware cache invalidation will
make your code run slow, even in a high-level language like Python. A
*great* programmer will be able to tell you exactly which code will
invalidate the hardware cache, and what to do to stop it.
But let's not mistake ignorance for stupidity. An ignorant programmer
may have merely never learned about the difference between O(1) and
O(2**n) running times, by some accident of education and work
experience, but still be a good programmer. A stupid programmer still
writes Shlemiel the painter's algorithms even after having them pointed
out again and again.
Not such a good analogy, since modern consumer goods are getting to the
point where they are almost unrepairable except by replacing the control
board. It often costs you *more* to fix a broken widget than to throw
the machine away and buy a new one, e.g. monitors, TVs, DVD players...
That's also often the case with computers unless you value your time
very low. In my day job, if I have the choice in paying one of my junior
techs more than 4 hours to diagnose a flaky piece of hardware, I'd
rather just hit it with a hammer and replace the likely suspects
(memory, motherboard... whatever is likely to be causing the symptoms).
Obviously it's a sliding scale -- I don't replace a $14,000 server
because a hard drive is broken, but neither do I spend three days trying
to be absolutely 100% sure that a $60 power supply is flaky before
replacing it.
Coming back to programming, sometimes the right answer is to throw more
memory at a problem rather than to write better code. A GB of RAM costs,
what, $100? That's like what, 1-3 hours of developer time? If it takes
you three hours to lower your application's memory requirements by half
a gig, you might be throwing money away.
That's partly why we program in Python: use a relatively heavyweight
language environment (at least compared to C or assembly) that allows us
to be 10-30 times as productive for the cost of 10 times slower code and
twice as much memory.
Sorry Alan, you confuse me. Do you mean Java isn't that *insignificant*?
I think that depends on what you mean by "understand".
If you mean, should all programmers be like Mel:
then Hell No!!!
But I do believe that all programmers should understand the limitations
of the machines they're running on (in Python's case, there's a virtual
machine plus the real one), or at least understand that those
limitations exist, so they can avoid making silly mistakes or at least
recognise it when they do so.
I'm not talking about them knowing how to write assembly code, but
little things like knowing why the recursive versions of factorial
function and the Fibonacci sequence are so damn slow.
This is often harder than it sounds in Python, because the C built-in
functions are so fast compared to those written in pure Python that for
any reasonable amount of data it often is faster to use a O(n**2)
function using built-ins than O(n) code in pure Python.
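A small sketch of why the naive recursion is slow, and how memoisation fixes it (using functools.lru_cache here; the naive version redoes an exponential amount of work):

```python
from functools import lru_cache

def fib_naive(n):
    # Exponential time: the same subproblems are recomputed over
    # and over again down both branches of the recursion.
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_cached(n):
    # Same recursion, but each value is computed once: linear time.
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

assert fib_naive(10) == fib_cached(10) == 55
```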
I kept going way, way too long with the dishwasher analogy, but the actual
incident that was stuck in my mind was automotive: a few years ago, my
brakes started pulsing whenever I tried to stop, and the brake light was
on. The dealer said,
"You need a new anti-lock brake computer. That'll be $1000." That seemed a
bit steep to me, so I took it to Midas Brake and Muffler. They took a look
and came back with a cheaper diagnosis: "That'll be
$600." That seemed more reasonable, but when you have two clocks that don't
agree you should consult a third, so I went to my regular mechanic (where I
should have started). The real fault turned out to be a break in
the wire from the right front sensor to the computer. "We spliced the wire,
wrapped it in heat-shrink tubing and sealed it. That'll be $15."
Now, try to re-imagine my analogy with those three mechanics in the place of
programmers. Which one should I hire?
This approach may be acceptable for in-house development, or a case where
you and three other people use your program. When Microsoft and Apple adopt
this philosophy, it makes me incredibly angry - multiply that $100 by all
the computers running their crap software, and eventually it adds up to real
money. I truly think that one of the tragedies of modern software is that
the developers at places like MS, Apple, Adobe, etc. get their computers
replaced on a shorter lifecycle than most of the rest of us. I mean, really
- have you used Outlook or iTunes, or FSM help us Acrobat, recently? Makes
me want to open a vein.
And that gets to the point I was trying to make. I am ALL FOR hobbyist and
part-time programming - I would not describe myself as a genius programmer,
so it's a good thing that it's not my full-time job (although it's my
favorite part of my job!)
I damn well want geniuses, and nobody else, working on the software that I
have to use to make my living. It pisses me off beyond belief to have to
use some Schlemiel's efforts when I'm trying to put food on my family.* And
that is why, if I were hiring developers, I would be strongly tempted to
skip the resumes from Java schools (even if, FSM help me, my shop actually
developed in Java) - there may very well be great programmers who went to
those schools, but someone else can find them; I want the ones who've been
pre-sifted for me.
At no time have I advocated developing in assembler. I think that
programmers should Get their Stuff Done in the most efficient manner
possible. BUT: if you work for me, I want you to have sprained your brain
learning how the flippin' machine works. Then, when you come to work for
me, I will be ever-so happy to let you work in Python - because I know it's
the best way to harness your talent. But (although I love this list, and
wish everybody on it well) I would never hire a programmer who had only ever
used Python, even if I ran a Python shop.
Full disclosure: I am currently cranky on the subject of crap software (and
the crap programmers who produce it) because, in one of my non-Python gigs,
I've been struggling to update some templates in an Electronic Health Record
system (which shall remain nameless.) The template editor was clearly
written by a loosely-affiliated team of mental defectives, and it raises my
blood pressure every time I get near it. So I may be a little unreasonable
on the subject of quality software...
* pace G.W.
Surely he is saying that it doesn't make much difference to this which
language you are using?
I appear to have confused several folks :-)
I also have gone back and read Joel's article again and
although I still disagree with his insistence on pointers (and
recursion) as critical items I don't disagree with the general
drift of his article. I do think he over-rates pointers and recursion
though. Some of the greatest programmers I've worked with
come from a COBOL background with neither pointers nor
recursion in sight. But they could do stuff with file I/O that
would make your hair curl! And they knew every trick in the
book for processing data including tweaking database execution
plans and raw data file access tricks that most DBAs have
never dreamed of.
But the point I'm making is that being a great programmer is
about the ability, as Joel says, to think at multiple levels of
abstraction, but the machine memory level doesn't need to
be one of them. (Recursion I'm prepared to allow since it's
a more generic skill and applicable to whole classes of
problem that are almost intractable without it - even if you
do have to unravel it later for performance or scalability
reasons.)
And for that reason I have no issues with Java being used
as a teaching language any more than I have COBOL or
Fortran or BASIC. I don't like any of them for my personal
use but I've used all of them in anger (except Fortran) and
none of them offer any fundamental obstacle to me building
any algorithm I want, some just make it a little easier, that's
all. So they can use any language they like to teach stuff,
so long as they are teaching the right stuff. And that is
where many CS classes are failing - and, I think, what
Joel is really bemoaning - they don't actually teach CS
they teach "programming" in a particular language
(whichever it is).
There was a time when every programmer needed to be
aware of the resource usage of every bit of code, but those
days have long gone unless you are working on very small
embedded devices. The simple fact is that modern OS's, tools,
hardware and networks make those kinds of optimisations
premature at best and suboptimal at worst (many optimising
compilers can out-optimise most programmers given
straightforward code, but give them "optimised" code
and the end result is worse not better!). On the very few
cases you need to optimise at machine level you can take
your time and learn as you go, or recruit an old-hand who
remembers that kind of thing (in the same way the old
hands have to recruit the new-blood to grasp web concepts
and declarative languages etc)
Meanwhile, programmers are being asked to produce code
that is flexible and configurable more than efficient. It will
need to be highly maintainable because it will change
many times in its life (No room for "Mel" here) and it must
be done at minimum cost (software engineering not computer
science). A great programmer nowadays has to deliver on
a completely different set of demands than a great
programmer in the 70's or 80's. The goalposts have moved
and so must the standards by which we judge greatness.
There are common skills that are still needed. I'm just not
convinced that manipulating physical pointers is one of those
common skills (maintaining references OTOH is still valid
regardless of whether the reference is a memory pointer!),
or that it is the best way to teach those skills that are still
valid. Or even that they are the only way to teach multi-level
abstraction - how did my COBOL colleagues learn?
And is chasing an IBM ABEND any different from debugging a Python traceback?
Author of the Learn to Program web site http://www.alan-g.me.uk/