No-one disputes that Edsger W. Dijkstra was a genius, and a legend of programming. (Well, no-one except Rob Nagler.) It seems incredible to me that his idea of structured programming was once considered a foolish affectation, or even that there was ever any dispute about it at all: but I remember all too well reading arguments about it in the early-eighties issues of Practical Computing and Personal Computer World. Maybe it was an idea waiting to happen, or at least waiting to be popularised, but it seems we have Dijkstra to thank that the languages we do our work in are largely expressed in block-structured loops and conditionals rather than a maze of twisty little GOTO statements, all alike.
Dijkstra’s Wikiquote page is full of gems, like this one (which accords very nicely with Kernighan and Plauger’s key quote):
“The competent programmer is fully aware of the limited size of his own skull. He therefore approaches his task with full humility, and avoids clever tricks like the plague.”
I am right on board with that. I’m not so sure about his claim that “Object-oriented programming is an exceptionally bad idea which could only have originated in California”, but I find it amusing so I am willing to give him a pass on that one.
But here is where I think he badly misread a common situation:
“It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration”
First, to be fair, I should note that this quote is taken from a tongue-in-cheek short paper, How do we tell truths that might hurt? (1975) — the same paper that contains the much-misquoted observation “The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offence”, which is usually given as being also about BASIC. The whole paper is only 670 words long and well worth the couple of minutes it takes to read. It’s pretty clear from context that Dijkstra was exaggerating for comic effect; but it’s also clear that the thing he was exaggerating was something he really felt: that learning BASIC is of net negative value towards becoming a programmer.
But that conflicts with what we actually observe, and however beautiful a theory is, it really ought to be checked against reality every now and then. I know plenty of good programmers who cut their teeth on Microsoft BASIC on late-1970s and early-1980s micros. (That’s how I started out myself.) So why haven’t we been mentally mutilated? Dijkstra would probably say that we have been, and that had we escaped early exposure to BASIC, we’d be better than we are. I suppose that’s possible, and it’s not something that can ever be resolved by experiment. But I think the excellent ex-BASIC programmers that I know are excellent because of their exposure to that language.
I’m not arguing here that BASIC was better than it’s been given credit for. It wasn’t. It was horrible. The only available looping construct was the arithmetic FOR I=1 TO 10 loop; the only conditional construct was an emasculated single-line IF…THEN that couldn’t control a block; all variables were global and their names were restricted to two significant characters. If we wanted a WHILE loop, we had to build it using IF…THEN GOTO. Subroutines were sort of supported, through the GOSUB statement, but RETURN couldn’t yield an argument, so subroutine protocol had to be via changes in global state. In the snow. Uphill both ways.
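To make that concrete, here is roughly what a hand-rolled WHILE loop and a “value-returning” subroutine looked like (a sketch in the old Microsoft BASIC idiom; the line numbers, variable names and routine are mine, not from any real program):

100 REM A "WHILE N<10" LOOP, BUILT FROM IF...THEN GOTO
110 N=0
120 IF N>=10 THEN 160
130 GOSUB 200
140 N=N+1
150 GOTO 120
160 PRINT "LAST RESULT WAS";R
170 END
200 REM SUBROUTINE: SQUARE N, "RETURNING" THE RESULT IN GLOBAL R
210 R=N*N
220 RETURN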
And, kids, if you’ve used Visual Basic .NET or some other not-really-BASIC-at-all, please realise that we are not talking about:
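Something like this, say (a hypothetical fragment in the modern structured style, not a quote from any real program):

Dim total As Integer = 0
For i As Integer = 1 To 10
    total += i
Next i
Do While total > 0
    total -= 7
Loop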
We’re talking about:
0 sys866:goto10
1 c$=c$+" ":k=0:d$=n$
2 k=k+1:a$=left$(c$,k):ifright$(a$,1)<>" "then2
3 b$=mid$(c$,k):ifleft$(b$,1)=" "thenk=k+1:goto3
4 ifb$=""thenn$="***":goto6
5 b$=left$(b$,len(b$)-1)
But I’m arguing that the horribleness of BASIC was its virtue. It forced us to think around corners. It made us think through what the control structures really were, and how they were implemented. Most of all, it made us hold a huge amount in our minds at once. Programming in a language with no local variables means that you need to have a clear mental model of what every single variable is doing at every point in execution, for every possible code path. I am not saying that’s a good way to program. It’s not a good way to program. It’s a horrible way to program, and I am delighted that I don’t have to do it any more. But it’s a fantastic exercise. It develops mental muscles that you’re going to need all through your career and which you’re not likely to develop while doing any of the more productive activities you spend time on.
I think of it like tennis players weight-training. When a top player lifts weights, he’s applying more force than he will ever need to apply in a match. The point is not to make him able to lift, I don’t know, 16 tons or whatever it is that these guys can lift: the point is that when they have to apply a much lesser force in the course of a match, they can do it effortlessly. They have that side of things covered, so their concentration can go into higher-level matters like shot selection and anticipation.
Or think of a classical pianist. Someone playing Chopin’s Etudes will be going through technical contortions that no normal piece would demand; but because he has mastered those etudes, he’s better able to deal with the demands that performance pieces throw at him. (Admittedly this analogy is a bit spoiled by the fact that Chopin was a superhuman genius and his Etudes are glorious pieces of music in their own right as well as being technical studies.)
Anyway: like the weight-training tennis player, or the pianist who has mastered the Etudes, if our minds are stretched by dealing with programs stitched together with global variables and GOTOs, we’ll be better equipped to cope with the lots-of-stuff-at-once complexity that our real programming demands we deal with.
Not only that, but I think programming in BASIC stretches the mind in a specific direction that other exercises don’t, and that is a particularly relevant direction in modern programming environments. What I mean is that while the exercise of, say, writing a correct quicksort is valuable for developing the ability to focus right down on a single complex problem, it doesn’t really touch on the ability to think about a hundred things at once. And I feel like I have to do that all the time, thanks to the innumerable libraries and languages that have to come together to make a modern web application. The other day, I found myself working on a hunk of Perl embedded in HTML (by the Mason module), where the role of the Perl was to dynamically generate a fragment of a JavaScript function which was to arrange for specific elements of the HTML form to be set to various values, which were themselves taken by Perl from a database. I don’t think that kind of thing is unusual. But to do it right, you need to think about a lot of quoting levels at once — HTML encoding, Perl’s string encoding, JavaScript quotes. I don’t think any amount of structured programming helps with that kind of frankly messy problem. It’s hard to get right. (As anecdotal evidence in support of that assertion, I offer the information that I got it wrong twice myself before getting it right the third time.)
“But Mike, what about all the bad habits that you learn from a language like BASIC?” Yes, they are a problem. Those bad habits do need to be unlearned, and the trick is of course to dump those spaghetti-code techniques while still retaining the mental muscle-memory that you built up while using them. I guess most of us didn’t find that too hard: programming with WHILE loops comes pretty naturally.
So join me, please, as we raise a glass to the horrible BASIC dialects we grew up with; we curse the bad habits they left us (which we have hopefully long ago left behind) but thank them for having broadened our mental capacity.
And finally …
I leave you with John McCarthy, the creator of Lisp. (From here.)
Update (10 March 2010)
A couple of discussions of this article elsewhere on the web:
Not much happening at Hacker News.
Basic was my first language. In jr. high it would come off my fingers in a stream of consciousness: “Sure, all I have to do is FOR X = 1 TO …” I wrote a lot of MS BASIC programs in the ’80s and early ’90s that implemented stack frames in arrays.
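That trick looks something like this (a hypothetical sketch, not the commenter’s actual code): a hand-rolled stack in an array lets a GOSUB routine call itself, even though every variable is global.

10 DIM ST(255): SP=0
20 A=5: GOSUB 100
30 PRINT "5! =";R
40 END
100 REM FACTORIAL: PARAMETER IN A, RESULT IN GLOBAL R
110 IF A<=1 THEN R=1: RETURN
120 SP=SP+1: ST(SP)=A: REM PUSH OUR ONLY "LOCAL"
130 A=A-1: GOSUB 100: REM RECURSIVE CALL CLOBBERS A...
140 A=ST(SP): SP=SP-1: REM ...SO POP IT BACK
150 R=R*A
160 RETURN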
Once you’ve exposed yourself to enough of the awful consequences of something, you come to appreciate the connection. You also develop a heightened sensitivity to things that will keep you out of hell, like the idea that a loop should have an invariant. “Oh. Ohhhhhhh, thank you Jesus, thankyou thankyou thankyou thankyou blessed savior kiss kiss kiss kiss grovel!” & I’m not religious & you know which savior I’m thinking of (or it could be David Gries). Actually, hmm, both A Discipline of Programming and The Science of Programming are stacked on my CRT at the moment.
“Hey! Saint Peter?
Before you ring your bell,
Just been down to New York Town,
Done my time in hell,
Done my time in hell.”
(Flash and the Pan)
I think the reasons you should learn BASIC are the same as for learning assembler: It gives you a habit of Spartan thinking and how to work your way to real solutions with highly constrained resources.
That being said, I was born too late to have really programmed in BASIC (I vaguely remember half a high-school course on BASIC) and my first real exposure was with C++ and Java. I can’t say whether BASIC would have made me a better programmer.
I started programming on a VIC-20. Your post made me realise how many details of BASIC I’d blocked out of my memory since then!
I do agree with your comparison with training.
I remember when I was programming in RPL on the HP 48.
The language was fun, but it did require you to hold the entire state of your program in your mind, since pretty much everything was on the stack.
Ditto for assembly language: you had to keep your eye on the ball…
I agree with Basu: OO and other computer paradigms do not challenge you to program for highly constrained resources. I remember programming Basic in the mid seventies. The language was intriguing because of its until-then unknown similarity to human language. However, I was much more intrigued to get to the computer’s intestines, by peeking and poking around.
Basic is a human-readable language constructed similarly to assembler. When programming for speed and low memory usage, you need to be able to throw aside the rules of well-behaved programming, so I would argue that programming Basic actually is very beneficial. My professor in computer science pointed out that every mathematical problem can be programmed in a procedural language, without using fancy stuff like recursion etc. In fact (dare I say) I think programming has become too language-oriented, as if a simple task has to be written as literature. In my opinion a good programmer never forgets that it is bytes and opcodes that make a computer tick, not components and late binding.
In the same way as a former AI/Prolog lecturer of my acquaintance referred to C as “portable assembly language” (in a remark that was intended to be disparaging towards C and its users), BASIC is kinda-sorta like non-portable assembly. It generally maps pretty closely onto common processor instruction sets (assuming either a fairly rich BASIC, or the absence of SIMD extensions like MMX).
As a result, I believe that it’s pretty easy for someone who has programmed BASIC at some point in their career to get a good idea of what their algorithms will probably be forcing the processor to actually do, because they’ve probably written similar algorithms the hard way in the past.
Didn’t Knuth say something like “It’s possible to write programs without an understanding of the limitations of the underlying architecture, but your programs will look pretty strange”?
I started off with BASIC as well… in the early ’90s. I can’t say if it made me better or worse, or whether I subconsciously hate myself for not writing more of it nowadays. All I know is I did a lot of programming in it, wrote a couple of games, and learned a lot about hardware, computer architecture and data structures before letting it go. :-)
Without wanting to start a pointless flame war – I wholeheartedly, and very strongly, agree with Dijkstra’s referenced point of view on the mind crippling nature of BASIC syntax.
My first (circa 1969) programming language was Algol 60 – an elegant and beautifully structured language, which recognised the distinction between assignment and equivalence!
Since then I have, amongst other things, spent a great deal of time (actually over two contiguous decades) undertaking remedial work on ‘BASIC’ systems cobbled together by ‘how hard can it be?’ individuals enthused with the warm glow of blissful ignorance that arises from the prophylactic use of inane constructs such as ‘On Error Resume Next’, not to mention the problems of Evil Type Coercion (ETC) and Unimess!
Such remedial work – often under extreme time pressure – certainly paid the bills, but at the personal cost of having to look at great deal of very ugly code. Worse than neglect, what can only be described as ‘code abuse’.
As an example of what a ‘beautiful mind’ can devise – take a look at the Eiffel language developed by Bertrand Meyer.
Much too hard for very basic (VB) programmers!
ps. I feel very strongly about this!!!
I first learned fortran and basic on the mainframe at my dad’s office (it’s what I needed to learn to tweak the games there ;-)
when I started formal classes it was basic first, then machine code (not just assembly, machine code by flipping switches for individual bits on an altair 8000)
this training lets me keep in mind both what I want to do and what the computer can efficiently do and figure out a good compromise for getting the work done.
most people I work with (specifically including almost all of the professional programmers) have a much harder time understanding the entire picture of what’s going on and what the impact of scaling is going to be (both down to small hardware and up to monster hardware with hundreds to thousands of users)
I’ve always been grateful to get that early training.
Maybe I was lucky to have worked on DEC Basic-Plus, which did have the structured constructs WHILE and UNTIL, statement modifiers (which Perl borrowed), virtual (disk-based) arrays and multi-line defined functions. Most MS-BASICs were a bit of a come-down.
My first exposure to Basic was BBC Basic in late 1981. It was well structured, had procedures and functions, local variables and even the ability to assemble machine code. You could write a program with nary a GOTO / GOSUB or RETURN.
In VB.NET, is value1/value2 cast to floating point before the division is done?
I’d never write this:
result = (value1 / value2) * 100
but this:
result = value1 * 100 / value2
Ha, kids today with their fancy BBC BASIC!
Yes, I do remember that, although I never owned or worked on a BBC Micro myself. It seems that they were taking pretty big steps away from the BASIC of Evil that I’m remembering here. If memory serves, the BBC B was a better computer than the C64 in pretty much every way, and it was primarily Commodore loyalty that made me go for the C64 — that and the fact that I’d already accumulated a lot of Commodore lore from the VIC.
Thanks, Graham, for a different perspective. But remember, I was not at all arguing that the use of Horrible BASIC produced good programs; only that it produced good programmers.
And: “Take a look at the Eiffel language developed by Bertrand Meyer. Much too hard for very basic (VB) programmers!” Can a language really be classified as good if it’s too hard for most people to learn?
I started out with QBASIC on my stepfather’s old 486 (yeah, well, I’m a little younger. also, this was in the late 90s). In a way QBASIC got me started with programming, because it was there to play with and because it was simple enough to be grasped just by reading the help.
But what I do have to thank QBASIC the most for is boring me rather quickly, so that I turned to Assembler (again, by reading the help of some IDE I had pulled off the Internet or something).
Although I never actually did much in Assembler, the knowledge I gained about the inner workings of a computer is still with me in every single line of code I write. The mind-twisting necessity of re-building each and every code construct through conditional jumps makes understanding “modern” languages quite easy. Having worked with actual memory addresses makes C pointers trivial.
Alex, I laughed out loud when I read “… so BASIC is kinda-sorta like non-portable assembly”.
It reminds me of Bill Thacker’s observation: “C combines the power of Assembler with the portability of Assembler” :-)
I started on ZX Basic and moved to Assembler. By the time I finished with programming as a career I was designing hardware with discrete logic and interfacing directly with the processor ( building my own in-circuit emulators and even adding external commands to the CPU in some cases ).
I tried C but just couldn’t get my head into it. I understood it just fine, but the idea of struggling with its awkward syntax and its separation from the metal drove me nuts.
Only recently did I start working with Basic again. With FreeBASIC (a modern cross-platform structured Basic compiler) I found that I could use the same skills I developed 20 years ago and write some helper applications that run well on modern systems.
Basic has progressed from a clunky language to something that can stand side by side with newer languages, such as C, without sacrificing what made it so familiar. Sure, you can go back to the old syntax if you like, but there’s nothing to stop you writing a well-structured application either. And it still feels the same even when you do.
Basic’s like family. It’s nice to go back there once in a while, and it never forgets you ( nor you it ).
I too started with BBC Basic, so was shielded to a large degree from the tangle of ‘gosubs’ and ‘gotos’, not to mention that you could define local variables within procedures.
I think I turned out ok, but I wonder if it was nothing to do with the language and all down to the fact that resources were so tight – when you only have kilobytes to play with you adopt a mindset which I think still serves me well as a Java developer on server-side applications.
Back then as a BBC developer I always thought Commodore 64 BASIC programs looked unreadable :-)
Having learned BASIC (in my case, two BASICs: Applesoft BASIC and Commodore BASIC) gave me a perverted passion for global variables which, to this day, I find difficult to hold down.
But the memory-saving tricks necessary to work in BASIC on really small machines turned out to be useful knowledge when, relatively recently, I started playing with microcontrollers (16K of RAM! It is the Eighties all over again!).
What I like about Basic on old computers is that it was the operating system too. It had “immediate mode”: you could type “print 10” and it printed 10.
Simple programs were really simple. You did not have to open projects, specify platforms and options, or anything.
What I see today is that people not only don’t program well… they do not program at all. They don’t start. The first step they would have to take is too large.
Also, when I saw my first running 8-bit computer, it was drawing some graphs, painfully slowly… and that is when I decided to become a programmer. You generally could not do anything with the computer but program it.
These days people know games first .. they do like computers .. they do like to play with them .. but they have games for that.
At the old times, programming was the game.
Dr. Sid: you are so, so right. I may have to steal your comment and rework it as a blog post :-)
I started with BASIC in class 8 (year 8 equivalent). The particular flavour was GW-BASIC. Although it’s hard to recollect the syntax now, most of what we did was fun: generating Christmas trees, making the PC speaker play funny sounds, and all that kind of good fun stuff. I remember it was fun using GOTO, and it was easy to program in BASIC back then; more so because I think we didn’t have a real understanding of software and we just fooled around with simple problems.
Some years passed by, and one fine day I read that GOTO is considered harmful. Oh ho… now hold on, that’s exactly what we were using: we’d have a section of code, and on some logical condition it’d pass control back to a line using a GOTO statement. The stuff we did at junior school was going obsolete by the time we graduated high school; but it was never meant to be a tool for writing production code, or at least that is how my friends and I perceived it. We moved on to C++ in high school, and that, as we know, is another can of worms. Nonetheless, BASIC did make me think in a certain logical way; or rather, I’d say it helped lay some level of foundation for further studies.
Completely agree. Taught myself BASIC. Wrote spaghetti code. Lost track of things because of the spaghetti. Pretty much hit my limit as to what I could do because I wasn’t able to keep track. When I learned structured programming via Pascal, it fundamentally taught me how to organise code, and then I could see how I could write programs much bigger than I was able to in unstructured BASIC.
Hmm, the 8-bit BASICs I used in the 80s had WHILE…WEND constructs and arbitrary length variables, and local variables to subroutines. They also allowed for spaces in the code so it didn’t look terrible like your example (line 0? wtf! They go 10, 20, 30…). Look at BBC BASIC, widely considered to be one of the better BASICs, but even Locomotive BASIC wasn’t too bad.
BASIC was a good solution for limited 8-bit computers with small memories. Sure, these computers also got Pascal, Forth, etc, but not built in to the computer.
Most people remember it for typing in three pages of listings from a magazine, and then spending the rest of the week fixing bugs and typing errors by instinct alone (well, “Syntax Error in 1020”). Tell me that isn’t good for debugging skills!
I started programming in Basic, writing really large (800 lines!) programs as part of a fuel rod editing suite on an HP85, since the VAX was usually unavailable. That was fun, until I found that I had to _maintain_ the things as well. At that point I suddenly got converted to really rigid discipline in comments and syntax; it was the only way to keep the program structure understandable. I’ve stayed that way ever since.
OTOH, I developed some quirks that are less understandable; I now never use ‘I’ as a variable, for reasons related to a long two days of debugging…
I started with QBASIC, and I learned some very important concepts. For example, statements are executed in order. If you want to run the same code twice, you have to get execution back there somehow. Some algorithms are faster than others to do the same thing. Variables store data.
In today’s introductory programming courses, these concepts are usually still covered, but they’re covered in regards to a specific language. I’ve actually seen a student ask “Are these variables [in C++] like the ones in Java?”
I am glad to have started in BASIC. It serves as a nice example language for teaching the concepts of programming, without being hindered by suitability for large projects.
I completely agree…
I started programming at 4, when my father came home with an MC-1000 (an idiosyncratic, half Apple II, half Tandy Brazilian computer) and I started to program a little, moving on to a CP-400 (a Brazilian Tandy Radio Shack Color Computer clone) and a Hotbit (an MSX 1.0), programming all of them in BASIC. My only other contact with other programming languages was a little COBOL (at a computer school) and a little Assembler (from a book, for the MSX’s Z80 CPU), until I came across Clipper Summer ’87 and left BASIC behind.
But I learned so many things with BASIC: how to work my way around programming constructs (iterators, conditional loops, etc.), how to declare and initialize variables, and so on. I believe I learned more with BASIC in the ’80s than with any other language at any time (except when I learned C and C++).
I think that people nowadays are made lazy by the all-so-cool, all-so-ready languages like Java or Python or PHP. You no longer need to take care of memory usage or know about the inner workings of stacks or heaps or other structures. You use what the language offers you and are glad for the almighty Masters who offered it to you.
And this is sad AND problematic.
Think about when you need to program for, e.g., cell phones (with little memory) or mainframes (with all the complexities of resource usage), and you work it out like you’re programming a webapp or desktop app. It’s not the same thing. And things could go REALLY AMOK!
Sorry about my poor English; it’s just my 5 cents (or R$ 0,15, as I’m Brazilian).
If it had GOTO it was a bad language to teach.
Basic was my first language too. But one of the factors required to make a good programmer was already in the language. Take this construct from your BASIC snippet.
k=k+1
I read a research paper a few years back that suggested that the ability to differentiate the mathematical “=” from the programming “=” (assignment) was an indicator of what would make a good programmer out of high-school graduates.
(Mathematically, k=k+1 is obviously wrong, and the programming assignment operation is a difficult idea for some children to grasp.)
So if you figured that out back in those days, you were on your way to becoming a good programmer!
I’ve never understood what is supposed to be so difficult about k=k+1. It takes maybe twenty seconds to explain what’s going on, and then makes perfect sense. As for differentiating between assignment and equality: all programming languages do that: some by syntax (e.g. Pascal := vs. =, C = vs. ==) and some by context (e.g. BASIC).
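For the record, here is the disambiguation by context in two lines of BASIC (a trivial sketch):

10 K=K+1: REM HERE "=" IS ASSIGNMENT (LET IS OPTIONAL IN MOST DIALECTS)
20 IF K=1 THEN PRINT "AND HERE IT IS AN EQUALITY TEST"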
The article and discussion kind of remind me of this:
http://www.pbm.com/~lindahl/real.programmers.html
“…plenty of good programmers who cut their teeth on Microsoft BASIC on late-1970s and early-1980s micros. (That’s how I started out myself.) So why haven’t we been mentally mutilated?”
What makes you so sure you aren’t?
Each programming language, like each human language, shapes the perspective of those who learn and use it. BASIC shaped people’s perspective into “I must write short programs with large numbers of GOTO statements”, because they had to, and all of the example programs were written that way.
Our brains are not muscles that get stronger through exercise (this is a retarded sentiment held by Nintendo DS owners) …they are pattern recognizers. Programmers who learned to code in BASIC try to follow the BASIC pattern in all programs they write, regardless of language. Thus, learning BASIC does not help you learn something like safe coding practices in Java. Essentially, unless you would be using BASIC, you shouldn’t learn to program in BASIC. You will just end up a worse programmer in the language you actually use.
Can a language really be classified as good if it’s too hard for most people to learn?
Can a piece of music, say a Chopin Etude, be classified as good if it’s too hard for most people to play?
Tim: touché!
:-)
But although I LOLled at your comment, it’s not truly applicable: a piece of music can be judged as good or not by its effect on the listeners, not just on the person playing it. Whereas we judge a programming language primarily on what it’s like for the poor programmer.
Design Patterns are the new Goto for me. I’ve seen some serious spaghetti design. Good programmers do not get stuck in a paradigm. They figure out how to make the best with the tools they’re using.
Tim & Mike: The analogy is a bit off. Chopin’s music is the equivalent of the software. The instruments are the equivalent of the language. So the music complexity is associated with the complexity of the software design.
The ideal language for the problem-space would be capable enough to implement the design, while at the same time presenting the lowest possible barrier to entry.
Did you read Kurtz’s interview in the O’Reilly book?
http://oreilly.com/catalog/9780596515171/#toc
I think your remark about the development of mental muscle is relevant, but there is a limit to how much weight we can lift at any one time. Studies have shown that the human brain manages information in chunks, and that the average person can comfortably handle 7 ± 2 simultaneous chunks. I think this is the primary reason why unstructured programming felt so uncomfortable: you had to remember where all those trails of GOTOs led in order simply to understand the purpose of a bit of code. The legacy of Dijkstra is that by reducing structural complexity we are better able to comprehend higher-level abstractions, which should lead to simpler and more elegant solutions in code.
Dr. Sid, you are so right. I get really irritated whenever I read some stupid newspaper piece about some new generation of kids being “tech-savvy”. They are not. They are worse off than my generation with technology exactly because of the issues you raised.
Anyone under about age 25 or so has to really make an effort to be exposed to the command line, and most do not. Most schools do not teach any programming at all, as far as I can tell. For all its faults, exposure to BASIC on the VIC-20 and Commodore helped me immensely compared to the students I teach now.
“I don’t think any amount of structured programming helps with that kind of frankly messy problem. It’s hard to get right.”
That is not a strong argument; luckily, today we have solid JavaScript engines, plenty of Ajax libraries and tons of MVC samples.
You could easily dump your database-driven, Perl-generated JavaScript functions in favour of a couple of JavaScript functions pulling formatting/instructions from your DB via Ajax and manipulating the DOM accordingly.
It is not hard to get right, nor more complex than the actual implementation… probably the root of the evil is your app itself, being so tightly integrated with Mason.
Rod Hughes: sounds like the ideal “language” for performing the Chopin Etudes, then, would be a CD player :-)
My comment is that good programmers write good code and bad programmers write bad code.
I started in BASIC back in the day and have since learned dozens of languages. I personally have written some bad code (it doesn’t happen as often as it used to), and I have seen bad programming in all of them. No matter what language you are using, if you’re a bad programmer you are going to write bad code; no language protects against that. It can be corrected with education and effort, but that doesn’t change the fact that it was bad code when written.
I feel I have improved dramatically over my career and I see myself getting better in the years to come. Programming languages aren’t good or bad. They just are and each has strengths and weaknesses. Use them to maximize their strengths and minimize their weaknesses.
Personally I am glad I learned BASIC as my first language, because the syntax was easier to understand than getting all of the semicolons in the right place. So instead of getting caught up in syntax, I was coding.
Q, to my shame I had never even heard of that book (“Masterminds of Programming”). Looks very interesting, so I’ve added it to my to-buy basket. Thanks for the tip-off.
Why, what does he say?
Mike Taylor: LOL on the CD player! Sometimes COTS is the right solution :)
The world is not full of excellent programmers; it just so happens that, of the programmers, those who started with BASIC tend to be excellent. TBH I think this is more related to BASIC being a language people picked up on their own to learn how to program, rather than “learning” how to program at university.
I started with a mix of Apple II Integer BASIC and a BASIC language on the T.I.E.S. timesharing system, and I thought BASIC was nice for learning for two reasons:
(1) It gave you immediate feedback. You had no distracting compilation stage, since most BASIC implementations in those days were interpreters.
(2) It forced me to organize my thoughts when working on a larger program. I used to set up a comment at the top showing the line number ranges for each routine so I knew exactly where to go.
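Such a header might have looked something like this (a hypothetical sketch, not the commenter’s actual program):

10 REM 1000-1990  READ AND VALIDATE INPUT
20 REM 2000-2990  SORT THE RECORDS
30 REM 3000-3990  PRINT THE REPORT
40 GOSUB 1000: GOSUB 2000: GOSUB 3000
50 END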
A few years later I learned Fortran, Pascal, and a number of other languages, and they made me realize how poor BASIC was, but BASIC gave me a chance to further appreciate the structure that a language like Pascal could provide.
I never thought about what a mental exercise keeping track of the paths and global variables was. I now feel like my early days with Basic were not a waste of time, but rather an extremely beneficial mental exercise.
TRS-80 Users Unite!
I can only add an anecdote: I learned to program on Commodore BASIC, and then many years later, I took Dijkstra’s Mathematical Methodology course while I was an undergrad at UT Austin; he didn’t seem to hold it against me. :)
Mike: your comment on Eiffel implies that we should only use languages that are suitable for beginners. If teams of beginners typically produce large systems that are of high quality, reliable, safe, secure and correct, then this would be a reasonable position, but there is plenty of empirical evidence that the opposite is true: producing good software requires a certain expertise. Eiffel is an example of a language having features that, while they might complicate the task of writing small programs and require some in-depth understanding of programming, can be used advantageously by teams writing large-scale, mission-critical software. We accept that bridge- and tunnel-building requires expertise, why can’t we see that it’s the same for software?
Oh, you are mostly such children! I was enthusiastically going along with the pro-BASIC argument when I realised that my first language was FORTRAN, coding sheets mailed thousands of miles to be run because my country had no accessible computers; followed by Pascal, when being very polite and friendly might get your job run in as little as 4 hours, though the operator would remind you that you weren’t entitled to anything less than 48 hours. Oh, the joy of maxing out two of our credit cards for our own TRS-80 and the wonders of BASIC being interpreted instantly.
I wonder if at that date (1975) Dijkstra was mostly complaining about the harm done by getting instant feedback from an interpreted language (of which BASIC was the only common example) instead of the detailed deskchecks you did if you had to wait hours before getting the list of error codes which were supposed to explain why your program failed to compile. It didn’t take many such cycles to get a program debugged — or a potential programmer discouraged for life.
I could go on for hours about why BASIC is a good first language. The chief black mark against it used to be that there were so many bad programs written in it — but that generally isn’t held against C in the same way.
[Oh, and let’s not forget what we learned from programmable calculators.]
Dr. Sid, you are so right. Just yesterday I needed to create a program to perform a simple task. What would have taken maybe 30 lines or so in Basic took a solution, a project, a form and a save dialog in DotNet.
Andrew Raybould, you make a very good point (about Eiffel, but it obviously applies much more generally). Easy to learn is not always the same thing as easy to use once learned: a language like Eiffel may have a steeper learning curve than BASIC, or (to pick a more realistic example) Python or Ruby; but may bring correspondingly, or even disproportionately, greater rewards once that curve has been climbed. (And some languages don’t have a learning curve at all, they have a learning cliff. Yes, Lisp, I’m looking at you!)
Still and all, Graham Hay’s comment, which introduced Eiffel to this conversation, introduced it as “Much too hard for very basic (VB) programmers”, almost as though that were a point in its favour. It’s not, of course: if the learning curve is steep, that is not a reason to dismiss the language out of hand, but it is a strike against it, and it had better have plenty of Awesome to offer in exchange for requiring that large up-front investment.
BASIC is one of those languages that really should be part of any aspiring programmer’s background. Sure, C++/Java etc can teach you to program and be useful to an employer,… but BASIC was good for teaching you to be a programmer.
I think that is one of the failings of modern programming classes. They focus too much on teaching technologies that can be applied directly to jobs. A good teaching language is one that you HAVE to drop and never use in a production environment.
Correlation not causation?
I agree with you 100% that most programmers who cut their teeth on BASIC later went on to become good ones; I went down the same path.
But someone presented another theory: kids who started out programming in the 80s and early 90s, when BASIC was still used, were dealing with computers whose complexity was still not too great to understand. You knew your VIC-20 and early PC inside out, and by knowing most everything about how your machine worked you could program it well. But someone who started in 2001, with Windows XP and DirectX 8, had little chance to understand his system. So he learned programming by copying the patterns of others — like we did — but rarely matched that with an understanding of what goes on inside, unless he was exceptionally smart and had plenty of time. I know I wasn’t, though I did have a lot of time, so I managed fine.
If that’s true, we’re the last generation of programmers where being good was not an exception!
You are so right about this! BASIC existed (and still exists, inside very cheap computers you can buy from online auction sites) for very good reasons. Beginner’s All-purpose Symbolic Instruction Code. It may have been the most popular programming language in the 1980s but it was initially designed for beginners. Guess what: there are still beginners! And, there are very few simple programming languages to start with.
Natural progression I’ve observed:
BASIC -> Assembler -> C
This is the correct progression, all of the pieces are necessary to a complete understanding of what you are actually -causing to happen- when programming.
The funny thing about assignment vs. equality is that BASIC actually got it right!
BASIC used the syntax “LET k=k+1” which is really pretty clear about what’s intended. It’s other languages that have a problem with ‘=’ being confusing.
Davor Magdic, excellent observation — I should have said in the first place that the correlation we see might not be due to causation, so thanks for mopping that up. Yes, it might be as simple as this — that in the days of BASIC, the only people who programmed at all were those with a natural aptitude for it, whereas these days the market is flooded with people who have learned a certain amount of programming because it’s a lucrative career, rather than just for the joy of it. In the 1980s, those people would never have learned programming at all. Maybe that’s it.
I started learning to program in 1964, first Fortran With Format and then COBOL before moving on to 360 Assembler. To date I have used at least 35 different languages. Let’s face it, programming is just a matter of putting the comma in the right place. These arguments are like saying that you can’t ever learn German if you started out in English. One can be a bad programmer in any language, whether or not he uses a GOTO statement. I would hate to think that after 46 years in the business I had not progressed beyond the first languages I learned.
BASIC is like erecting a building made of cheap duct tape.
C is like building super strong but highly flammable duct tape out of canvas and adhesives and then using that to erect a building.
C++ is like building super strong but flammable duct tape out of canvas and adhesives and then using that to make flame-retardant puppets who form themselves into a building.
This is something I remember from the ’80s. WAY too funny.
http://www.pbm.com/~lindahl/real.programmers.html
Having cut my teeth on TI-BASIC (yes, from the 99/4) in the early ’80s, I’d just like to comment that Basic itself wasn’t really the issue: it’s the fact that those older languages were simply not the library-rich object-oriented languages we use today. I was a ‘plain’ C coder for 15+ years (until .Net happened!)… so the hardest thing for me was learning proper OO-style usage, coming from the old-school ‘linear’ approach to app design. Permanently damaged from using Basic? I say only ‘temporarily impaired’, because OO requires a different mindset/methodology.
I agree Matt M,
Once I understood structured programming and OOP models, even writing BASIC using user-defined functions could be done in a structured way. OOP of course was harder to implement, but with a set of global variables that were only referenced by specific subroutines (and ignored everywhere else) you could simulate the OOP model. Now I write in AutoLISP and in .NET, and those early years of GOTO and GOSUB are all but forgotten. I had learned Time Share BASIC and FORTRAN IV back in 1973, when I was 15. Programming has come a long way since then.
I think Basic is like any other language. Use it right, good things happen; use it wrong, good things don’t happen, some bad might.
I have written in about 30 languages through the years; my favorite is C, my least favorite C++.
I have no objection to object oriented; I just happen to think that C++ too closely resembles C; this leads converted C programmers into “bad” (by the standards of C++) habits that a newcomer to C++ would not have learned.
At the end of the day, it is the person, not the language, that matters. Thinking in an organized, logical manner is more important than the language used to express those thoughts.
One more comment. I remember reading something about different programming languages that always struck me as very funny and very true.
Pascal is for beginners, those that are learning programing and need to be guided along the proper path.
C/C++ is for the experienced programmer who knows how to write proper code and can test and find/remove bugs in their programs before the testers even get a peek at them.
ADA is for Criminals. (It imprisons you in the language with no hope of escape to mess something up even by accident!)
Matt M: I’m another TI-BASIC person! It’s worth mentioning that TI Extended BASIC actually had support for subroutines that you could call, so it actually did start to bring you into the “structured programming” era. The original TI BASIC, though, was just a shade fancier than the Dartmouth BASIC that was current at the time Dijkstra wrote his essay. Yipe.
And that’s actually a fairly important point: Dijkstra wrote his comment in 1975, but most of the BASICs that we’re familiar with came out later, with a number of refinements: longer variable names, named subroutines and so on. GW-BASIC is a far cry from Dartmouth BASIC, and QBASIC is as much a step over GW-BASIC as GW-BASIC was over Dartmouth.
I programmed in several BASICs as a child growing up. It’s not a bad language for *small* programs. The thing is that it just doesn’t scale.
As a language for teaching the, erm, basic principles of computing though–the concept that the computer goes through a series of steps, manipulating small pieces of state, making simple decisions along the way–it’s not so bad. It’s like learning the alphabet and block printing before getting to cursive, let alone learning proper grammar and spelling. Structured programming concepts are more at the level of “how to write a good essay,” whereas BASIC is more about “how to spell.”
As a grade schooler writing my first programs, BASIC was perfect for letting me write programs that did something simple and well defined. Just as you wouldn’t expect an 8-year-old to write a novel, or even a short story that’s more than a page, BASIC was suited to the kind of programs a kid that age might write.
Its low level nature also prepped me for learning assembly language. Knowing how computers work at a low level is very useful in its own right.
When I outgrew BASIC’s limitations, it wasn’t hard to move into Pascal and later C. I can’t imagine starting in either of those languages, especially as a grade-schooler, without having someone help me significantly, though. BASIC, however, was easy enough for me to pick up even before I had learned long division.
Categorical statements like Dijkstra’s are meant to be taken not only with a grain of salt, but also in a particular context. He might well have said instead: “I have found that exposure to Basic and Cobol in my students renders them harder for me to teach than if I could have started them out with my own particular set of predicates, ideals, and practices.” That is what “ruined” meant to him.
The human mind, motivated by the desire to create great software, can no more be harmed permanently by reading BASIC than a great writer could be greatly harmed by reading comic books. Nevertheless, I can just about see an English teacher bemoaning his class of illiterates happily reading comic books instead of grown-up novels with words and no pictures.
“Classics Illustrated Considered Harmful”; I can imagine the essay being written a dozen different times by various teachers.
Dijkstra was a talented scientist, educator, and human being. He would be happy that his ideas and opinions have a dialectical value, if not containing an absolute truth in the idea that BASIC harms you, they contain many valuable, important elements.
I started on Commodore 64 BASIC. I learned almost everything again when I moved to assembler. Then I relearned everything again for ANSI C, and Pascal, and Python… and… and. In short, learning is the only thing one really needs to learn, because learning can never stop. Anybody who can learn BASIC and nothing else was not harmed by BASIC; their brains are merely full, and they should be allowed to transfer to another degree or diploma program. Do not let them graduate with a comp-sci degree.
I liked what Basu said about “Spartan thinking and how to work your way to real solutions with highly constrained resources.”
The fact that really minimal legos could enact a thing that existed completely in your mind (the program didn’t resemble the idea at all) is important somehow.
Whereas in good software engineering and language design we try our best to make the program resemble what it is.
It’s important that you can take an idea and figure out how to make it happen; the idea that you should push your thinking toward the what, and that how-thinking is limiting, is also important.
I’ll bet programming in Eiffel is spartan in a very different, but good way. You know putting your thinking through that strainer produces a good effect. Whereas the rigamaroles of Java are so often just draining mickeymouse.
It’s interesting that Python is both really simply interactive like basic (“print 10” at the prompt just does) and also good for both clear expression and serious work. I think that breaks some of the dichotomies people make and it shows that language really does matter.
The BASIC -> Assembler -> C sequence reminds me of Minol (“Tiny BASIC with Strings in 1.75 K” by Erik T Mueller), in which variables were one byte, and [H,L] meant either a byte or a string at a 16-bit memory location.
I’ve used BASIC over multiple generations. I remember BBC BASIC, GW-BASIC, QBASIC… but I spent most of my time using Borland’s Turbo BASIC compiler.
Now THAT was an awesome language.
It had the lot: SELECT CASE, WHILE, defined functions and procedures, and, most importantly for me, the ability to incorporate assembly. This gave me the ability to write a true 3D game engine that ran relatively smoothly on a 486DX-33.
The key ability I gained from these experiences was being able to configure algorithms and processes to match the limitations of the language. That’s a good thing, because the simpler an algorithm is, the more robust it can be. It also means that you’re not too far removed from Assembly.
Machine code and assembly aren’t object-oriented; they’re basic logical operations: ADD, MOV, MUL, CMP, etc. Object-orientation and other trends are tricks and complications that remove you further and further from this basic logic.
My biggest influence I think was Michael Abrash…if you haven’t read any of his articles I think you’re missing out on valuable lessons on performance programming.
My programming slowed down when Windows became all-powerful, and I lost interest altogether when programming became a case of incorporating libraries of functions someone else had written. I liked the challenge of figuring things out for myself!
I started with DEC Basic… and wrote the game program that turned into Nukewar for Avalon Hill…
I developed a multiplayer version in RT BASIC, then converted it to Data General Basic; it allowed the players to communicate with each other with no external code needed, just using the way BASIC handled printers: to wit, I declared the other guy’s terminal the printer port, took the input, put it in an array, then “printed” it to the other station.
Got told by “real programmers” that it was impossible…
I stopped using BASIC when Visual BASIC came into the world. Why would I even want to trust that the other guy wrote his code as tightly as I do myself?
I used every BASIC variant for microcomputers in the ’70s and ’80s, from Coleco ADAM BASIC (which was really a Z-80 version of Applesoft) to Apple Basic, NorthStar Basic, Processor Tech Basic, TRS-80 BASIC (Model 1 to Model 4P) and Commodore PET BASIC, and yes, I even loaded the paper-tape version of Microsoft BASIC.
I also worked in Fortran, COBOL, PL/1, PILOT and APL, and even for a short time with HAL/S (the Shuttle on-board computer language).
It isn’t the language you use… it’s how you were taught to use it.
I was taught to write clean, well-documented code, to minimize memory usage, to give good error messages for input errors, and to trap obviously bad input (alpha in a numeric field, etc.).
And yes, the ID I use was created as part of my writing NUKEWAR. I’ve been using it since 1976, on a DECsystem-10 at Cerritos College in California.
It is more effort on the part of the programmer to write logic in BASIC than to do the same in a smart compiler like C++/Java; the smartness such a compiler brings in exchange for the effort is an asset. It also helps you to easily recognise the fallacies of BASIC when you move on to the more advanced languages.
Just as you cannot learn calculus before algebra, you cannot learn good programming if you start with advanced languages. And even if you do, you won’t learn the OOP paradigms; you’ll start with the POP (procedure-oriented) part, so it wouldn’t really help. But if you start with a less advanced language, chances are you’ll max out its capabilities, and so put your brain to more use.
Dijkstra is right about upstarts like BASIC. Assembler is barely acceptable, but real programmers stick to machine code.
This coming from the man who said that “Object-oriented programming is an exceptionally bad idea which could only have originated in California”. – Arunabh Das
My uphill both ways backwards in the snow: Algol 60 (1963), Elliott 4130 Assembler (1967), BASIC (1968), FOCAL (1969 on a PDP8e), writing my own variant of FOCAL (also PDP8e, needed to extend the language – no return stack!), designing my own Assembly languages and implementing micros to run them, more BASIC, then finally C, now the full gamut and enjoying very much JavaScript (a refreshing return to the pristine elegance of Algol 60, and where functions are first-class objects again, with real closures).
When BASIC is all you have, and you are learning, that is what you have to run with. This was the case pre-1982 or so for almost everyone, although pre-1969 most of the few programmers of the time did not even have that.
I think the most valuable lessons I kept from those times are: structure for readability, design for efficient use of resources, analyze your algorithm complexities. By the time I left hardware design for software design (around 1980?) I had already had enough language exposure to have realized that the language was not the world, just a particular perspective.
So I do not agree that exposure to BASIC syntax during the formative years is crippling per se. Back then. Today, yes: start high-level and structured. Those that move on to resource-limited systems can then explore the tiny interpreted languages like BASIC and FORTH, or pass Go and go straight to Assembler Hell. But that really is optional in these days of fingernail-sized SoCs like the Overo, which can be implemented on a board the size of a stick of gum, have USB 2.0, Ethernet, 802.11, Bluetooth and HDMI, carry 256MB of RAM, and run Debian Linux.
Pingback: See's Message » BASIC作为入门编程语言的价值
I still find BASIC useful – QBASIC, no less.
If I need some task doing once, I can spend an hour planning it in C++, another hour writing it, another hour debugging the mess of ASCII symbols till I find the one missing symbol which crashes the program…
…or I can spend 10 minutes writing in QBASIC, run it once, and move on.
Structured programming and OOP are reasonable (not perfect) ideas for large multi-programmer projects that take months to complete. For small utilities they’re a waste of time.
I’m another one that still uses QBASIC. I don’t do much programming anymore, so I don’t have the time or inclination to learn the flavor of the week. It’s great for doing quick, simple automation, and I don’t have to install a bunch of libraries and crap, mess with my paths, etc.
A little addendum: there’s this concept I know as the “cognitive loop”. The traditional example is the edit – compile – test – debug – edit… cycle of your standard IDE environment. The problem is the distance between learning what a problem’s solution should be, implementing it, and seeing it work. BASIC, along with FORTH and Smalltalk and some others, has a much shorter cognitive loop, in that what you type is executed immediately (if you want) and you can invoke and sanity-test stuff as you write it, without needing the full application infrastructure present to support it. This allows for what I call “exploratory programming”. Coupled with good instrumentation (which such systems incidentally allow one to quickly fabricate), this makes for very rapid development, because the idea in the mind is not de-chunked by having to mess with the mechanics of making it happen.
The downside of exploratory systems, where every action changes your environment, is that it has proved remarkably difficult to seriously support teamwork. My FORTH is not your FORTH, even if we start with the same distribution, and my debug tools will probably break yours. Smalltalk has seen some interesting attempts to solve this problem, notably in the banking arena (for some unknown reason), but I do not know how that really works in practice. Digitalk’s Smalltalk/V team did come up with an interesting fragmentation manager, which worked sort of OK for my personal projects, but there remains the question of the independently built instrumentation add-ons I made to my environment that are not part of the project fragment, so they did not travel to other programmers very well either. I stopped using Smalltalk when I discovered that its closures equivalent ([ … code … ] contexts) were global, so it was not possible to run multiple copies concurrently. They told me that was how the language defined it.
/What would have taken maybe 30 lines or so in Basic took a solution, a project, a form and a save dialog in DotNet./
You need Python, and (especially if you use Windows) the IDLE environment. For Mac and Linux, a command window lets you run Python just like back in the Good Old Days.
10 FOR I=1 TO 10
20 ? “CLAP”
30 NEXT I
The vast majority of the languages being bandied around here (Basic, Algol, Fortran, C, C++, Java, etc. etc.) all boil down to varyingly pretty ways to arrange assignment statements and an excessive concern with sequencing. Until we convince people that (a) programming per se isn’t really that exciting or important, (b) data is way more important and (c) what you want to have happen is far more important than how it’s going to happen, the state of the art is going to look increasingly like the state of the ark.
Tony D., I think I addressed some of your concerns in the post The difference between imperative and functional programming (https://reprog.wordpress.com/2010/03/11/the-difference-between-imperative-and-functional-programming/). But I also think your sentiment “programming per se isn’t really that exciting or important” is not likely to draw a lot of wholehearted agreement here. At least, put me down in the “it is too that exciting and important!” camp.
Hmm… I think what you really need to look at is the difference between declarative programming and imperative programming; declarative programming (whether it’s in SQL, logic languages like Prolog or functional/applicative languages like Haskell) concentrates on the what, not the how.
The reason behind the “programming per se…” comment is that a large amount of programming time gets spent explaining to either the language or some supporting infrastructure that you’re thinking about doing something, rather than getting on and doing it (consider the huge amounts of boilerplate involved in a COBOL or J2EE program). Sure, tooling can help, but that really is an elastoplast over a gaping wound.
The importance of data, and the importance of understanding what data is (and isn’t) and how malleable it can be, always seems to be underplayed in favour of “programming”. Any serious program worth the writing is going to deal with non-trivial data, so why not spend at least as much time learning about that?
One last thing: LISP is a really bad idea as the “canonical functional language”; for a start it generally has updateable store (bad bad bad!) and, as you note, it suffers dialect explosion. Haskell really would be a better bet – and, since it’s effectively syntactic sugar for Church’s lambda calculus of the 1930s, it trumps LISP on seniority too ;)
Thanks for a great article. Children of the 70s and 80s! Whoooo!
@Walter Aprile “having learned BASIC (in my case, two BASICs: Applesoft BASIC and Commodore BASIC)”
Whoa, me too! Apple ][+ and C64. Ahh, the memories…
@kikito – “If it had GOTO it was a bad language to teach.”
Wow. Just… wow. /facepalm
Pingback: Testing is not a substitute for thinking (binary search part 3) « The Reinvigorated Programmer
Some of us are showing our age… :)
Yes, I, too, cut my programming teeth on BASIC back in the ’80s on an Apple II.
Re: Mark Muller’s comment – “Lets face it, programming is just a matter of putting the comma in the right place.” – my experience has been missing semi-colons… ;)
Thank Matz for Ruby!
My exposure to BASIC was the worst thing that could have happened to me. It took many years to shake out the bad habits I learned. The best thing to happen to the BASIC language was the removal of line numbers. Renumbering and rearranging the order of routines, in order to speed up the interpreted ones, was possibly the most irritating thing we had to do.
BBC BASIC V is my favourite version. The whole interpreter could be held in the StrongARM’s CPU cache, which made it blindingly fast.
The inline assembler in all versions of BBC BASIC on the Acorn machines was phenomenal.
I’ve been having fun lately learning the “GW” BASIC that came with the first IBM PC, using original documentation and an interpreter that are readily available online. Admittedly, it’s a little more challenging than the VB6 that I started on in 1998. But it’s quite an improvement, IMO, over the Commodore 64 BASIC that I’m just now researching. Still, if I want anything at all, like a pop-up message box or a drop-down menu, I have to reinvent that wheel, which is interesting at first but could get old after a while. So I can understand the value of making libraries of reusable code that can be called with just a (descriptive) word and a few easy-to-understand arguments. And if it was 1984 all over again and I needed to write some serious code, I would think about coming up with something like QBASIC/QuickBASIC sooner rather than later. In studying both, and the function of LINE INPUT, I reckon it would’ve been possible to build a QBASIC-like interface and syntax (complete without any line numbers or GOTOs) right on top of the GW-BASIC interpreter! (At least v3.24.) I’m still looking into it, but I’m not so sure this could have been done with Commodore’s brand of BASIC.

As it applies to this discussion, I feel as though I’m learning a lot from this exercise, and can relate to the enjoyment of knowing more about the machine and its processes, and looking under the hood of today’s language hot rods. People should understand that a DO LOOP is a way of articulating a naked GOTO like:
10 '
.
.
100 GOTO 10
…where the various conditional methods of exiting this infinite loop are what give us our various WHILE/WEND, DO WHILE, DO UNTIL, LOOP WHILE and LOOP UNTIL constructs. Likewise, SELECT CASE is a mask over multiple IF-THEN statements… and on and on. The way I see it, if you can make money in the stock market with your language, it’s a good language. And we’ve been able to do that since at least QBASIC, if not before.
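To make that equivalence concrete, here is a minimal QBASIC-flavoured sketch (the variable and the count are invented for illustration): the structured loop and the numbered GOTO loop below behave identically.

' Structured form: the exit condition is part of the loop syntax
N = 5
DO
  PRINT N
  N = N - 1
LOOP UNTIL N = 0

' The naked GOTO it articulates, GW-BASIC style
10 N = 5
20 PRINT N
30 N = N - 1
40 IF N <> 0 THEN GOTO 20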
I absolutely adored the Creative Computing books when I first started developing, and all the code was in BASIC. I don’t think I would be a programmer today if I had started on something like Prolog or C. BASIC has that magic ease-of-first-use and the games in the Creative Computing books gave me plenty of reasons to keep going with it.
After getting a BS in CS and spending the last 7 years doing professional software development, I can’t really remember how to write in BASIC (at least not without a reference), but I still remember it fondly from my childhood and I don’t think it crippled me at all.
But for B.A.S.I.C. I probably never would have started programming. In around 1978 I got given an 8-inch floppy diskette with “Microsoft BASIC” on it and never looked back. If that had been a C compiler, I would have just carried on being a technical translator. By the way, VB.Net sucks. Only classic Visual Basic deserves praise. What a shame that Microsoft in its unwisdom dumped it.
Pingback: What is “simplicity” in programming?, redux | The Reinvigorated Programmer
Pingback: Langs: рассуждения « evetro
Heck, I learned to program in FORTRAN, before BASIC was invented. What a relief it was when I discovered C. But then OOP came along to set us all back 20 years.
Who cares what Dijkstra said? Mathematicians are mostly unable to think in a psychological way, which is why they are indeed mostly bad teachers. BASIC was developed intensively over the course of time and is therefore today one of the most common programming languages of all. There’s nothing you couldn’t do with it!
In practice, you can’t do everything in Basic. This drove me to assembler, and that’s when I really started to learn something (I started poking machine code, and then wrote an assembler / loader in Basic because an assembler wasn’t available for the machine I was using.)
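For readers who never did this: “poking machine code” from BASIC looked roughly like the Commodore 64-style loader below. This is a minimal sketch rather than the commenter’s actual program; the six DATA bytes happen to be a real 6502 routine (LDA #0 / STA $D020 / RTS) that sets the C64’s screen border to black.

10 REM READ MACHINE CODE FROM DATA, POKE IT INTO FREE RAM AT $C000
20 FOR I = 0 TO 5
30 READ B: POKE 49152 + I, B
40 NEXT I
50 SYS 49152: REM JUMP INTO THE ROUTINE WE JUST POKED
60 DATA 169, 0, 141, 32, 208, 96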
Ada gets casually abused above; it’s sort of frustrating to work in a language that’s pretty much a feature-superset of Java, that has explicit bitwise conversions between types, has assembly inclusions mentioned in the Standard as well as a standardized C interface, and have it treated as if it locked you in with no chance of escape.
More on topic, Ada doesn’t have a continue statement, on the grounds that you may as well use a goto statement there. And in general I find the attacks the goto statement gets are overly influenced by FORTRAN and BASIC, languages where it was the primary control structure, targeting a numeric label, rather than by languages where it’s used alongside more powerful control statements. With a text label, it can be a lot clearer about what it’s doing than the break or continue statement.
Strong agreement on GOTO. It’s more explicit than BREAK or CONTINUE. For simple uses, those two commands are better, but whenever you need to do something a bit out of the ordinary, such as breaking out of two levels at once, a GOTO is much clearer than the sort of messing about with state variables that you otherwise find yourself doing.
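A QBASIC-style sketch of that “break two levels at once” case (the grid search and all names are invented for the example): a single GOTO to a label replaces the found-flag juggling that purely structured code would need.

' Search a grid; bail out of both loops as soon as we find the target
DIM Grid(10, 10)
Grid(4, 7) = 1                ' plant a value to find
FOR I = 1 TO 10
  FOR J = 1 TO 10
    IF Grid(I, J) = 1 THEN GOTO Found
  NEXT J
NEXT I
PRINT "Not found"
END

Found:
PRINT "Found at row"; I; ", column"; J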
Most of you are idiots. Why compare old BASICs from the 80s, with their GOTOs, to modern languages like C or C# or Java etc.? There are modern BASICs now, just as capable of doing anything any other language can do, just as ‘scalable’ as required. There are no ‘limitations’ to modern BASICs except mindless bias, probably because the language has the word BASIC in its title; but modern BASICs let you skip pseudo-coding because they are much closer to written human language than other so-called ‘professional’ languages. Nothing stops anyone from writing as large a program as they wish, in a structured way, with modern BASIC dialects. Modern BASICs ARE as good as, or better than, a lot of currently popular go-to languages.
Dijkstra’s complaint was made in 1975; I’m guessing he was comparing it to ALGOL 60, ALGOL 68, Pascal and PL/I, all of which I would rather program in than a BASIC from 1975. The first edition of The C Programming Language was published in 1978, and the Ada standard and compilers for it appeared in 1983. This discussion has mostly involved comparing programming languages of a comparable era.
Your argument for modern Basics has completely failed to sway me, but I think that’s an argument for a different time and place.
Dijkstra’s complaint might have been made in ’75, but it is still used by mindless sheep like yourself today as a reason not to use BASIC. Not many like to mention that better older BASICs like BBC BASIC, Locomotive BASIC, Enterprise BASIC, ST BASIC and AmigaBASIC all had constructs that did not require the use of GOTOs at all (REPEAT, DO, WHILE etc.), and that spaghetti code was practised by over-enthusiastic individuals NOT because they were using BASIC, but because they didn’t know any better. But like all biased sheep, the people putting the language down conveniently ignore these facts, and dismiss a language closer to English than curly braces or invisible spaces. Isn’t the purpose of a programming language to create a workable interface between man and machine? So why choose languages full of strange symbols over one that reads like English? Probably because those who understand the cryptic symbols get their egos super-inflated.
There’s a touch of hubris in claiming to know why I or any of the other posters might reject modern Basics. Personally, there’s no modern Basic that has good Unix support including excellent C or Java interfacing, which was one of my requirements for a new language. There were several others, but again, this is not my blog.
In any case, if I’m looking to skip the strange symbols, I’ll use COBOL — or its object orientated extension, ADD ONE TO COBOL GIVING COBOL — and avoid all those funky +=* symbols that infest BASIC. There’s a lot of deep thought in programming; realizing that { and BEGIN are the same thing is hardly the hard part.
I suspect part of the support for BASIC is nostalgia. BASIC on the Apple II was my first experience, after a few lessons during maths in year 7. Simple programs, but the fact that I could instruct this machine appealed to me, and I wanted to do more. So I learned on a quick succession of machines over a few years: a VZ200, a Vic20 and a C64. While this was the early 90s, and the C64 was just a second-hand one we could afford, having the manuals, which taught BASIC, and being able to input and save programs directly to cassette made it fun. I borrowed books from schools with listings and tutorials and wrote some basic programs.

I wanted to be able to do what the software I ran did, or at least be able to get into the guts of the machine. Not really possible in BASIC, not easily. I knew that there was a way to do more, and wanted to learn assembly/machine code, but had no means to do so. I even began to write a program to interpret a byte code I made up instead, to allow for self-modifying code and more complex control mechanisms. (I had never heard of byte code at this time.)

BASIC was too limiting. No direct access to graphics, interpreted, and hard to maintain: keeping track of GOTOs, keeping track of logic. Editing and reorganising code was a pain. QBasic was better, allowing you to edit code as a text file, copy and paste.
It wasn’t until I was lent a 386 that I was able to really try assembler, and I never went back to BASIC. Assembler taught me a lot about how the computer really works, about managing routines and functions, and about logic flow. It was also kind of tedious.
After a hiatus of a few years due to uni, I used BASIC once, for a simple population-modelling program for Biology.
I then moved to C after uni, and that was a godsend. A language with the power of assembly, but the ability to write human-readable code that could be reused, code that ran well. It was the best of both worlds, and I greatly expanded my abilities. I’ve dabbled in Fortran, and am learning C++.
So here is another that followed the BASIC->Assembler->C path.
BASIC holds many good memories: the joy of being able to control that box that plugs into the telly, learning of all the possibilities, the challenges, the exploration. The computers were fun, invited exploration and intellectual inquisition, and were open, inviting you to take part in the craft of programming. These days, they just appear to be consumer products, designed to be closed. I still hold that a computer you can’t program is worthless.
But BASIC is like your first lover: memorable because it’s the first, not necessarily because it was the best, or even good. Many good programmers may have started with BASIC by simple virtue of the fact that our/your generation (maybe I’m too young to count) grew up when microcomputers had BASIC. That is, good programmers know BASIC simply because circumstance meant that BASIC was likely the first language we would use.
But I remember the warts, which many may have forgotten: the inevitable spaghetti code, the difficulty of maintaining large programs and keeping track of variables and GOTOs, the tediousness of refactoring, of editing code in the interpreter. No copy and paste.
Yes, I have fond memories, but I’m glad I don’t have to use it. QuickBASIC may be OK for simple stuff, really simple, but that’s it.
Yes, I have fond memories, but I’m glad I don’t have to use it.
You completely missed that the whole point of “the value of BASIC as a first programming language” is that it’s a good, simple, first programming language.
Actually, the point of my original post was even less complimentary to BASIC: it’s a good first programming language because it’s bad, and its flaws force you to learn skills that come in useful when you later move on to good languages.
Pingback: Completely Reliable Conclusions | Ragnarok Connection
APL was better – my first language. Forget loops & think arrays & dimensions.
APL is certainly APpeaLing — I did a short course in it back in the day, and found it fascinating. But it effectively ruled itself out of the programming-language ecosystem by requiring a special keyboard — an idiot move, really. I wonder how well it could have done if it had restricted its syntax to ASCII?
if it had restricted its syntax to ASCII?
IBM (where APL was developed) didn’t use ASCII.
J is an all-ASCII version of APL. J and/or APL are on my list to study some, but either way, I think a language that terse is just scary. Perl is pretty terse, but TIMTOWTDI and it’s still one of the big complaints about Perl.
ASCII, EBCDIC, whatever. Even at IBM, switching to APL meant having to switch the printer ball to one with the right characters.
APL, as implemented, had several fatal flaws…
– variables defaulted to global
– no structured constructs – loops, cases
– enclosed stuff added
The special characters actually had advantages
– more visual, easier to understand, succinct
– people-language independent – no “do” or “if” or “perform”
ronjohn63, I’m using “ASCII” loosely here to mean “the usual set of characters that everyone has access to” rather than the specific character encoding. So, yeah, IBM’s use of EBCDIC falls under the rubric. The point is, no-one without a special APL computer could use APL. Whereas anyone could use Lisp, FORTRAN or C.
I’m using “ASCII” loosely here to mean
:(
This is a blog about computer programming, where words and acronyms have specific meanings.
You are right. I should have written “the ASCII repertoire”.
Nevertheless, it’s hard to imagine that anyone thought what I actually meant is that APL’s downfall arose from its selection of what mapping to use from code-points to characters.
“the ASCII repertoire”
That would mean “the various versions of the ASCII standard”.
Nevertheless, it’s hard to imagine that anyone thought …
Since your facts were so wrong, who knows what someone thought? For example, “What does ASCII have to do with a mathematical language invented before ASCII was invented, at a company that didn’t use ASCII? Am I misunderstanding something?”
Anyway, APL was purposefully created with that character set by a mathematician already familiar with those symbols, in the expectation that other mathematicians would also be familiar with them.
When someone says “the ASCII repertoire”, they mean the repertoire of the 1967 version of the ASCII standard. There’s a line between being precise and being obtuse.
You’ll note that ALGOL 60, LISP, FORTRAN and COBOL all predate ASCII and the last two and PL/I were used at IBM, but all of those map fairly neatly to ASCII. The APL character set alone shows little resemblance to any character set that was in use at the time of its creation, at IBM or elsewhere.
APL was purposefully created with that character set; the wisdom of that is what’s under question. It was created by Kenneth E. Iverson using the notation and symbols he invented; unless they are programmers who know APL, mathematicians will find APL as opaque as anyone else untrained in it.
Ye-es. I suppose it’s conceivable that someone might think that. I suppose people that prone to misunderstanding should be reading a different blog (or auditioning for the role of whoever is the Spock/Data equivalent in the next Star Trek reboot).
Yes. That was indeed the mistake that he made — I couldn’t have put it better myself.
…The thing people forget is that programming was substantially different in 1975. Dijkstra railed against Dartmouth Basic—a glorified assembler language. It isn’t the BASIC used today…
http://programmingisterrible.com/post/40132515169/dijkstra-basic
http://en.wikipedia.org/wiki/Dartmouth_BASIC
Any programming language is better than none. Basic is far from perfect, but it has its strengths, especially in mathematics.
I can’t remember the computer languages I had to study in college, but I still remember Basic, even the Apple II commands I first used.
Pingback: How Coding, Music, and Writing Help You to Think Big - The Bioneer