Just a quick post to let you all know that today is Document Freedom Day. I know that in this day and age it’s easy for anyone to pronounce that any day is Anything Day, but this one is important. It’s about encouraging people to use, and maybe more importantly, accept, open document formats.
I can’t tell you how much it pains me that MS-Word format still overwhelmingly dominates in most arenas. People find themselves in the absurd situation where they want to apply for a job in open-source software, and have to send their CV in a proprietary format in order to get it looked at. Or they want to write to their MP about a freedom-of-information issue and have to send the letter in a proprietary format.
This is an immediate followup, about twelve hours after I posted the original What is “simplicity” in programming?, because the excellent comments on that post have pointed me to another insight. In particular, Chris pointed out that “Ideally, you would not have to read the code of all the methods as the name should tell you by himself what is it doing.”
I think Chris has put his finger on an area where individual temperament, preference and aptitude is very important. Probably, skipping over the small methods is the right thing for at least some programs — the ones that have been extensively Fowlered. But it doesn’t come naturally to me at all. It makes me nervous. I actually feel physically uncomfortable about working with code that I’ve not read. So maybe that’s why I am happier with one larger method than several smaller ones sprinkled across various classes and files. (To be clear, I am not advocating big functions and classes — no-one wants to read 300 lines in a single method, or classes with 50 methods. I’m in favour of biggER functions and classes than Martin Fowler is, that’s all.)
We all agree that we want our programs to be simple: in the memorable words attributed to Einstein, “as simple as possible, but no simpler”. Kernighan and Pike’s most recent book, 1999’s The Practice of Programming [amazon.com, amazon.co.uk], has three key words written on a whiteboard on its cover: Simplicity, Clarity, Generality; and the greatest of these is Simplicity.
But what is “simple”?
When I started this blog, the idea was to have an outlet for everything I wanted to write about except sauropod vertebrae — the reason for that exception being that I already have a perfectly good blog that is entirely about sauropod vertebrae, which I encourage you to visit if you have any interest in ancient life.
But for one reason and another, I’ve blogged almost entirely about programming so far (except for one brief article on sushi). That’s not a bad thing, and it’s been fun for me to watch various themes spontaneously developing. But I do reserve the right to write about other things, and the film Percy Jackson and the Lightning Thief is exercising me greatly right now. I need to take that sucker down.
WARNING: many, many spoilers ahead.
This is a blog about being a reinvigorated programmer. So it’s ironic that the most successful articles so far (at least in terms of number of hits) have been about The Good Old Days — Whatever happened to programming? and Programming the Commodore 64 being two examples.
One possible response to this would be to change the blog title to The Nostalgic Programmer, but I’m not going to do that — despite what you might think from what I’ve been writing, I am actually looking forwards more than backwards, and there are plenty of things I am excited about right now, including Ruby, refactoring, REST, Rails and even some things that don’t begin with R. Lisp, for example (although I guess I could have squeezed that into the R-list by substituting “recursion-based languages” or somesuch).
Like most programmers, I suppose, I’m arrogant enough to think I’m pretty good (although see The Dunning-Kruger effect). But through the last thirty years’ programming, I’ve worked with plenty who I know are better than me. Today I want to talk about three of them, and about how very different their skills are; and I’ll finish up by thinking about what the rest of us can learn from them, and what we can do to maximize our own abilities.
I agonized about whether to name them, but eventually concluded that I can say more about their work if I am specific; and what the heck, they might enjoy being held up as examples. They certainly deserve it! So step forward (in chronological order) Steve Sykes, Mike Selway and Adam Dickmeiss!
The recent series of Why XYZ Is Not My Favourite Programming Language articles has been fun to do, and it’s been great to see the discussion in the comments (even if it’s mostly people saying that I am talking a load of fetid dingo’s kidneys). But I don’t want to extend that series beyond the point of diminishing returns, and it’s time to think about what it all means. As duwanis commented on the Ruby article, “I’m a bit lost as to the point of these posts”; now I want to try to figure out just what, if anything, I was getting at.
Java has four different “kinds” of types. Up until Tiger, it had these three:
- Primitive types: longs, shorts, booleans, ints, whatever. They map more or less to machine types.
- Classes, which are the primary mechanism for extending the language.
- Arrays, which are this weird hybrid type that was introduced to make it easy to port C and C++ code to Java. But you can’t make arrays immutable, and you can’t subclass them, and there’s only modest syntactic support for them, and reflecting on them is painful.
The problem, of course, with having different types of types is that all of the code you write, now and forever, has to be prepared to deal with each type-type differently. So you get interfaces like this lovely one. Scroll down to the Method Summary, and looky looky, there are 76 actual methods in the class, but only 10 distinct pieces of functionality.
Just to make things worse, Java 5 added a new kind of type, called an enum. They’re sort of like C enumerations, but they can also have methods. Which makes them sort of like Java classes, but you can’t instantiate them, and you can’t have one enum type inherit from another enum type, so you can’t have polymorphic methods, nor can you declare an enumeration hierarchy. So they’re not really like classes at all.
— Steve Yegge, The Next Big Thing (lightly edited for brevity).
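To make the quote’s point about enums concrete, here is a minimal sketch. The Planet example and all its names are my own illustration, not anything from Yegge’s text: it shows an enum carrying fields, a constructor and a method (class-like), while remaining impossible to subclass.

```java
// Illustrative sketch (names are mine): a Java 5 enum can carry
// fields, a constructor and methods, which is what makes it
// "sort of like" a class.
enum Planet {
    MERCURY(3.30e23, 2.44e6),
    EARTH(5.97e24, 6.37e6);

    private final double mass;    // kg
    private final double radius;  // m

    Planet(double mass, double radius) {
        this.mass = mass;
        this.radius = radius;
    }

    double surfaceGravity() {
        return 6.674e-11 * mass / (radius * radius);
    }
}

// But an enum is implicitly final: `class Dwarf extends Planet { }`
// will not compile, so there is no inheritance between enum types
// and no polymorphism across them, just as the quote says.
public class EnumDemo {
    public static void main(String[] args) {
        System.out.printf("%.2f%n", Planet.EARTH.surfaceGravity());
    }
}
```

So each enum constant is an instance the compiler creates for you; you can call methods on it, but you can never `new` one yourself or extend the type.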
There’s one wrinkle when while and until are used as statement modifiers. If the statement they are modifying is a begin/end block, the code in the block will always execute at least one time, regardless of the value of the boolean expression.
— David Thomas and Andrew Hunt, Programming Ruby: The Pragmatic Programmer’s Guide, section Expressions.
And the proof:
$ cat crud.rb
print "Hello\n" while false
begin
  print "Hello\n"
end while false
$ ruby crud.rb
Hello
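The analogy that helps me here (mine, not the Pickaxe’s) is that Ruby’s begin/end-with-modifier behaves like the do/while loop of the C family: the body runs once before the condition is ever tested. A minimal Java sketch of that “at least once” behaviour:

```java
public class DoWhileDemo {
    // Runs a do/while whose condition is false from the start;
    // the body still executes exactly once.
    static int runOnce() {
        int count = 0;
        do {
            count++;   // executes before the test, just like begin/end
        } while (false);
        return count;
    }

    public static void main(String[] args) {
        System.out.println(runOnce());  // prints 1
    }
}
```

The plain statement-modifier form, by contrast, corresponds to an ordinary while loop, which tests its condition before the first iteration and so may run zero times.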