In his History of Python: Introduction and Overview, Guido van Rossum repeats the confusion between “scripting” and “dynamic” languages:
Python is currently one of the most popular dynamic programming languages, along with Perl, Tcl, PHP, and newcomer Ruby. Although it is often viewed as a “scripting” language, it is really a general purpose programming language along the lines of Lisp or Smalltalk (as are the others, by the way).
I think there’s a big difference between the two categories, and the reason they get confused is that both are usually high level languages with dynamic typing, automatic memory management, etc.
But the two types of languages are really different when it comes to how the programmer interacts with them, and in how they are implemented:
- Scripting languages read whole programs from outside the compiler and execute them immediately, usually with a minimal amount of processing.
- Dynamic languages include the compiler in their own environment, and programming is seen as adding or modifying that environment.
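To make the second point concrete, here is a minimal sketch in Python (my example, not from the post) of what "including the compiler in the environment" means: the running program can compile new source and fold the result into its own environment, rather than being processed once from the outside.

```python
# Python keeps its compiler available at runtime: a running program
# can compile new definitions and add them to its own environment.
source = """
def greet(name):
    return "Hello, " + name
"""

env = {}
# compile() and exec() run inside the live environment
exec(compile(source, "<dynamic>", "exec"), env)
print(env["greet"]("world"))  # prints "Hello, world"
```

A batch "scripting" workflow, by contrast, would read the whole program, run it, and exit, with no way for the program to grow its own environment mid-flight.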
For some reason, calling a language “scripting” is seen as somehow implying it’s a lower, less powerful language. That’s clearly not true. Nothing in the way Python works makes it a less-than-general purpose language. It simply doesn’t work the way Lisp or Smalltalk do.
One consequence of the NULL historical memory our trade has is that we get to see Worse is Better replayed from time to time. I’ve seen it on languages, editors, kernels, windowing systems…
This time it’s distributed version control systems, with discussions flaring up all over the Internet to decide which shall be The One DVCS to Rule Them All. And, once again, no surprises. The fast has won over the flexible, the cool has triumphed over the supportive, the macho has destroyed the elegant: Git’s won and Bazaar’s lost. Nothing new here, move along…
Wait a moment, though, as I’d like to make some suggestions to the major players in this epic battle that will not go into the programming annals, but only because we have none.
Git developers: Congratulations! You have created a good tool which, when you buy into its process, works superbly and (has anyone commented on this before?) is incredibly fast. Just be aware that not everyone works on the Linux kernel, and supporting different workflows is clearly an area for improvement.
Git supporters: You’ve won a battle, but not the war, and there’s much work to be done yet. Do not tolerate people using other tools, especially if they are constrained by their employers or customers. Keep proselytizing, in every venue, with your chants of “Faster, faster!”, because you know that’s the argument that always wins, that completely trumps any other. It’s also a good way of passing the time until the next battle, because the world is imperfect and there will always be other people, other projects to ridicule.
Bazaar developers: You made something better, and you should have known that’s, unfortunately, worse. But your ideas can still be useful. That great plugin you wrote? Reimplement it for git, using a mix of bash scripts, Perl and C. It should take you three to ten times longer, but now you know your cycles are worth a lot less than your CPU’s.
Bazaar supporters: Yes, all three of you. Sorry, but there’s no prize for fair play, if that was what you were aiming for. Learn Git, it’s not as difficult as it’s said to be, and you’re intelligent people. Welcome to the Better is Worse corner of the world, there’s lots of interesting (but sometimes smug) people here and you get to say “That’s been done before, and better” a lot.
Other DVCS devs and supporters: Yes, I know you exist and are used in some projects, and each one of you is like a precious flower, unique with its pros and cons. But Git and Bazaar are the two extremes here, so place yourselves on one side or the other depending on whether you think your tool of choice has won or lost.
So, Arc is out, in a very alpha form. Ignoring the whole debate over Unicode support, it seems that Arc is just a bunch of abbreviations and nice shortcuts on top of MzScheme.
Many people were expecting a lot more, but Paul Graham had a very difficult task. For, you see, Lisp as a language can’t be improved. Every time something is added to Lisp, the result is, well, Lisp. In other language families, evolution is usually based on adding features:
C + classes = early C++
Early C++ + templates + exceptions = modern C++
But in the Lisp language family, adding a feature to Lisp gives Lisp:
Lisp + classes = Lisp
Lisp + exceptions = Lisp
Lisp + anything = Lisp
Why is Lisp so different in this? Because of its syntax, of course. When programming in Lisp, there’s no syntactic difference between what’s already in the language and what has been added afterwards. It was possible to use classes in C; what C++ did was provide a syntax that made an object system the default in the language. But in Lisp, thanks to prefix notation, macros and code-is-data, you can add one or many object systems without having to change the language. Scheme has about one object system for every Scheme programmer, and they all program in Scheme.
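As an illustration of building an object system on top of a language rather than into it (a Python analogue of the Lisp idea, my sketch and not from the post), here is a toy object system made of plain closures and dicts; the host language needed no new syntax to support it:

```python
# A toy object system built from closures and dicts: "classes" and
# "method dispatch" are added as a library, not as language syntax.
def make_class(methods):
    def constructor(**state):
        obj = dict(state)          # instance state
        def send(selector, *args): # message send
            return methods[selector](obj, *args)
        return send
    return constructor

# A hypothetical Point "class" with two "methods"
Point = make_class({
    "x": lambda self: self["x"],
    "move": lambda self, dx: self.__setitem__("x", self["x"] + dx),
})

p = Point(x=1)
p("move", 4)
print(p("x"))  # prints 5
```

In Lisp, macros would additionally let you give this library a pleasant surface syntax, which is exactly why adding an object system there doesn’t change what the language is.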
What you can improve in Lisp programming is the “programming” part. Going back to the C family, Java was an improvement because it came with automatic memory management, making programming a lot better in many situations. Some Lisps come with debuggers, and I like those a lot more than the ones that don’t. It’s curious how much language designers forget about the programming part. They’d rather add keywords than debugging facilities.
Will programming in Arc be better than programming directly on MzScheme? Maybe a little, as the shortcuts make Arc code much more concise. But without improvements on the programming side, Arc is just another Lisp. For me, the best programming language, but not the best programming.
I have been a WTF programmer. Yes, I confess it. Although I bow to the Beautiful Code gods, and I spend my free time reading Lambda the Ultimate, there’s still this black spot on my resume. I am one of them. Today, as I read the 2007 best coding practices on WTF, I discovered myself relating to the poor, anonymous hack who used Trim() twice on a string just to make sure.
Languages that support both inheritance and nesting of declarations define method lookup to first climb up the inheritance hierarchy and then recurse up the lexical hierarchy. We discuss weaknesses of this approach, present alternatives, and illustrate a preferred semantics as implemented in Newspeak, a new language in the Smalltalk [GR83] family.
Gilad Bracha, On the Interaction of Method Lookup and Scope with Inheritance and Nesting (pdf)
“Ten programmers could not produce the same program to solve a problem because programming is an artful approach. I want to eliminate the art and make it an engineering approach,” said Morrison.
How many times has this been tried? Is it possible? What does it even mean? I’m certain that ten engineers faced with the same problem (say, a bridge) would produce different solutions, so are engineers some sort of artsy hippies that need engineering applied to them?
I wonder if it’s the same in other fields. Do mathematicians get things like “We should use an engineering approach to eliminate art from geometry theorems”?
Haskell’s type system will keep on evolving until it’s a static, completely specified, only available at compile-time implementation of Common Lisp.
For me, the most important sentence in the paper is the first one:
The ultimate goal of all computer science is the program. [..] Designers, programmers, engineers, we must all return to programming!
I also strongly agree with their view that computer science is sufficient for itself. We still have a lot of research to do in programming, and only a small part of it has anything to do with algorithms and type systems. The paper presents a convincing tale of how programs are really developed. How come there isn’t research on the role of copy-and-paste in programming? Why are we so fixated on formal proofs? There are a lot of other sciences we could draw knowledge from besides mathematics. I personally think it’s time we started looking at programming language linguistics. For example, is the Sapir-Whorf hypothesis applicable in programming?
But then, there’s the whole categorization stuff, and this is where the paper fails in my view. Design Patterns is post-modern? The way I see it, it’s an example of a grand narrative! Sun’s JVM is modern, but Microsoft’s CLR is post-modern, except for its code verifier, which is “the apparatus of power”? And isn’t this whole business of categorizing itself too modern? After all, “‘Tsall good”, isn’t it?
When I first read the paper I liked both its form and content, although in time I have become wary of the over-use of the word post-modern in many fields, especially in science. Sometimes post-modernism has been used to replace real research, or to avoid taking a stand for some theory. I wouldn’t like a repeat of the science wars in programming. But, then again, scientists are having a “war” because many of them are thinking about what science is. When will we start thinking about what programming is?
This is why I prefer to talk about descriptive languages when referring to the way programs are built with a programming language:
The Two Meanings of Declarative « Sententia cdsmithus
The issue here is what is meant by a declarative language. Haskell and SQL are both called declarative.