My non-programmer friends and relatives sometimes ask me:
Why are there so many different computer languages?
My co-workers sometimes ask me:
Why are there so many reference books for obscure languages on your bookshelf?
The answer in both cases turns out to be the same: language matters. The language you use affects how you think about a problem. Psychologists and linguists (and politicians and salesmen) have known for years that this is true of human languages, and of course it turns out to be true of computer languages as well.
I was reminded of this fact just today, as I was explaining some concepts in Object-Oriented Programming to a co-worker who's just coming up to speed on C#, after using mostly C and Perl for several years.
My own first exposure to Object-Oriented Programming was back in the (very) early 90's, with Borland's and Microsoft's C++ compilers. I'm not sure I ever really "got" OOP in C++, and neither did most (all?) of the people I worked with. We mostly just used C++ as a better version of C.
Fortunately for me, I managed to snag a copy of Digitalk's Smalltalk for Windows not long after that. And suddenly, it all made sense! Because the Object-Oriented nature of Smalltalk is so "in your face", you can't help but start thinking about things in a new way. The version I got came with a pretty decent tutorial, which was also a big help.
I never actually produced any large-scale software in Smalltalk (it was impractical for work projects in terms of performance), but the new way of looking at things stuck with me as I continued to write low-level bit-banging code in C. When I came out to California and worked for NeXT, my Smalltalk experience translated, more or less directly, to Objective-C.
Anyway, back to the discussion with my co-worker. There are a couple of different ways of thinking about "using objects" that I'm familiar with, and I tend to think of them as "The C++ way" and "The Smalltalk way". In some sense, this isn't entirely a fair characterization, but in my experience, the two views are more-or-less endemic in their respective programmer communities, so that's how I think of them.
The C++ view:
An object is a combination of some data, and functions that operate on that data. To perform some action, you call a member function of the object instance that has the data you need to work on.
The Smalltalk view:
An object is a model of some part of your problem domain, and each object has its own set of responsibilities and capabilities. To cause something to happen in your program, you send a message to an object, which may in turn send messages to other objects.
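To make the contrast concrete, here's a tiny C++ sketch. (The Account class and all of its names are made up purely for illustration.) Try reading it once each way:

    #include <stdexcept>

    class Account {
    public:
        explicit Account(double opening) : balance_(opening) {}

        // Capability: accept a deposit. (Or, if you prefer: "a member
        // function that mutates the data member".)
        void deposit(double amount) {
            if (amount <= 0.0) throw std::invalid_argument("deposit must be positive");
            balance_ += amount;
        }

        // Capability: report the balance. (Or: "an accessor for the data".)
        double balance() const { return balance_; }

    private:
        double balance_;  // the data the C++ view focuses on
    };

Nothing in the code forces either reading. The difference is entirely in how you think about account.deposit(100.0): a function call on a struct, or a message asking the account to do something.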
Now, it turns out that these two definitions are actually equivalent, or at least compatible, despite the fact that the C++ definition is entirely focused on implementation details, and the Smalltalk definition is entirely focused on semantics.
When you get right down to the low-level implementation details, "sending a message" and "calling a member function" are really the same thing. A couple of CPU registers get loaded with a couple of addresses, and then you jump to a subroutine. Yes, I know it's more complicated than that in the real world, where you've got vtables, dynamically-compiled methods, etc. Work with me here...
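For the skeptical, here's that equivalence hand-rolled as a deliberately simplified sketch (real compilers differ in the details, and the shape/circle names are mine):

    #include <cstdio>

    // A hand-built "vtable": just a struct of function pointers.
    struct ShapeVTable {
        double (*area)(const void* self);
    };

    // Every instance carries a pointer to its class's table, plus its data.
    struct Circle {
        const ShapeVTable* vtable;
        double radius;
    };

    static double circle_area(const void* self) {
        const Circle* c = static_cast<const Circle*>(self);
        return 3.141592653589793 * c->radius * c->radius;
    }

    static const ShapeVTable circle_vtable = { circle_area };

    int main() {
        Circle c = { &circle_vtable, 2.0 };
        // "Send the area message" / "call the member function":
        // load the object's address, look up the function, jump to it.
        std::printf("area = %f\n", c.vtable->area(&c));
    }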
The C++ programmers that I've met usually come to C++ from a background writing software in C. Because C is such a low-level language, it encourages (or maybe I should say requires) you to understand the low-level details of how stuff works under the hood - memory management, how structures are laid out in memory, that sort of thing. When these folks start using C++, they apply the same low-level filter to things, and they see a class as just a data structure with some functions attached to it. That's technically true, but it kind of misses the point.
As I was trying to explain to my co-worker what I didn't like about some code of his that I was reviewing, I ran into a bit of a wall. I knew that something wasn't quite right, but I wasn't able to articulate the problems well enough. I think I finally figured out that it was at least partly a result of the difference in perspective due to our different backgrounds. Once I figured that out, I was able to take the discussion out of the tactical questions like "should I use a Regular Expression here?", and into the more theoretical territory of "is this an appropriate way to model the problem you're trying to solve?", and I think we made better progress after that.
It'll be interesting to see whether my co-worker has the same kind of epiphany that I did, or whether he'll pick stuff up more gradually. Given that we're mostly using C# and C++ at ZING, I suspect it'll be the latter. You can write code in C# that looks just like C, with just enough "classiness" wrapped around it to compile. It would probably be easier for most folks to learn a new concept in a language where their old habits are obviously not going to do the trick.
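Here's the kind of thing I mean, sketched in C++ (the C# version looks nearly identical, a made-up example either way):

    #include <cstdio>

    // C code with just enough "classiness" wrapped around it to compile:
    // no state, no instances, just functions that happen to live in a class.
    class TextUtils {
    public:
        static int CountWords(const char* s) {
            int count = 0;
            bool inWord = false;
            for (; *s != '\0'; ++s) {
                if (*s == ' ' || *s == '\t' || *s == '\n') {
                    inWord = false;
                } else if (!inWord) {
                    inWord = true;
                    ++count;
                }
            }
            return count;
        }
    };

    int main() {
        std::printf("%d\n", TextUtils::CountWords("this is just C, really"));
    }

All the old C habits compile just fine; nothing about the syntax pushes you toward actually modeling your problem with objects.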
Which reminds me of something else I wanted to write...
4 comments:
The first day of 6.001 (Introduction to Computer Systems and Programming - at MIT, it's the very first CS/EE course you take) was all about how language matters, but not in an algorithmic way. Languages are syntax; algorithms are semantics. Figure out what you want to do, then express it in whatever language you wish/have available/know/is best. This idea gets force-fed to you, since every computer class I ever took there used a new language - often one that most students won't have seen before, e.g. Scheme, CLU, VMSScript, ...
Some languages have optimized (speed, size, clarity, etc.) syntax for certain semantics, but they're all capable of implementing your algorithms.
Put more succinctly, you program into a language, not in a language.
I guess my argument boils down to "language is more than just syntax". It's certainly theoretically true that any two Turing-complete languages are equivalent in terms of what problems they can be used to solve.
In practical terms, though, you'll often have to jump through so many hoops to get C to act like Lisp (or worse, Prolog) that the exercise becomes "first you write an interpreter (or compiler) for the language you wish you had, then you write the program that actually solves the problem you wanted to solve in the first place".
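Here's one small, concrete flavor of those hoops (a toy sketch of my own, written as C++ to match the examples above): to get even a single Lisp-style closure in C-style code, you hand-build the environment the language would have captured for you.

    #include <cstdio>

    // What Lisp writes as (lambda (x) (+ x n)) becomes an explicit
    // pair: a code pointer plus a hand-captured environment.
    struct Closure {
        int (*fn)(void* env, int x);
        void* env;
    };

    static int add_n(void* env, int x) {
        return x + *static_cast<int*>(env);
    }

    int main() {
        int captured = 5;                            // the "environment", captured by hand
        Closure add5 = { add_n, &captured };
        std::printf("%d\n", add5.fn(add5.env, 37));  // prints 42
    }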
I'd bet that it wasn't just random chance, or even instructor preference, that led to the use of different languages in different courses at MIT. Exposure to different "toolkits" in the form of very different languages is in itself a valuable learning experience.
This is one of the things that distinguishes a higher-quality Comp. Sci. program, in my opinion. If a school has standardized on Java (or whatever) for all undergraduate classes, they're doing students a real disservice.
It definitely was deliberate that MIT exposes you to so many languages. Each one is chosen for the syntax features that emphasize the overall theme of the course: Scheme for its direct expression of the lambda calculus, CLU for its object-oriented nature (although they didn't call it that at the time, that's what it is), etc. The secondary point is that there are two steps to writing code - first figure out what you want to do, then worry about how to do it. It is almost always a bad idea to start writing code before you've wrapped your head around its purpose. (The looser corollary is that well-known lesson from The Mythical Man-Month: don't let the engineers loose without a spec.)
To sum up, my feeling is that language is more than just syntax at the operational level - the choice of language dictates many operational concerns (picking Java means a Java runtime but not necessarily a hardware platform, while picking Lisp or Objective-C limits your hardware choices). At the theoretical level, languages are only more than syntax in the sense that syntax always influences semantics; how you say something can make it mean different things, and the words you pick can shape the idea behind them.
Warning: this is advertising my own work.
Just because two languages are equivalent to Turing machines doesn't mean they are equally expressive. [See "On the expressive power of programming languages", which I published in Sci Prog some 15 years ago.] If you restrict the power of the translation, it is easy to see that a language with feature X can be more expressive than the plain language. Not having X in the plain language forces you then to program in complex patterns and to maintain invariants on your own (something that compilers are much better at than people). Examples of X in an OOP language: assignment. Non-examples: nested blocks.
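A quick illustration of the kind of thing the paper means (my own toy example, not one from the paper): without assignment, you have to thread state through every call by hand, and the "current value" invariant is yours to maintain; with assignment, the language maintains it for you.

    #include <cstdio>

    // Without assignment: the new value must be returned, and every
    // caller must remember to use it.
    static int bump(int counter) { return counter + 1; }

    // With assignment: the language itself keeps the single current value.
    struct Counter {
        int value = 0;
        void bump() { value += 1; }
    };

    int main() {
        int threaded = bump(bump(0));   // state threaded explicitly through calls

        Counter counter;
        counter.bump();
        counter.bump();

        std::printf("%d %d\n", threaded, counter.value);  // 2 2
    }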