I really like how Python lets me start to get things working before everything is working. I can fire up an interactive debugger and immediately start playing with some library I Googled up and think I might need, quickly get it doing stuff, plug it into other code, and soon have the whole thing doing useful stuff.
I can get my Python program into a useful state before I have really decided what I want it to do, and well before I have stopped to think hard about the best way to do it.
This kind of exploratory programming is exactly what is needed to develop a prototype. But never “ship” the prototype!
Here is an analogy to the physical world: there are prototyping materials that are easy to work with but are neither as durable nor as economical as materials suited to real manufacturing. For an extreme example, automobile bodies used to be prototyped, at least in part, in modeling clay. And the very properties that make modeling clay good for prototyping make it terrible for manufacturing. (Imagine driving the clay prototype out to buy a Christmas tree and strapping it on the roof.)
Similarly, in the case of Python, the key property that makes it good for prototyping also makes it terrible for “real” programs. Probably the biggest thing that makes Python powerful is precisely that it allows the programmer to defer so many decisions. What kind of parameter does the function take? “A parameter called X!” Not very useful. Even if the parameter is called something like “address_list”, that only hints: it might not actually be a list, maybe the address “list” is in a dictionary and the keys are customer numbers. (Likely.) And even if we really, honestly know the address_list is a Python list, okay…a list of what? Let’s guess dictionaries, Python loves dictionaries. And what will be in each dictionary? Whatever anyone anywhere else in the code might manage to put in there, or remove from there. And it gets worse: some programmers think it is cool to put “**kwargs” in the parameters, which means we don’t even know what the parameters to the function are! We have to examine every line of code that might call this function to see what the possible parameters are, and even then you will see (you just know it) that some of that code is going to be passing a dictionary that is only known at runtime.
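Here is a little sketch of what I mean. The function and its names are hypothetical, invented for illustration; the signature tells us almost nothing about what the function actually accepts.

```python
def send_mailing(address_list, **kwargs):
    # Is address_list a list? A dict keyed by customer number? The name only
    # hints. And **kwargs hides the rest of the interface entirely.
    for address in address_list:
        print("mailing:", address, "options:", kwargs)

# One caller passes a real list, plus options assembled at runtime:
options = {"dry_run": True}
send_mailing(["12 Elm St", "9 Oak Ave"], **options)

# Another caller passes a dict keyed by customer number. It "works": the
# loop silently iterates over the keys instead of the addresses.
send_mailing({1001: "12 Elm St", 1002: "9 Oak Ave"})
```

Notice that the second call doesn’t even crash; it just quietly does the wrong thing.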
The fact that the programmer doesn’t have to decide what s/he is doing can give the impression that real programming is happening really fast, but it is an illusion. A dangerous and beguiling illusion. Worse, of course, is when such dynamic features are actively abused (see kwargs), but even merely deciding to use a simple list, with no good way to pin down what is in it, leaves such a rich place to hide bugs.
Strongly Typed Language: Python
There is this idea that compiled languages such as C (or C++, which has all the weaknesses of C without the virtue of being a small and elegant language) are strongly typed, but that an interpreted language such as Python is not. This is half-right.
In C you have to say what kind of data goes into your variable.
In Python you can put whatever you want in your variable: a string, a boolean, some kind of number, some enormous data structure, a function, or None. Not only can you put whatever you want in there, you can change it at your whim; in one line you might set your variable to an instance of some class, and a couple of lines later (or in a different thread, if it gets a chance to run) set it to 42. Python is very liberal about such things.
But this doesn’t mean Python isn’t strongly typed! It is very strongly typed; it just doesn’t make up its mind about types until the last possible moment, at runtime. Repeatedly. Every time through your loop.
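A tiny sketch of both halves of that, using hypothetical names and nothing clever:

```python
class Account:
    pass

x = Account()   # x holds a class instance...
x = 42          # ...and a moment later, an int. Python does not object.

# Strong typing shows up only when an operation actually runs:
try:
    x + "1"
except TypeError as err:
    print(err)  # unsupported operand type(s) for +: 'int' and 'str'
```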
In fact, Python does almost nothing but constantly check the types of things. It takes much longer to check the types of two variables before adding them than it takes to actually add them: to check whether they are numbers, whether adding them is sensible, and how to add these particular numbers, assuming they proved to be numbers. Python needs to check a lot before it can do the addition.
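You can see this for yourself by disassembling a trivial function with the standard dis module. The exact instruction names vary by CPython version, but the shape is the same.

```python
import dis

def add(a, b):
    return a + b

dis.dis(add)
# The "+" compiles to one generic instruction (BINARY_ADD, or BINARY_OP in
# newer CPythons). The interpreter must look at the runtime types of a and b
# on every single call before it can decide how, or whether, to add them.
```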
Deferred Work Doesn’t Go Away
It is presumably important to you that when the Python code runs it not crash. One would think. In which case, that clever thing of holding a class instance in a variable one moment and doing arithmetic on 42 the next had better be done right, because trying those operations in the wrong order will not work. Even doing unclever things, such as misspelling a variable name and accidentally doing arithmetic on a class definition with a similar name, is a bad idea.
And though Python will catch both of these mistakes if you make them, it will only do so if you exercise the right lines of code with the right (unfortunate) values. And only if the right person is watching in the right way will it do any good.
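Here is a sketch of how such a bug hides. The names are hypothetical and the typo is deliberate: Account (the class) where account (the parameter) was meant.

```python
class Account:
    fee = 2

def adjust(balance, account=None):
    if account is None:
        # Typo: Account (the class) instead of account (the parameter).
        # Nothing complains until somebody actually takes this branch.
        return balance + Account
    return balance + account.fee

print(adjust(100, Account()))   # 102; every test that passes an account is happy
# adjust(100)                   # TypeError, but only when this line finally runs
```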
It is really hard to thoroughly exercise code. And in the case of a very dynamic language like Python, the permutations are so great that it really isn’t possible.
Yes, Compilers are Annoying
In statically typed, compiled languages it is more work to make the compiler happy, but the benefit is that the compiler will prevent these sorts of errors. It is less work in total to catch a type problem up front than to chase it later in the debugger and in vague bug reports from users. Unless, that is, you are planning to defer some of the work forever, planning on never finding and fixing some of the bugs…
Yes, Compilers are Inflexible
Yes. And in a good way, if it prevents accidentally doing arithmetic on a class definition.
But what about cases where one needs to be clever? Maybe not so clever as to mess with class definitions at runtime, but something more conventional, such as wanting either a value like 42 or some flag value (such as Python’s None). Isn’t that reasonable?
Yes. And compiled languages allow such things. Some in safe ways, even.
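For contrast, here is the everyday Python version of that “42 or a flag value” pattern (a hypothetical function of my own invention): nothing makes the caller handle the None case, so the mistake only surfaces at runtime. In a language like Rust, the equivalent Option type forces the caller to deal with the “no value” case before the program will even compile.

```python
def lookup_discount(customer_id):
    """Return a discount like 42, or None as the 'no discount' flag value."""
    discounts = {1001: 42}
    return discounts.get(customer_id)

price = 100 - lookup_discount(1001)      # fine: 58
# price = 100 - lookup_discount(9999)    # TypeError at runtime: int minus None
print(price)
```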
(Some Compilers are Nice)
The Rust compiler is demanding, but in exchange lots of bugs simply won’t exist once the compiler is happy.
Rust: not as slow as Python without being as low-level as C.
Prototypes are Expensive to Operate
I would like to see some hard numbers, but it feels to me like Python must spend a hundred times as much effort constantly checking the runtime type of every bit of data as it does doing real work on that data. Certainly Python is not very efficient, whatever the ratio. How much carbon is released just because of Python?
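For anyone who wants to start collecting those numbers, a rough micro-benchmark is easy to write. Results vary wildly by machine and CPython version, so treat this as a sketch, not a measurement.

```python
import timeit

n = 10_000_000
t = timeit.timeit("a + b", setup="a = 1; b = 2", number=n)
print(f"{t / n * 1e9:.1f} ns per '+' (dispatch, type checks, and the add itself)")
```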
-kb, the Kent who is looking for opportunities to finally get good at Rust.
P.S. Comments are broken and have been for some time. Sorry.
©2020 Kent Borg