
40 Years of Programming ...

I had been shining my father's shoes for months to save for my first computer.  I also picked weeds growing in the yard around our house for 5 cents each; but I was only paid if the root of the nasty weed was more than 2 inches in length; yes, my father, the engineer, would measure!  Those chores, coupled with a newspaper route and other odd jobs, finally made it possible for me to buy my first micro-computer: a TRS-80 in all its 4K-of-RAM glory! Little did I know that, at the tender age of 13, I would be condemning myself to a lifetime sentence of computer programming.

As I write this, I am now in my mid-50s, having earned a couple of degrees and a pile of accreditations in a multitude of languages and platforms mastered… and forgotten.  I have worn many hats and badges - help desk operator, computer operator, programmer, systems analyst, systems manager, database administrator, instructor, and even the trendy "web developer" - but whatever hat or badge you currently wear, all will be found under the umbrella of computer science.

As the pixels on my screen seem increasingly blurry with each passing year, and the multitude of programs I have written over the years fade to magnetic dust, I thought it appropriate that I write my own epitaph and impart some thoughts and opinions which, I hope, may spark a few conversations for those of you who still have many fun-filled years ahead (that was sarcasm for debugging).

Fair warning - much of what I write (or rant) about in the coming paragraphs will seem trivial and may even seem to be non-applicable in today's "modern" computing scene, but be assured, you are all trapped in the same endless ACD (Add, Change and Delete) loop that is the programming world.

My kingdom for the lost days of "real" documentation…

The very first computer documentation I laid my hands on was a manual entitled "Level I BASIC", written for the TRS-80 by Dr. David A. Lien.   That manual shaped the rest of my programming days because it made learning easy - it was just fun, and everything was explained in simple, easy-to-remember terms.  Shortly thereafter, I dug deeper, learned about hexadecimal and binary, and began writing assembler code on 6502 and 8085 processors by plugging hex "instruction codes" into specific memory addresses; this was before I found (and could afford) an "editor" for assembly language.

Documentation back then came in the form of (gasp) paper manuals that explained in detail the syntax of every command.  Most things back then were done via a command line rather than the wonderful graphical user interfaces (or GUIs) of today. As a matter of fact, during my college years, the importance of documenting your code was pounded into us from day one.  The mentality was (and rightfully so) that someone other than yourself will eventually have to maintain the code you have written, and if they are unable to understand the logic you applied, changes to your code will likely result in a failure or a time-consuming rewrite.

In my early 20s, I was hired by the college I graduated from as a junior programmer.  The data center was a large building filled with Digital Equipment Corporation (DEC) VAX/VMS (Virtual Address eXtension / Virtual Memory System) mini-computers used to run the administrative and academic systems at the college.
Side Note / Observation:
When did buildings filled with computers (which used to be called "data centers"), become "clouds"? When did data become meta-data? Phooey!
DEC produced wonderful documentation, and the data center (or "cloud", whichever you prefer) had its own library of grey binders packed with detailed documentation for systems management, programming languages and libraries.  Back when "client-server" was all the rage, their manuals included everything a programmer would need.  Having clear, concise documentation, combined with simple yet practical examples, made programming life relatively easy, and rarely did I have to consult anyone to solve a programming problem.

Fast forward 15 or 20 years, and printed manuals and documentation have all but disappeared. The majority (if not all) of documentation these days is found online, which is not a terrible thing per se; it was simply a natural progression that has saved companies millions in printing costs.  However, the QUALITY of the documentation has, in my humble opinion, taken a serious nose dive over the years.  I have lost track of the number of times in the last 10 years I have emailed Google, Microsoft and countless other companies asking for clarification on what a function does, returns, or accepts as input, what side effects it may produce, or even what range of values is valid as input or produced as output - information that USED to be present in what I call "real" documentation.

These days, I rarely find any real documentation and instead find poorly written "code examples".  I say poorly written because the code provided rarely considers error handling (believe it or not, errors happen in the real world - hence the need for real documentation), and typically leaves out pertinent details that should have been present, such as what exceptions may be thrown.  I don't know about you, but over the last couple of years I have found myself wrapping almost all code I import from an outside source in a Try/Catch block, just because I have no clue as to what exception(s) can be generated by the undocumented addition to the project - and this has saved me time and time again.
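To make that last point concrete, here is a minimal C# sketch of the habit I am describing - ThirdPartyParser is a hypothetical stand-in for whatever undocumented package you just imported:

using System;

// Hypothetical stand-in for an undocumented NuGet import.
static class ThirdPartyParser
{
    public static string Parse(string input) =>
        input.StartsWith("{") ? input : throw new FormatException("bad input");
}

static class SafeImports
{
    // Nothing tells us what Parse can throw, so wrap it and turn
    // any failure into a value we control.
    public static string ParseOrDefault(string input, string fallback)
    {
        try
        {
            return ThirdPartyParser.Parse(input);
        }
        catch (Exception ex)   // no documentation = catch everything, log it
        {
            Console.Error.WriteLine($"Parse failed: {ex.GetType().Name}: {ex.Message}");
            return fallback;
        }
    }

    static void Main()
    {
        Console.WriteLine(ParseOrDefault("not json", "{}"));   // prints {}
    }
}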

I recently watched a video produced by one of the largest software providers on the planet to educate myself on yet another best-and-greatest iteration of the previous best-and-greatest thing out there.
Side Note / Observation:
Why does everything work for the happy chatty people in the online tutorial videos, and then when you try, all you get is errors and crashes?
The presenters downloaded and imported a package from NuGet into their project and happily stated, "don't worry about how it works, it just does"… Good grief, I MUST worry about how this stuff works, because I will be the one called upon to fix it when it breaks!  Which leads me to a couple of truths that I can guarantee you all:
  • All programs will break
  • All programs will be replaced with a new language or library to achieve the same result.
But what's the big deal, you say - just go to GitHub, look at the source code for whatever package you have, and fix it.  Joke's on you, however, when you find out the package "that just works" (or used to) has not been maintained for over 5 years (which in our industry may as well be 500 years), or worse - not a scrap of documentation or a single comment is to be found within the several thousand lines of single-spaced code to help you understand what is going on!  Or even better, you find out the package that is broken isn't broken, but one of the four other packages it depends on is!  The fun never ends!

I really enjoy how Integrated Development Environments (IDEs) these days provide project templates - designed with you in mind.  Simply select a project template and poof - the IDE has "written" all the "base" code for you. Great, until you look under the hood and try to make heads or tails of what this automated code soup is doing.  I especially enjoy the auto-generated documentation - comments such as:
The code below was automatically generated, do not change!
May as well be written as:
Good luck trying to figure out what the code below is doing!
Typically, the only project template I use is the "Empty" project.  Anything I add from that point on - well, at least I'll know how it works; and so will the next person, because I will add meaningful comments throughout!

Thankfully, we now have "online" resources such as StackOverflow whereby someone always seems to have an answer, or at the very least, the same problem as you - so you never feel alone, if that is of any comfort!

A Standard Programming Language?  What could have been, but will never be …

Read the line above again… and then, if you have a couple of hours to kill, head on over to:
… and read all the excuses over the years.  (Note: to be fair, the original poster asked why a universal programming language does not exist for all purposes; which, of course, would be impossible.)  A large number of the responses were excuses in my mind, because we all think that whatever programming language we are using is the "best" for the task at hand.  Over time, you will find that people in our industry tend to claim that the language they are most comfortable with (that is, most proficient in) is the "best".  But what if the code you write in BASIC, C, Pascal, C++, C#, JavaScript, or whatever other language badge you are wearing were compiled down (or interpreted) in the same way as any other language? You would say that is not true - and you would be correct, as each language generates a distinct set of instructions that are ultimately executed.  However, every language has common denominators - we all declare variables, write loops and use conditional statements.  Take the following example - which language is the "best"?

Source Code (C++):

int i;
for (i = 0; i < 100; i++)
   {
   // cout << i << endl;
   }

Generated Code:

MOVE #0, R2
L1:
ADD #1, R2
COMPARE #100, R2
BRANCH_IF_LESS_THAN L1

Source Code (BASIC):

dim i as integer
for i = 0 to 99
   rem print i
next

Generated Code:

MOVE #0, R2
L1:
ADD #1, R2
COMPARE #100, R2
BRANCH_IF_LESS_THAN L1
Of course, it is a trick question - in my fictitious world, the two different languages (source code) above, when compiled or interpreted, generate the same set of instructions and would execute in exactly the same way - so the real question becomes: why do we need DIFFERENT SYNTAX to describe the SAME things?

It is highly unlikely that you, as a developer, will use a single language throughout your entire career; I dare say very unlikely, as each language has features and extensions that may not exist in others.  However, I have often wondered (as I am sure many of you have as well) why, by now, a single standard has not been adopted within our computer science world for such things as comments, declaring constants and variables, data types, loops and conditionals?  A metric system of syntax, if you will, and the very meat of most programming projects.  Our SQL brethren are light years ahead of us insofar as standard syntax goes.  I can jump from Oracle, to MS SQL, to MySQL with far fewer headaches compared with jumping to and from the various programming languages that abound today.
Consider how our programming world would have been had a standard been adopted, for example:
!! Single line comment…
!! Constants …
constant k_loop_max as int32 = 1000

!! Variables …
use n as int32 = 0
use s as string

!+
Multi line comment
Do the voodoo we need to do using loops …
!-
for n = 0 to k_loop_max
   !! loop body …
next
!! A while loop …
n = 0
while ( n < k_loop_max )
   n = n + 1
loop
!! Another while loop …
n = 0
loop
   n = n + 1
while ( n < k_loop_max )

Just think - you could have cut and pasted the above code into ANY language or IDE and it would have compiled (or been interpreted) without changes! As new developments in languages were made over time, variables could have been declared "as" any new type; for example, when the object-oriented sunrise arrived:
use image_obj as image
image_obj = new image()
image_obj.draw()
Join hands, everyone, and take one small baby step forward: think what it would mean if we had a SINGLE standard for conditional operators…
== (Equal To)
> (Greater than)
< (Less than)
<= (Less than or equal to)
>= (Greater than or equal to)
<> (Not equal to)
or (conditional or) 
and (conditional and)
not (logical negation)
Keep moving forward and standardize bitwise operators… (choose a flavor; let's vote) … and data type names; some examples (again, send in your vote):
boolean
int32
uint32
int64
uint64
string
Keep going!... A single standard structure for conditional (if/then/else) statements as well as for multiway branch statements such as switch, case, select or inspect!

Good grief, if only the small samples above were adopted as standards, we all could have bounced from language to language and been relatively proficient regardless.  How much time, effort and money could have been saved?  How much farther along could we have been?  Questions you start to ask yourself after you have jumped from one language world to another countless times over the years.

If your belly button isn't puckering and unpuckering by now, you are not a real programmer, OR you have never participated in a project where your day is spent moving code from one language to another, as I (and many I know) have done several times in our careers; it really is insanity when you think about it.  Which brings me to another couple of truths:
  • We all use different syntax to describe or do the same tasks
  • We must all be nuts
But regardless of what everyone says, we programmers are human, and (for whatever reason) we all need our distinct brands and colors of vehicles to get to work; and so it will likely remain in the programming world.

You will find proof in almost any online forum where someone will declare that C# is better than C++, which is better than C.  .Net is better than PHP.  The new programming manager always used Java, so we are now moving all development to that platform, blah, blah, blah... No matter what language or development platform you THINK is best, you are ALL just declaring variables, writing loops and using conditionals.  We all use different SYNTAX to describe or do the SAME thing!  The only real difference is what is produced from that syntax (that is, the final executable code).

I have never sat on a major standards committee (the members probably would not appreciate being sat on), but if I did have a seat on a committee working on the fictitious Standard or Universal Programming Language (SPL or UPL; because I always wanted to invent a programming acronym, I would vote for SPL, so people might start calling it "Super Programming Language"…), I would vote to make the language case insensitive.
Side Note / Observation:
Pause here to let the online flames die down after the debate about case-sensitive vs. case-insensitive languages…
Over the years, I have lost count of the number of times colleagues and I have been tripped up by a simple case-sensitivity discrepancy - it really should be called case-sensitive-insanity.  I would also vote for any language or syntax feature that makes a language as easy as possible to understand and maintain, and as self-documenting as possible - some of the most important aspects (I think) of any programming language.
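If you have never been bitten, here is a contrived C# illustration of the trap (employeeId and employeeID are made-up names, and this runs as a modern C# top-level program):

using System;

// A contrived illustration of the case-sensitivity trap: both
// declarations below are legal, the typo compiles cleanly, and the
// bug is invisible at a glance.
int employeeId = 42;    // set by one colleague
int employeeID = 0;     // declared by another, who meant the same variable

Console.WriteLine(employeeId);   // prints 42
Console.WriteLine(employeeID);   // prints 0 - hours of fun ahead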

In short, any IDE or programming environment that can take a standard syntax and produce the fastest-executing code should be held in high praise.  Imagine using our fictitious SPL in IDE A to produce an executable.  Take the same SPL source code and compile it in IDE B.  The executable produced by IDE A runs 20% faster than B's on the same hardware.  Now you have something legitimate to spout off about - that is, the fact that IDE A produced a superior executable, not the language syntax that was used to get there.

If we did have SPL, we would naturally have SPL libraries we could draw upon.  I don't know about you, but how many times have you searched for some language-specific function for a task?  For example, a function that accepts a string containing an email address and returns true or false to indicate whether that email address is syntactically correct according to RFC 5322?  Now try to find that same function, giving the same result, in JavaScript, C#, Pascal, C++, BASIC, PHP, .Net - good grief, again, the SAME task being done using a different brand and color of vehicle, yet there is no guarantee you will find such a function or even get the same result when you do!
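Just to show how murky even this "simple" task is, here is a rough C# sketch - and note the hedging, because System.Net.Mail.MailAddress checks general syntactic plausibility, NOT the complete RFC 5322 grammar:

using System;
using System.Net.Mail;

static class EmailCheck
{
    // A sketch only: MailAddress accepts a rough approximation of
    // RFC 5322, so this answers "looks valid", not "is valid".
    public static bool LooksValid(string address)
    {
        if (string.IsNullOrWhiteSpace(address)) return false;
        try
        {
            var parsed = new MailAddress(address);
            return parsed.Address == address;   // reject "Name <a@b.c>" forms
        }
        catch (FormatException)
        {
            return false;
        }
    }

    static void Main()
    {
        Console.WriteLine(LooksValid("jane.doe@example.com"));  // True
        Console.WriteLine(LooksValid("not-an-address"));        // False
    }
}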

I will, once again, digress back to my VAX/VMS days in the early '80s.  Back then, you could code in any of the supported DEC languages - BASIC, COBOL, FORTRAN, Pascal and C, to name a few (stop laughing, I can still hear, sort of!).  My point is that DEC provided common, well tested libraries and/or utilities that ALL their supported languages could use (the technical term is "link" to).   Functions and utilities existed for screen and forms management, database access, etc.  There was a whole shelf of well documented libraries and utilities to choose from.  The trick was that these libraries were supplied as "object" libraries.  When you compiled your code (regardless of the language being used), the compiler would produce an object file.  You would then "link" your object code to any other object library or file you needed to produce the final executable:
myprog.pas -> COMPILE -> myprog.obj
myprog.obj -> LINK -> db_lib.obj, fms_lib.obj -> myprog.exe
Because each compiler produced common "object language" files, those files could then be linked together to produce a final executable. It was a brilliant system that saved countless hours of programming time, because a good portion of the code was executed by supplied libraries which all behaved the same regardless of the language that linked to them.  This scheme has come full circle - yet again.  Microsoft (and others), for example, are now building (or marketing) what they call a "Common Language Runtime" within their .Net platform, and most of their languages sit above this foundation of common code - this saves programmers massive amounts of work and eliminates the need to search for a specific function or utility that works with whatever .Net language they may be using at the time.   Now imagine if every NuGet package were based on a common object language (produced by SPL, of course) that ANY other language could make use of - no really, think about it!
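Here is a small C# sketch of that idea as it exists in the CLR today; FmsLib below is a made-up stand-in for a library that could just as easily have been written in VB.NET or F#, because everything compiles down to the same Common Intermediate Language:

using System;

// FmsLib is a hypothetical library class. In the CLR world it could
// have been authored in VB.NET, F# or C# - all compile to the same
// Common Intermediate Language (CIL), the modern "object language".
public static class FmsLib
{
    public static void DrawField(string label, int row, int col) =>
        Console.WriteLine($"[{row},{col}] {label}: ________");
}

public static class MyProg
{
    // The old LINK step is now just an assembly reference; this call
    // site looks the same no matter which language produced FmsLib.
    public static void Main() => FmsLib.DrawField("User name", 2, 10);
}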

As a matter of fact, why not build a documentation standard into SPL as well?  Imagine pressing a button and having the IDE produce documentation that clearly states what data type(s) each function or method accepts, which parameters are optional, what range of values is accepted, and what is returned by each function or method.  But alas, that would make our lives too easy; better to post your question on StackOverflow…
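To be fair, C# gets part of the way there with XML documentation comments - the compiler's /doc switch (or a tool such as DocFX) can turn the comments below into browsable documentation; the Clamp class itself is just a made-up example:

using System;

public static class Clamp
{
    /// <summary>Clamps a value into an inclusive range.</summary>
    /// <param name="value">The value to clamp.</param>
    /// <param name="min">The lower bound; must not exceed max.</param>
    /// <param name="max">The upper bound.</param>
    /// <returns>The nearest value within [min, max].</returns>
    /// <exception cref="ArgumentException">Thrown when min exceeds max.</exception>
    public static int Between(int value, int min, int max)
    {
        if (min > max) throw new ArgumentException("min must not exceed max");
        return Math.Min(Math.Max(value, min), max);
    }

    static void Main() =>
        Console.WriteLine(Between(150, 0, 100));   // prints 100
}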

OK, I know the above is a rant (I readily admit I have been wanting to get that off my keyboard for at least 25 of the past 40 years), and many will justifiably proclaim that a standard programming language is impossible since new developments are always being made.  Many of the things I described above are also available in some (if not most) IDEs these days - just look at how HTML and CSS have evolved (and are evolving) over the years…

The constantly accelerating, endless upgrade loop…

The irony of the programming world is that we are all caught in an endless loop of upgrades.  As technology evolves (and it always will), the upgrade loop will continue without end.  That is not really the problem per se; the problem is that this "upgrade loop" seems to keep accelerating - at least it seems that way to me as the years go on - and it needs to slow down a bit, with more time spent hammering out the next version of whatever it is that "needs" to be upgraded.  I have calculated that I could very easily spend my entire day just trying to keep up with the latest and greatest release of, well, you name it: the latest language development (choose any), the latest IDE, the latest "cloud services", the latest databases, etc.  The real challenge these days is choosing any service, platform or software product that will still exist in the next couple of years.

The first time I used HTML was sometime around 1990 or 1991, when I was asked to "fill in" for a systems manager at a university who had suffered a nervous breakdown and refused to give up the passwords to the systems he had locked down (but that is a whole other story).   It didn't take much to see the potential of HTML, as it was VERY easy for anyone to "mark up" or build pages and link them to one another (what a great language for documentation, I thought).  When I learned about the ability to create and submit data via forms, I began to get really interested - that is, until I found out the whole POSTing / CGI process left me with fewer tools and options than I had had 10 years prior. It felt like a giant step backward.

Long ago (circa the early '80s), in a data center far, far away, we had what was then called FMS, or the Forms Management System (again, another DEC product).  FMS had its own markup language and design tools that could be used to build forms or "pages" the user could interact with - except, in my opinion, the tools we used back then were more advanced than the HTML forms we use today, even when combined with JavaScript, jQuery or whatever other components we include to hack things together just to submit a form!  FMS allowed us to define where on a screen text or input fields would appear, the range of values or pattern an input field would accept, and what error messages would be displayed. We could also write user action routines (UARs) in any language, which would be called when a value in a field changed or other form-related events occurred.  It all worked regardless of the terminal (screen) being used, because FMS would "detect" the terminal type and render the form accordingly.  We did this with less code and half the hoops we jump through today to get browsers to do our bidding.  My point is, had HTML included even a small set of features from a product like FMS, the HTML and browser worlds of today would have been much different, and our lives somewhat more enjoyable.

Those of you who have stepped inside the "web development" arena have been dealing with the various flavors and behaviors of web browsers for years now.  The specifications governed by standards groups such as the W3C - HTML, CSS and a multitude of other "standards" - are generally well written and well defined.  However, several individual companies (Microsoft, Google, etc.) went ahead and added their own extensions or "standards" to the standard - does that even make sense?  Each browser behaved just slightly differently from the others, and we all (still) spend countless hours trying to get this mess to work; and if (or when) we do, a new browser or version is released, triggering yet another difference we have to deal with.  I am not against multiple browsers, so long as each behaves the same way - that is, given the same input, for God's sake, render the same output!  Let the browser that renders the fastest win the so-called "browser wars".  How many televisions do you know that take the (standard) input signal supplied and display the picture differently? How many of us would have tolerated that?

When a "standard" is proposed, I think we as an industry need to take a little longer to hammer things out.  Let's face it, we will never get it right the first time around; case in point, the constant evolution of HTML, CSS and browsers.  We need a GitHub or StackOverflow for proposed standards, so that those of us in the trenches with a little mud on our boots can at least raise our hands and say, "wait a minute - wouldn't it be better if…", or "look at this, you might want to incorporate some features from…", or "here are the test inputs - what are the expected outputs?".  And finally, a huge vote from as many in the developer community as possible should be taken whenever an issue gets delayed or remains unresolved, rather than one or two individuals calling the shots based on the market share of their companies - the politics of standards!

Slowly but surely, browser "creators" are coming around to the fact that they must adhere to the standards that exist.   Most of the standards are written by very knowledgeable people, experts in their fields.  However, I have read far too many specifications over the years that explicitly state a behavior as "undefined", or leave it to the discretion of the implementer - and this produces one thing: chaos!

Bring back the command line and make every configuration task scriptable …

I must admit I have come to hate many Graphical User Interfaces (GUIs) for CONFIGURATION, and even for some programming tasks.  These days, I am somehow just supposed to "intuitively" know that the command for disabling or enabling some system option or feature is located 5 levels deep in some obscure menu buried beneath a "task bar" or icon - it simply does not happen - and I have spent countless hours (and you have too, admit it…) Googling, YouTubing and StackOverflowing to find out where the damn command might be that can turn off (or on) a feature that you do (or do not) need.

Back in the day, we were handed a simple syntax diagram that had every command the system understood, like the documentation you might find that describes a Transact-SQL backup command i.e.:
BACKUP DATABASE { database_name | @database_name_var }
  TO <backup_device> [ ,...n ]
  [ <MIRROR TO clause> ] [ next-mirror-to ]
  [ WITH { DIFFERENTIAL | <general_WITH_options> [ ,...n ] } ]
Any competent programmer can read a syntax diagram like the one above and will eventually come up with a backup command that will get the job done.   Not so with a GUI.  To illustrate, compare the following instructions:

Command Line:

  1. Run the system configuration utility.
  2. Enter the following command:
    SET MAXIMUM USERS 100
  3. Done.

GUI:

  1. Run the system configuration utility.
  2. Right-click on the little blue icon located in the top right corner, the one that looks like a little policeman. A dialog box should open.
  3. Scroll down to the bottom of the list until you see the section entitled "System Limits". Click on the plus sign to open the panel and click on the "Advanced…" button.
  4. Click on the third tab from the left entitled "Users", a new panel will be displayed.
  5. Enter 100 in the field labeled "Maximum Users".
  6. Click Confirm followed by clicking OK three times.
I can give the following command line statement:
SET MAXIMUM USERS 100
to ANY first-year (or first-month) programmer - or any non-programmer, for that matter - and they will likely know that, whatever system this applies to, it will set the maximum number of users allowed to 100. I could even e-mail (or text, these days) the above instruction for a quick cut and paste into a console command line.  Contrast this with the GUI instructions above... need I say more?  Yes, I do, because next year, when you are the poor slob selected to make configuration changes, you will fire up the GUI configuration utility, which has recently been upgraded from version 1.1 to 1.3, and the little policeman you could have sworn existed the last time you used the utility no longer exists; he has been replaced with a gear icon that will NOT be found anywhere remotely close to where you think it should be!  So, you naturally TRY to find the online documentation (which has not been updated yet, because the guy updating it is still taking screen snapshots of every step and drawing circles around the items you need to click on), and you end up posting a question on StackOverflow!
  
Now imagine repeating all the above for the next 50 OTHER configuration settings! Oh yes, my friends, a new GUI release equates to hours wasted while you hunt and click and learn and post online, trying to remember where everything is; but don't bother, because it will all change again with the next minor release!

Every configuration program or utility should come equipped with a single plain-text configuration file (or a set of them) that can be easily edited and changed.  A configuration GUI would then consist of a file picker (to choose which configuration file(s) to load, parse and possibly display) and two buttons: "OK" and "Cancel".  How nice to have a single configuration file, such as:
!! Configure system defaults …
SET MAXIMUM USERS 100
SET MAXIMUM FILES 1000
SET MAXIMUM TIMEOUT 60 SECONDS
Wait a minute… was that an SPL single-line comment in that script?!!
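And loading such a file takes almost no code at all - here is a rough C# sketch (the SET syntax and the two-word setting names are, of course, assumptions borrowed from the hypothetical script above):

using System;
using System.IO;

static class ConfigLoader
{
    // Parses lines of the form "SET MAXIMUM USERS 100" from a plain
    // text file. "!!" starts a comment, as in the hypothetical SPL
    // script above; setting names are assumed to be two words.
    static void Load(string path)
    {
        foreach (var raw in File.ReadLines(path))
        {
            var line = raw.Trim();
            if (line.Length == 0 || line.StartsWith("!!")) continue;  // skip comments

            var parts = line.Split(' ', StringSplitOptions.RemoveEmptyEntries);
            if (parts.Length < 4 || parts[0] != "SET")
                throw new FormatException($"Unrecognized line: {raw}");

            var name  = $"{parts[1]} {parts[2]}";        // e.g. "MAXIMUM USERS"
            var value = string.Join(" ", parts[3..]);    // e.g. "100" or "60 SECONDS"
            Console.WriteLine($"{name} = {value}");      // apply the setting here
        }
    }

    static void Main()
    {
        File.WriteAllLines("system.cfg", new[] {
            "!! Configure system defaults …",
            "SET MAXIMUM USERS 100",
            "SET MAXIMUM TIMEOUT 60 SECONDS"
        });
        Load("system.cfg");
    }
}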

Flash back time - poof - back to the data center filled with the VAX/VMS mini-computer systems I mentioned earlier: 3 machines were dedicated to academic staff and 3 machines to administrative functions, and our systems manager kept configuration / startup files for each computer in the data center.  One day, one of the administrative computers fried a board and brought down the entire registration system for the campus.  Our systems manager simply took one of the academic machines and restarted it using the startup script from the fried admin machine.  The academic machine "woke up" as an administrative machine, and the registrar's systems were back up and running in less than 15 minutes (which was light speed back then!).

Fortunately, most of the better systems these days do come with a set of configuration files and command line functionality - I just wish more did, and please, don't forget to include the documentation!


Exit Sub

I hope I didn't sound too disgruntled - don't get me wrong, I have enjoyed my time in programming purgatory.  I can say the job (like any other) has had its ups and downs, but it has never been boring, since everything keeps changing (sometimes even for the better), especially in the last few years!  It reminds me of an old cartoon I had pinned up in my cubicle long ago.  It depicted a man sitting on a park bench in the background, looking lost as he stared at the ground.  In the foreground, two programmers stood looking over at the man on the bench.  The first said to the second, "What's up with Joe?"  The second replied: "He took six weeks off; when he came back, he discovered he was obsolete."

I think I'll go find that bench.
01000111 01101111 01101111 01100100 00100000
01101100 01110101 01100011 01101011 00100000
01110100 01101111 00100000 01111001 01101111
01110101 00100000 01100001 01101100 01101100
00100001