Java and the dream of the "One Ring to rule them all"
There is an enduring myth, nay fantasy, among many young programmers that their language of choice (or simply the first one they learned) is "the one true language," and all other languages be damned. It is an understandable human quality, much as native speakers of a language often think theirs superior to all others. I harbor no such illusions, focusing instead on how each language, with its strengths and weaknesses, fits into the pantheon of languages.
In a previous post, I discussed compiled vs. interpreted languages. One thing I mentioned was that compiled languages tend to run much faster, but the resulting binaries are limited to a single OS/architecture. Interpreted languages such as Perl, PHP, Python, and Ruby, on the other hand, tend to be more open with their code and run slower, but can often run on multiple OSes/architectures. The only requirement is that the interpreter for the language, along with any additional modules/libraries used, be installed on each system where the application runs.
And then Java came along.
Touted with the tagline "Write once, run everywhere," I remember studying the Java programming language before it had reached v1.0. I was taking a graduate course titled "Global Knowledge Networks." At the time, the promise of the "one ring to rule them all"--a single programming language that you could learn and then write code that ran on all the major OSes--was VERY appealing. I suspect it still is. And Java came around right as the Web began flourishing. One could even run Java "applets" in a web browser, expanding its reach even further.
Originally developed at Sun Microsystems, Java took a different approach. Instead of building a typical compiled language (which converts high-level, human-readable code into a binary specific to a particular OS/chip) or a typical interpreted language (where the interpreter must be built for each OS/chip but otherwise runs any of the high-level, human-readable code), Sun wrote a compiler that took their new language, Java, and compiled it down into an intermediate bytecode.
For less technical folks, I would explain it this way. Think in terms of human languages and how nations interact. You know English. You wish to communicate with someone who understands only Mandarin. So you write down in English what you wish to communicate, then hand the document to a translator, who translates the entire English text into Mandarin and hands it back to you. You can now send this translated version to the person you wish to communicate with, and they can read it quickly, as it is in their native language. This is how a traditional compiled language works: it takes your "English" and translates it into the specific machine language of the particular OS/chipset your code needs to run on.
But what if you also wanted to communicate the same information to someone who knew only Farsi and another who knew only Russian? You would need two more translators: one to translate English to Farsi and another to translate English to Russian. Likewise, if you wished to have your code run on different systems, you needed a compiler for each OS/chipset your program had to run on. (Worse, different OSes ran on different chips, and each OS had its own way of doing things--what is known as its Application Programming Interface (API). This often meant rewriting your code for each OS's API as well.)
But what if everyone agreed upon a common language? In the diplomatic world, French was for a long time that common language (a.k.a. the "lingua franca"), referred to as "the diplomatic language." In recent decades, French has been supplanted by English. On a more abstract level, there is Esperanto, an artificially constructed language intended to be a kind of neutral, all-in-one language.
Java's bytecode works a bit like Esperanto. You write your code in the Java language, then feed it into the Java compiler, which translates it into a kind of "Esperanto": compiled code whose instructions are not chip-specific. So how does one run this code? That is where the Java Virtual Machine (JVM) comes in. The JVM takes the Java bytecode and executes it. The JVM is the one program that must be built for each OS/architecture; as long as a system has a JVM, it can run any Java program.
This setup allowed programmers to write code once and have it run on different operating systems/architectures. The code itself was compiled, protecting the source/logic and, in theory, running faster than interpreted code. But unlike most compiled languages, Java programs could run on different OSes/chipsets. Again, the only requirement was that a JVM exist for each platform.
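As a minimal sketch of that workflow (the file and class names here are my own illustration): compiling the file below once with `javac Portable.java` produces `Portable.class`, a file of chip-neutral bytecode; that same `.class` file can then be run with `java Portable` on any OS/architecture that has a JVM.

```java
// Portable.java -- compiled once with `javac Portable.java`; the
// resulting Portable.class bytecode runs unchanged on any system
// with a JVM (`java Portable`).
public class Portable {
    // This compiles to generic JVM instructions (iload/iadd/ireturn),
    // not to x86 or ARM machine code.
    static int add(int a, int b) {
        return a + b;
    }

    public static void main(String[] args) {
        // The os.name property differs per platform, but the bytecode
        // asking for it is identical everywhere.
        System.out.println("2 + 3 = " + add(2, 3)
                + " on " + System.getProperty("os.name"));
    }
}
```

The same pair of commands works whether the JVM underneath is running on Windows, Linux, or macOS; only the JVM itself is platform-specific.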
Fast forward a bit, and once Java was in the field, its shortcomings became apparent. The tagline among developers quickly became "Write once, debug everywhere," as the JVM implementations across OSes were… suboptimal. It also turned out that the overhead of the JVM made Java applications run noticeably slower than native compiled code. Things improved with Just-In-Time (JIT) compilation, but even today, properly written Java code does not run nearly as fast as properly written native compiled code. And the original promise died hard, at least for desktop software. In the web browser, thankfully, Java applets died off entirely--as did Adobe Flash--since both, in part because they ran in browsers across almost all operating systems, presented large attack surfaces for malware writers.
Today Java continues to be used mostly as a backend/server-side language. There are still a few client-side apps, mostly from folks currently too heavily invested in Java to shift elsewhere (e.g., Cisco with ASDM and various other tools). Mind you, Java is still going strong and will be around for quite some time. I believe this is mostly because so many university CS programs still use Java as their base language. But just as C++, Ada¹, PL/I, Fortran, and BASIC were before it (or possibly Lisp or Scheme if you were at one of the more esoteric schools), eventually Java will start being replaced as that base language. It may already be happening.
Today I suspect Python has become far more common as a first programming language. And if not Python, my guess is that Go might be the next language used in CS programs. But that is a post for another day.
¹ Anyone remember Ada? Yeah, I taught that language as a graduate student. ↩