Brief history of JavaScript

JavaScript is such a peculiar language.

It’s the only mainstream language that uses prototypes instead of classes. It’s widely misunderstood.

Until a few years ago people would look down on it and even debate whether it was a real programming language at all; now it's used all over the place, including on servers (Node.js).

I was having lunch with a friend, and he asked me a question I should have known the answer to, but only knew partially: what's the history of modern programming languages? How did we get here?

I told him about machine language, and how people wanted to make programming easier and came up with progressively more abstract languages that are still, in the end, translated into 0s and 1s.

But then, by chance, I was watching Crockford on JavaScript, and he started by telling the actual story of JavaScript, which is way more interesting.

Brief history/overview of programming languages

To understand JavaScript, we have to look at the history of the languages that inspired its designer.

The early days

Alright, so basically at the beginning programmers didn’t have it easy.

There are computers that work differently, but virtually all computers understand only binary code: a series of 0s and 1s. Computers use this approach because it maps directly onto electronic switching: off = 0, and on = 1. Computers are actually dumb!

At the beginning, in order to program a computer you had to speak its language. That is, you had to work with 0s and 1s. Because this is obviously a nightmare for most human beings, computer scientists have tried hard to abstract this process in order to make it easier for people to create programs.

Here’s a brief history and overview of their attempts.

Machine code

Machine code or machine language is a set of instructions executed directly by the computer’s CPU.

Each instruction performs a very specific and low-level task. Every program directly executed by a CPU is made up of a series of such instructions.

Numerical machine code (i.e. not assembly code) may be regarded as the lowest-level representation of a compiled and/or assembled computer program or as a primitive and hardware-dependent programming language.

While it is possible to write programs directly in numerical machine code, it is tedious and error prone to manage individual bits and calculate numerical addresses and constants manually. It is therefore rarely done today, except for situations that require extreme optimization or debugging.

See: http://en.wikipedia.org/wiki/Machine_code#Machine_code_instructions

Assembly language

Assembly language is not machine code, but it's close: each instruction corresponds directly to the underlying architecture, so there is essentially no abstraction.

To turn assembly language into machine code, an assembler is used. The assembler is the first software tool ever invented.

Here is a snippet of x86 assembly code:

MOV AL, 1h ; Load AL with immediate value 1
MOV CL, 2h ; Load CL with immediate value 2
MOV DL, 3h ; Load DL with immediate value 3

Assemblers were already available in the 50s, since they don't require much code analysis: most of the time they simply take the instructions and translate them directly into their corresponding executable values. As a programmer, you had to think like the underlying machine or architecture.

Assembly language is still used on electronic equipment with limited resources, and in code that runs before high-level languages and libraries are loaded (e.g., firmware).

In the 70s and 80s assembly language was fairly common. For instance, game consoles were typically programmed in assembly, as the available memory could be as little as a few kilobytes. Lots of code in the original 1984 Macintosh was written in assembly language to save resources.

Fortran

By this time, a lot of research had been done in high-level programming languages (meaning anything more abstract than assembly).

Fortran was originally developed by IBM in the 50s.

At the time, languages were created to be specialized for a specific set of problems: Fortran was intended for scientific processing. It became the dominant language in that field right away and enjoyed an enormous amount of success over the following 50 years.

It is still one of the most popular languages in the area of high-performance computing and is the language used for programs that benchmark and rank the world’s fastest supercomputers (see http://en.wikipedia.org/wiki/Fortran).

Fortran established the convention of using the asterisk for multiplication, which is still used today in virtually all languages.

This is how it looks:

Program Hello
Print *, "Hello World!"
End Program Hello

Here’s a punched card containing a typical Fortran program:

Fortran code on a punched card. Photo: CC Arnold Reinhold

COBOL

COBOL (COmmon Business-Oriented Language) was designed for business use. It was an attempt to make programming languages more similar to English, so that both programmers and management could read it.

Among its designers was Grace Hopper (the lady who discovered "the bug"), who had invented the English-like data processing language FLOW-MATIC and was the perfect candidate to help create a common business language that read like English.

Here’s a “Hello World!” program in COBOL:

IDENTIFICATION DIVISION.
PROGRAM-ID. HELLO-WORLD.
ENVIRONMENT DIVISION.        
DATA DIVISION.
PROCEDURE DIVISION.
MAIN.
    DISPLAY 'Hello, world.'.
    STOP RUN.

BASIC

BASIC (Beginner’s All-purpose Symbolic Instruction Code) was designed in 1964 by John G. Kemeny and Thomas E. Kurtz at Dartmouth College in New Hampshire.

BASIC was developed specifically for timesharing. It was a very stripped-down version of Fortran, to make it easier to program.

It came with a clever line-numbering scheme, used both for editing programs and for flow control via statements like GOTO.

Versions of BASIC were extremely common on microcomputers in the mid-70s and 80s, which usually shipped with BASIC directly in the machine’s firmware, allowing small business owners, professionals, hobbyists, and consultants to develop custom software on computers they could afford.

BASIC spawned many dialects and descendants, including Visual Basic, which Microsoft created from Microsoft BASIC and which was for a long time the most popular programming language in the world.

Here’s a simple program (in GW-BASIC):

10 INPUT "What is your name: ", U$
20 PRINT "Hello "; U$
30 INPUT "How many stars do you want: ", N
40 S$ = ""
50 FOR I = 1 TO N
60 S$ = S$ + "*"
70 NEXT I
80 PRINT S$
90 INPUT "Do you want more stars? ", A$
100 IF LEN(A$) = 0 THEN GOTO 90
110 A$ = LEFT$(A$, 1)
120 IF A$ = "Y" OR A$ = "y" THEN GOTO 30
130 PRINT "Goodbye "; U$
140 END

ALGOL 60

ALGOL 60 (ALGOrithmic Language 1960) is a committee-designed language that came out in 1960. It was very good and enormously influential.

It never got popular but it introduced a lot of important concepts, including getting rid of GOTO.

Jumping around from line to line in languages like BASIC made it hard to follow the flow of a program, and made writing programs error-prone. ALGOL 60 introduced structured programming and blocks: it used BEGIN and END (because curly braces weren't available), and it's thanks to ALGOL 60 that we now have blocks instead of GOTO.

ALGOL also aimed to be less specialized, good for both scientific and business processing.

Here’s how it looked:

procedure Absmax(a) Size:(n, m) Result:(y) Subscripts:(i, k);
    value n, m; array a; integer n, m, i, k; real y;
comment The absolute greatest element of the matrix a, of size n by m 
is transferred to y, and the subscripts of this element to i and k;
begin integer p, q;
    y := 0; i := k := 1;
    for p:=1 step 1 until n do
    for q:=1 step 1 until m do
        if abs(a[p, q]) > y then
            begin y := abs(a[p, q]);
            i := p; k := q
            end
end Absmax

Pascal

Pascal was designed in 1968–1969 and published in 1970 by Niklaus Wirth. It was inspired by ALGOL.

It was extremely popular, and although originally designed as a teaching tool, lots of people used it for general programming for a long time.

However, it wasn’t modular enough and had some design challenges that made programming hard.

Obligatory snippet:

while a <> b do  WriteLn('Waiting');
 
if a > b then WriteLn('Condition met')   {no semicolon allowed!}
           else WriteLn('Condition not met');
 
for i := 1 to 10 do  {no semicolon for single statements allowed!}
  WriteLn('Iteration: ', i);
 
repeat
  a := a + 1
until a = 10;
 
case i of
  0 : Write('zero');
  1 : Write('one');
  2 : Write('two');
  3,4,5,6,7,8,9,10: Write('?')
end;

(I actually remember programming in Pascal when I was 14 at school, pretty cool)

B

B was developed at Bell Labs in 1969. It was inspired by Fortran and BCPL.

B was essentially the BCPL system stripped of any component Ken Thompson felt he could do without, in order to make it fit within the memory capacity of the minicomputers of the time.

See: http://en.wikipedia.org/wiki/B_(programming_language)

B introduced the += operator (although spelled =+ at the time) and the increment/decrement operators (++ and --).

printn(n,b) {
        extrn putchar;
        auto a;
 
        if (a=n/b) /* assignment, not test for equality */
                printn(a, b); /* recursive */
        putchar(n%b + '0');
}
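
Both operators survive, with the modern spelling, in JavaScript:

var n = 1;
n += 2;  // n is now 3 (B originally spelled this n =+ 2)
n++;     // n is now 4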

C

C was born from taking B and adding some good ideas from Pascal. It was developed by Dennis Ritchie between 1969 and 1973 at Bell Labs (again).

C is probably the most important language out there. It was unbelievably successful (and it still is).

In addition, many languages are based on C, including:

  • C++ (1976)
  • Objective-C (1986)
  • Perl (1988)
  • Java (1991)
  • Python (1991)
  • JavaScript (1995)
  • PHP (1995)
  • C# (1999)
  • Go (2007)
  • …and many others

Simula

The Norwegian simulation languages Simula I and Simula 67 are, syntactically, fairly faithful supersets of ALGOL 60.

Its contribution was enormous. Simula took ALGOL 60 and added objects to it, and is therefore considered the first object-oriented programming language.

Simula 67 (released in 1967) introduced objects, classes, inheritance and subclasses, as well as virtual methods, coroutines, and discrete event simulation.

If that wasn’t enough, it featured garbage collection. Not bad!

It looks like this:

Begin
   Class Glyph;
      Virtual: Procedure print Is Procedure print;
   Begin
   End;

   Glyph Class Char (c);
      Character c;
   Begin
      Procedure print;
        OutChar(c);
   End;

   Glyph Class Line (elements);
      Ref (Glyph) Array elements;
   Begin
      Procedure print;
      Begin
         Integer i;
         For i:= 1 Step 1 Until UpperBound (elements, 1) Do
            elements (i).print;
         OutImage;
      End;
   End;

   Ref (Glyph) rg;
   Ref (Glyph) Array rgs (1 : 4);

   ! Main program;
   rgs (1):- New Char ('A');
   rgs (2):- New Char ('b');
   rgs (3):- New Char ('b');
   rgs (4):- New Char ('a');
   rg:- New Line (rgs);
   rg.print;
End;

Smalltalk

Simula had a huge influence on Alan Kay—who went to Xerox PARC and in 1972 started working on Smalltalk.

Smalltalk was originally designed for kids. It was thoroughly tested in the real world and revised many times, and was finally published, after several generations, eight years after its first version.

Smalltalk was a great language, and the first truly modern object-oriented programming language.

It never became hugely popular, but it influenced virtually all modern programming languages. Objective-C, C++, Java, C#, Eiffel, and Ruby are basically C combined with Smalltalk.

Example showing Smalltalk’s “alternative” to control structures:

result := a > b
    ifTrue:[ 'greater' ]
    ifFalse:[ 'less or equal' ]

Self

Another language influenced by Smalltalk—developed at Xerox PARC as well—was Self.

Self was designed for performance. It took Smalltalk, and removed classes to make it faster.

Instead of using classes, it used prototypes: Self allowed objects to inherit directly from other objects, without classes in between.

As you might have guessed already, JavaScript was heavily influenced by Self.

Scheme

Scheme grew out of work on the actor model, which originated in 1973 and was a radical concept when it was conceived. Scheme is a dialect of Lisp, an artificial intelligence language created at MIT in 1958.

It includes proper tail calls, closures, and other brilliant stuff.

Example:

;; Calculation of Hofstadter's male and female sequences as a list of pairs

(define (hofstadter-male-female n)
  (letrec ((female (lambda (n)
                     (if (= n 0)
                         1
                         (- n (male (female (- n 1)))))))
           (male (lambda (n)
                   (if (= n 0)
                       0
                       (- n (female (male (- n 1))))))))
    (let loop ((i 0))
      (if (> i n)
          '()
          (cons (cons (female i)
                      (male i))
                (loop (+ i 1)))))))

(hofstadter-male-female 8)

===> ((1 . 0) (1 . 0) (2 . 1) (2 . 2) (3 . 2) (3 . 3) (4 . 4) (5 . 4) (5 . 5))

E

E is a union of Java and Joule, a language developed for security applications.

Based on the actor model, it implements the object-capability model. Here's an example:

 def makeMint(name) :any {
   def [sealer, unsealer] := makeBrandPair(name)
   def mint {
     to makePurse(var balance :(int >= 0)) :any {
       def decr(amount :(0..balance)) :void {
         balance -= amount
       }
       def purse {
         to getBalance() :int { return balance }
         to sprout() :any { return mint.makePurse(0) }
         to getDecr() :any { return sealer.seal(decr) }
         to deposit(amount :int, src) :void {
           unsealer.unseal(src.getDecr())(amount)
           balance += amount
         }
       }
       return purse
     }
   }
   return mint
 }

JavaScript

JavaScript has become an extremely important language. It's ubiquitous in web-capable devices, a pillar of the Web Platform, and the base for most web and mobile applications. Thanks to Node.js, it's making its way onto the server as well.

To understand JavaScript, however, we have to take a little detour and go back to the history of browsers and the Web.

The Web

At the end of 1990, Tim Berners-Lee invented the World Wide Web.

Although the two terms are sometimes used interchangeably, the Internet and the Web are NOT the same thing. The Internet is the network (created long before the Web), while the World Wide Web is what we see and use every day: a system of interlinked hypertext documents that are accessed via the Internet (usually using a web browser).

The problem back then was that computer systems were all quite different, so while you could use the Internet to connect to a remote machine, chances were it would look totally alien to you, and your computer might not even be able to open the files you needed. This made it extremely hard to collaborate and share documents.

The Web would solve this problem, allowing scientists to finally be able to exchange information.

The concept was of course revolutionary but easy to understand: documents would be encoded in a common language (HTML), and you would have a piece of software (a browser) to interpret them. While every platform would have its own browser(s), presenting the information the way the author intended, the actual HTML document would stay the same. Pretty neat!

So, Tim Berners-Lee invented the Web, the first web server, the first web browser (called WorldWideWeb), the first web editor (WorldWideWeb worked as a web editor as well), and the first web pages, which described the project itself.

Browser wars

Mosaic

Tim Berners-Lee (and many others) saw the Web as a tool for scientists and researchers to exchange information. Luckily for us, some folks had different plans for it.

A bunch of students at the University of Illinois Urbana-Champaign created Mosaic, which is the browser that popularized the World Wide Web.

The reasons were probably many, but the main one was that they totally ignored Berners-Lee and went ahead and shipped the <IMG> tag, which allowed authors to make pages look the way they wanted.

Everyone went crazy and the browser quickly became the most popular one.

Mosaic browser (1993)

Netscape Navigator

Netscape hired Mosaic's authors, had them reimplement a browser (which they called Navigator), and built a business around it, making a lot of money.

The reason people would pay for Netscape Navigator was that Netscape ignored Berners-Lee even more, and implemented a lot of "useless" stuff that people wanted, like font faces and other things we now take for granted.

Microsoft

Netscape was really cool: it looked exactly the same on all operating systems. Another great feature they were experimenting with was a web-based system that allowed users to edit files across the network, regardless of which OS they were using.

Of course, Microsoft didn’t like this, as it undermined their Windows business.

So, they released Internet Explorer, gave it away for free, and started the infamous browser wars.

JavaScript

Finally!

So, JavaScript was developed at Netscape, and was initially called LiveScript.

LiveScript was paramount for Netscape. They wanted to implement something similar to Apple’s HyperCard—an application program and programming tool for Apple Macintosh and Apple IIGS that made it easy to build apps—and add it to the browser. In addition, LiveScript would run on the server as well.

The plan was to dominate the client side with their popular browser, and the server side as well.

Java, Scheme, and Self

Brendan Eich was hired by Netscape to build this language. Eich wanted to base LiveScript on Scheme, but Netscape's management thought people wouldn't like its syntax and asked him to make it more similar to Java, Visual Basic, or anything else people liked at the time.

LiveScript borrowed Java's syntax (just because it had to), Scheme's function model, and the prototypal nature of Self.
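
To get a feel for what borrowing Scheme's function model means in practice, here's a minimal JavaScript sketch (the names are made up for illustration) of functions as first-class values and closures:

// A function that builds and returns another function.
function makeCounter() {
  var count = 0;           // captured by the closure below
  return function () {     // an anonymous function, Scheme-style
    count += 1;
    return count;
  };
}

var next = makeCounter();
next(); // 1
next(); // 2 (the inner function remembers `count`, just like a Scheme closure)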

Because of Netscape's situation, the first version of the language was put together in about two weeks!

Sun, and the “JavaScript” name

When Microsoft reacted to Netscape threatening its OS business, Netscape teamed up with Sun to join forces and fight back.

Interestingly enough, a major dispute while closing the deal was what to do with LiveScript. Sun wanted to kill it and use Java instead, but since LiveScript was a major technology for Netscape, they weren't willing to do that.

Then somebody (it's unclear exactly who) came up with the clever idea of rechristening the language "JavaScript". Sun owned the "Java" trademark, but they gave Netscape an exclusive license to use the name, and everything was good.

Deal.

Microsoft’s JScript

Of course, Microsoft had to have a scripting language in their browser too, so they reverse-engineered JavaScript (pretty well, actually) and put it into Internet Explorer.

Because of the "Java" trademark, they had to name it JScript, but it's essentially the same language.

ECMAScript

With all this mess, JavaScript needed to be standardized. So, Netscape looked for a body that would do just that. The W3C refused to do it, and eventually they ended up at the European ECMA (weird).

ECMA did standardize the language, but didn’t fix the obviously awful and confusing “JavaScript” name. The thing is, they didn’t know what to call it. So, they just published it with their working name: ECMAScript.

JavaScript, JScript, and ECMAScript are sometimes thought to be different things, but are simply three different names that mean the same thing: JavaScript.

A unique language

JavaScript was (and is) truly unique.

It doesn't use classes, for instance: in most other programming languages, objects are instances of a class, and inheritance works by extending classes. JavaScript instead uses prototypes, which means an object can inherit from and be augmented by other objects directly, without classes in between.

This is really awesome: with prototypes you can still emulate classes (the reverse isn't true), and inheriting directly from objects, together with its great function model, is what makes JavaScript such a powerful language.
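
Here is a minimal sketch of what inheriting directly from another object looks like in ES5 JavaScript; the object names are made up for illustration:

// A plain object that will act as the prototype.
var animal = {
  describe: function () {
    return this.name + ' says ' + this.sound;
  }
};

// Create a new object whose prototype is `animal`: no class involved.
var dog = Object.create(animal);
dog.name = 'Rex';
dog.sound = 'woof';

dog.describe(); // "Rex says woof" (describe is looked up on the prototype)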

JavaScript (perhaps because of its "missing" classes?) used to be considered by many a joke, a language for amateurs.

When Google popularized Ajax, JavaScript started gaining a lot of popularity, to eventually become one of the most popular programming languages in use today.
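
For context, this is roughly what those early Ajax calls looked like with the raw XMLHttpRequest API (the URL is just a placeholder):

var xhr = new XMLHttpRequest();
xhr.open('GET', '/api/data', true);   // true = asynchronous
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    console.log(xhr.responseText);    // update the page without reloading it
  }
};
xhr.send();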

Current version

ECMA is still in charge of the language. While at the time of writing (October 2014) ES6 is being finalized and ES7 is already in the works, the most recent version with broad browser support is ES5 (as long as you don't need to support IE8). Older browsers support ES3 (ES4 was abandoned and never released).

ES5 is pretty great. It cleaned up a lot of the design errors that the language shipped with because of its rushed release. Among a lot of other interesting additions, ES5 introduces "strict mode", a restricted variant of the language that turns many silent errors into real errors and disallows some of the most problematic legacy features.
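
You opt into strict mode with a simple directive at the top of a file or function; a minimal sketch:

'use strict';

function demo() {
  undeclared = 42;  // ReferenceError in strict mode
                    // (without strict mode this would silently create a global)
}

demo();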

Notes

I originally published this as an overview of all programming languages, but it’s always been about JavaScript, so I don’t even know why I did that at all.

If there are any mistakes, please do let me know in the comments. :-)

