Many developers want to learn Go, a programming language for large systems created by Google. Meanwhile, most developers are sick of attending meetings, and most of those working at multinational corporations aren’t happy there.
That’s according to the results of a survey of 16,655 developers from 76 countries carried out by HackerEarth, a company with offices in India and San Francisco that provides tools for recruiters to remotely assess developers’ coding skills.
I followed @realjasonisaacs for his data gathering and analysis of COVID-19 in Ventura County. I wrote a simple Python 3 script (using graphics.py) that provides a visualization of the pandemic. The simulation has a few parameters, but it is essentially a simulation of people’s random but relatively confined movements. It starts with one sick individual in the center and a population of 300 healthy individuals. What was surprising was how easily I was able to reproduce the situations presented by Jason Isaacs (e.g., exponential versus logistic growth) with my simple code. See a recording of a typical run of the simulated pandemic.
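For the curious, here is a minimal sketch of how such a simulation can be written with John Zelle’s graphics.py (this is not my exact script, and the step size and contagion radius shown here are assumed values):

import random
from graphics import GraphWin, Circle, Point, update

WIDTH, HEIGHT = 600, 600
HEALTHY = 300        # healthy individuals at the start
STEP = 4             # assumed size of one random move, in pixels
RADIUS = 8           # assumed contagion distance, in pixels

class Person:
    def __init__(self, x, y, sick=False):
        self.x, self.y, self.sick = x, y, sick
        self.dot = Circle(Point(x, y), 3)
        self.dot.setFill("red" if sick else "green")

    def wander(self):
        # random but confined movement: stay inside the window
        dx = random.choice((-STEP, 0, STEP))
        dy = random.choice((-STEP, 0, STEP))
        if 0 < self.x + dx < WIDTH and 0 < self.y + dy < HEIGHT:
            self.x += dx
            self.y += dy
            self.dot.move(dx, dy)

win = GraphWin("Pandemic", WIDTH, HEIGHT, autoflush=False)
people = [Person(random.randrange(WIDTH), random.randrange(HEIGHT))
          for _ in range(HEALTHY)]
people.append(Person(WIDTH // 2, HEIGHT // 2, sick=True))  # patient zero
for p in people:
    p.dot.draw(win)

while any(not p.sick for p in people):
    for p in people:
        p.wander()
    sick = [p for p in people if p.sick]
    for p in people:  # anyone close enough to a sick person gets sick
        if not p.sick and any(abs(p.x - s.x) <= RADIUS and
                              abs(p.y - s.y) <= RADIUS for s in sick):
            p.sick = True
            p.dot.setFill("red")
    update(30)  # redraw at roughly 30 frames per second

win.getMouse()  # wait for a click before closing
win.close()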
The once-declining C language has completed a comeback in the monthly Tiobe Index of language popularity, winning the 2017 Programming Language of the Year designation from Tiobe as the biggest gainer in share.
Although the language grew only 1.69 percentage points in its rating year over year in the January index, that was enough to beat out runners-up Python (1.21-point gain) and Erlang (0.98-point gain). Just five months ago, C was at its lowest-ever rating, 6.477 percent; this month, its rating is 11.07 percent, once again putting it in second place behind Java (14.215 percent)—although Java dropped 3.05 points compared to January 2017. C’s revival is possibly being fueled by its popularity in manufacturing and industry, including the automotive market, Tiobe believes.
The Tiobe Programming Language of the Year for 2016 was Google’s Go language (Golang). Tiobe, which provides software quality services, bases its rankings on a formula assessing searches on languages in popular search engines such as Google, Bing, and Wikipedia.
Other languages that saw jumps in 2017 included R, which rose from 16th to 8th place; Kotlin, which jumped to 39th place after being 89th a year earlier; and Erlang, which placed 23rd after being in 44th place a year ago.
But promising languages such as Julia, Hack, Rust, and Kotlin were not able to reach the top 20 or even the top 30, Tiobe pointed out. “Becoming part of the top 10 or even the top 20 requires a large ecosystem of communities and evangelists including conferences,” said Paul Jansen, Tiobe managing director and compiler of the index. “This is not something that can be developed in one year’s time.”
The top-rated languages in the index this month were as follows:
A long time ago, developers wrote assembly code that ran fast and light. On good days, they had enough money in their budget to hire someone to toggle all those switches on the front of the machine to input their code. On bad days, they flipped the switches themselves. Life was simple: The software loaded data from memory, did some arithmetic, and sent it back. That was all.
Today, developers must work with teams spread across multiple continents where people speak different languages with different character sets and – this is the bad part – use different versions of the compiler. Some of the code is new, and some may be from decade-old libraries that may or may not come with source code. Building team spirit and slogging through the mess is only the beginning of what it means to be a programmer today.
Once the darling of the developer community, Ruby’s popularity has plummeted in the past few years, leading some tech leaders to wonder if the language may eventually die out completely.
In IEEE Spectrum’s ranking of the top programming languages, Ruby comes in at No. 12—down from No. 8 in 2014.
The lack of job prospects led coding bootcamp Coding Dojo to drop Ruby courses from all six of its campuses across the US by the end of the year, while adding a full-stack course in Java.
“We looked at local markets to see the most relevant technologies, and we found that Java was at the top of the charts, and Ruby on Rails seemed to rank much lower in demand in terms of startup positions, and general demand and interest,” said Speros Misirlakis, head of curriculum at Coding Dojo.
For the last 30 years, C has been my programming language of choice. As you probably know, C was invented in the early 1970s by Dennis M. Ritchie to rewrite the UNIX kernel, first running on a DEC PDP-11 computer. I am probably a bit old-fashioned. Yes, C is outdated, but I’m simply addicted to it, like plenty of other embedded system programmers. For me, C is a low-level but portable language that’s adequate for all my professional and personal projects, ranging from optimized code on microcontrollers to signal processing or even PC software. I know that there are many powerful alternatives like Java and C++, but, well, I’m used to C.
C is not the only vintage programming language, and playing with some of the others is definitely fun. This month, I’ll present several vintage languages and show you that each language has its pros and cons. Maybe you’ll find one of them helpful for a future project? I’m sure you won’t use COBOL in your next device, but what about FORTH or LISP? As you’ll see, thanks to web-based compilers and simulators, playing with programming languages is simple. And after you’ve finished with this review of 1970s-era computing technology, give one or two a try!
Like many teenagers in the 1970s, I learned to program with Beginner’s All-purpose Symbolic Instruction Code (BASIC). In 1980, after some early experiments with programmable calculators, a friend let me try a Rockwell AIM-65 computer. An expanded version of the KIM-1, it had an impressive 1 KB of RAM and a BASIC interpreter in ROM. It was my first contact with a high-level programming language. I was really astonished. This computer seemed to understand me! “Print 1+1.” “Ok, that’s 2.” One year later, I bought my first computer, an Apple II. It came with a much more powerful BASIC interpreter in ROM, Applesoft BASIC. (This interpreter was developed for Apple by a small company named Microsoft, but that’s another story.)
PHOTO 1: An online emulator for my old Apple II
Since its invention in 1964 at Dartmouth College, BASIC has been more of a concept than a well-specified language. Plenty of variants exist, up to and including Microsoft’s Visual Basic. It also has plenty of disadvantages, especially in its early versions: a lack of structured data and control constructs, mandatory line numbering, a lack of type checking, low speed, and so on. Nevertheless, it is ultra-simple to learn and to understand. Even if you have never used BASIC, you’ll understand the code shown in Photo 1 without any problem. The main program starts by initializing a variable N with the value 8. It then calls a subprogram that starts at line 100, displays the result F, and stops. The subprogram initializes F to 1 and multiplies the result by each integer up to N. Straightforward.
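In case Photo 1 is hard to read, the program is essentially the following (a minimal reconstruction, perhaps not line-for-line identical to my Apple II screen):

10 N = 8
20 GOSUB 100
30 PRINT F
40 END
100 F = 1
110 FOR I = 1 TO N
120 F = F * I
130 NEXT I
140 RETURN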
Let’s compare this BASIC program with a C version of the same algorithm. For this article, I looked for online compilers and simulators. I found a great option at www.ideone.com, which, developed by Sphere Research Labs, supports more than 60 programming languages. You can edit a program in any of them, compile it, and test it without having to install anything on your PC. This is great for experimenting.
PHOTO 2: At Ideone.com, you can enter, compile, and simulate numerous programming languages. Here you see the C language.
The C variant of the factorial algorithm is depicted in Photo 2. I could have used plenty of different approaches, but I tried to stay as close as possible to the “spirit” of C. So, how does it compare with BASIC? The code is significantly more structured, but a little harder to read. C aficionados love short forms like f*=i++ (which multiplies f by i and then increments i) even when they can be avoided. While this makes the code shorter and helps the compiler with optimization, it is probably cryptic to someone new to the language.
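For reference, a minimal C version in this spirit (a sketch close to the code in Photo 2, though perhaps not identical) looks like this:

#include <stdio.h>

int main(void)
{
    int n = 8, i = 1;
    long f = 1;
    while (i <= n)
        f *= i++;       /* multiply f by i, then increment i */
    printf("%ld\n", f);
    return 0;
}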
Of course, C also has great strengths. In particular, it offers precise control of data types and memory representation, which helps for low-level programming. That’s probably why it has been so widely used for nearly 50 years.
FORTRAN & COBOL
Let’s stay in the 1970s. BASIC and assembly language were for hobbyists and experimenters. C was used by early UNIX programmers. The rest of the programming world was divided into two camps. Scientists used FORTRAN. Business leaders used COBOL.
FORTRAN (from FORmula TRANslation) was actually the first high-level programming language. Developed by an IBM team led by John Backus, the first version of FORTRAN was released in 1957 for the IBM 704 computer. It was followed by several incremental improvements: Fortran 66 (1966), Fortran 77, and Fortran 90, all the way up to Fortran 2008. Refer to Listing 1 for the factorial program using FORTRAN 77.
LISTING 1: This is the factorial program using FORTRAN 77.
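If you don’t have the listing in front of you, a minimal FORTRAN 77 version reads along these lines (a sketch, perhaps not identical to the published listing):

      PROGRAM FACT
      INTEGER N, I, F
      N = 8
      F = 1
      DO 10 I = 1, N
         F = F * I
   10 CONTINUE
      PRINT *, F
      END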
It seems close to BASIC, right? That’s not a surprise, as BASIC was in fact based on concepts from FORTRAN and from another long-gone language, ALGOL. I’m sure that you are able to read and understand the FORTRAN in Listing 1, but its equivalent in COBOL is a bit stranger (see Listing 2). I must admit that it took me some time to make it work, even after reading some COBOL tutorials on the web. COBOL is an acronym for Common Business-Oriented Language, so it is not exactly targeting applications like a factorial calculation. It was developed in 1959 by a consortium named CODASYL, based on work by Grace Hopper. Even though its popularity is fading, COBOL is still alive. I even read that an object-oriented version was released in 2002 (COBOL 2002) and upgraded again in 2014.
LISTING 2: The COBOL version looks a little stranger, right?
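A minimal COBOL version along these lines (again a sketch; the published listing may differ in its details) is:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. FACTORIAL.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  N  PIC 9(2)  VALUE 8.
       01  I  PIC 9(2)  VALUE 1.
       01  F  PIC 9(9)  VALUE 1.
       PROCEDURE DIVISION.
           PERFORM UNTIL I > N
               MULTIPLY I BY F
               ADD 1 TO I
           END-PERFORM
           DISPLAY F
           STOP RUN.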
PASCAL & FRIENDS
I never actually used FORTRAN or COBOL, but I developed software on my Apple II using PASCAL. Released in 1970 by Niklaus Wirth (ETH Zurich, Switzerland), PASCAL was probably one of the earliest efforts to encourage structured and typed programming. Based on ALGOL-W (also invented by Wirth), it was followed by MODULA-2 and OBERON, which were less well known but still influential.
Do you want to calculate a factorial in PASCAL? Here it is in Listing 3. It may look similar to FORTRAN or BASIC, but its advantages are in the details. PASCAL is a so-called strongly typed language. (You can’t add a tomato and a donut, contrary to C.) It also forbids unstructured programming, and it is very easy to read. PASCAL was a limited, but true, success. It was used in particular by Apple for the development of the Lisa computer as well as the first versions of the Macintosh. It is still in use today through one of its object-oriented descendants, DELPHI.
LISTING 3: This is the PASCAL version. Easy to read.
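In PASCAL, a minimal version of the same algorithm might read as follows (a sketch in the spirit of Listing 3):

program Factorial;
var
  n, i, f: integer;
begin
  n := 8;
  f := 1;
  for i := 1 to n do
    f := f * i;
  writeln(f)
end.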
THE ADA STORY
In the 1970s, the United States Department of Defense (DoD) conducted a survey and found that it was using no fewer than 450 different programming languages. So, it decided to define and develop yet another one—that is, a new language to replace all of them. After long specification and selection phases, a proposal from Jean Ichbiah (CII Honeywell Bull, France) was selected. The result was ADA. The name ADA, and its military standard reference (MIL-STD-1815), honor the memory of Augusta Ada, Countess of Lovelace (1815–1852), who created one of the first actual algorithms intended for a machine.
While ADA is, well, strongly typed and very powerful, it’s complex and quite boring to use (see Listing 4). The key advantage of ADA is that it is well standardized and supports constructs like concurrency. Thanks to its very formal syntax and type checking, it is nearly bug-proof. Based on my limited experience, it is so strict that the first version of the code usually works, at least after you correct hundreds of compilation errors. That’s probably why it is still widely used for critical applications ranging from airplanes to military systems, even if it failed as a general-purpose language.
LISTING 4: ADA is more verbose.
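A compact ADA rendering of the same algorithm (a sketch in the spirit of Listing 4, not necessarily the listing as published) could be:

with Ada.Text_IO; use Ada.Text_IO;

procedure Factorial is
   N : constant Integer := 8;
   F : Integer := 1;
begin
   for I in 1 .. N loop
      F := F * I;
   end loop;
   Put_Line (Integer'Image (F));
end Factorial;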
LISP & FORTH
ADA is a difficult language. In my opinion, LISP (List Processing) is far more interesting. It is an old story too. Designed in 1960 by John McCarthy (MIT, and later Stanford University), its concepts are still interesting to learn. McCarthy’s goal was to develop a simple language with full capabilities. That’s quite the opposite of ADA. The result was LISP. The syntax can be frightening, but you must try it. Listing 5 is a version of the factorial calculation in LISP.
LISTING 5: LISP is definitely fun!
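A minimal recursive version in the spirit of Listing 5 is the following sketch:

(defun fact (n)
  (if (= n 0)
      1
      (* n (fact (- n 1)))))

(print (fact 8))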
In LISP, everything is a list, and a list is enclosed between parentheses. To execute a function, you create a list with a pointer to the function as its first element, followed by the parameters. For example, (- n 1) is a list that calculates n – 1. (if A B C) is a structure that evaluates A, and then evaluates either B or C based on the value of A. If you read this program, you will see that it is not based on a loop like all the other versions I’ve presented, but on a concept called recursion. The factorial of a number is calculated as 1 if the number is 0, and as N times the factorial of (N – 1) otherwise. LISP was in fact the first language to support recursion—meaning, the ability of a function to call itself again and again. It was also the first language to manage storage automatically, using garbage collection. Even more interesting, in LISP everything is a list, even a program. So in LISP, it is possible to develop a program that generates another program and executes it!
Another of my favorites is FORTH. Designed by Charles Moore in 1968, FORTH also supports self-modifying programs like LISP, and it is probably even more minimalist. FORTH is based on the concept of a stack, and operators push and pop data from this stack. It uses a postfix syntax, also named Reverse Polish Notation, like vintage Hewlett-Packard calculators. For example, 1 2 + . means “push 1 on the stack,” “push 2 on the stack,” “pop two values from the stack, add them, and push the result back on the stack,” and “pop a value from the stack and display it.”
Here is our factorial program in FORTH:
: fact dup 1 do I * loop ;
8 fact .
The first line defines a new function named fact, and the second line executes it after pushing the value 8 on the stack. The syntax is of course a bit strange due to the postfix notation, but it becomes clear after a while. Let’s start with 8 on the stack. The word dup duplicates the top of the stack. The do…loop structure takes the limit and the first index from the stack, so here it executes I * with I varying from 1 to 7, and each iteration multiplies the top of the stack by the index I. That’s it. You can try it using another web-based programming and simulation host: https://repl.it. Look at the result in Photo 3.
PHOTO 3: This is an example of FORTH in the Repl.it online compiler and simulator.
FUN WITH PROLOG & APL
LISP and FORTH are fun, but PROLOG is stranger. Developed by Alain Colmerauer and his team in 1972, PROLOG is the first of the so-called declarative languages. Rather than specifying an algorithm, such a declarative language defines facts and rules. It then lets the system determine whether another fact can be deduced from them. An example is in order.
LISTING 6: The PROLOG version is based on a completely different paradigm.
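A minimal PROLOG version along these lines (a sketch; the variable names match the description below, but the published listing may differ) is:

fact(X, 1) :- X < 2.
fact(X, F) :- X >= 2, XM1 is X - 1, fact(XM1, FM1), F is X * FM1.

:- fact(8, X), write(X), nl.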
Listing 6 is our factorial in PROLOG. The first fact states that the factorial of any number lower than 2 is 1. The second states that the factorial of any number X is F only if F is the product of X and another number, named here FM1, and if FM1 is the factorial of X – 1. This looks like recursion, and it is recursion, but expressed differently. The last line then states that X is the factorial of 8 and asks PROLOG to display X, and you have the result. This approach is confusing at first, but it is close to the needs of artificial intelligence algorithms.
Lastly, I can’t resist the pleasure of showing you another exotic vintage programming language, A Programming Language (APL). Refer to the factorial example in APL in Photo 4. I can’t even write it in the text of this article, because APL uses nonstandard characters.
In fact, APL-enabled computers had APL-specific keyboards! Published in 1962 by Kenneth Iverson (Harvard University and then IBM), it was first a mathematical notation and then a programming language. Based largely on data arrays, APL targets numerical calculations, so it isn’t a surprise that our factorial example is so compact in this language. Let’s understand it by reading the first line from right to left. The Greek letter omega is the parameter of the function (that is, 8 in this case). The small symbol just before the omega, called “iota,” generates a vector from 0 to N – 1, so here it generates 0 1 2 3 4 5 6 7. The 1+ adds one to each element of the array, which gives 1 2 3 4 5 6 7 8. Lastly, the ×/ multiplies all the values of the vector together, which is the factorial!
After finishing this article, I searched the web for other interesting languages and found, well, a more-than-impressive website. Launch your browser right now and enter http://rosettacode.org. These crazy guys have simply listed 837 programming tasks and let the community program each of them in as many programming languages as possible: no fewer than 648 different languages are referenced! Of course, I searched for a factorial calculation algorithm and found it. Versions of the factorial code for 220 different languages are provided! So you can find versions similar to the ones in this article, as well as versions in more recent languages (Java, Python, Perl, etc.). You will also find obscure languages.
My goal with this article was to show you that languages other than C and Java can be fun and even helpful for specific projects. Vintage languages are not dead. For example, it seems that FORTH was used for ESA’s Rosetta mission. Moreover, innovation in computing languages goes on, and new and exciting alternatives are proposed every month!
Don’t hesitate to play with and test programming languages. The web is an invaluable resource for discovering new tools, so have fun!
Robert Lacoste lives in France, between Paris and Versailles. He has 30 years of experience in RF systems, analog designs, and high-speed electronics. Robert has won prizes in more than 15 international design contests. In 2003 he started a consulting company, ALCIOM, to share his passion for innovative mixed-signal designs. Robert’s bimonthly Darker Side column has been published in Circuit Cellar since 2007.
In addition to making Web applications easier to write, Ur/Web also makes them more secure. “Let’s say you want to have a calendar widget on your Web page, and you’re going to use a library that provides the calendar widget, and on the same page there’s also an advertisement box that’s based on code that’s provided by the ad network,” Chlipala says. “What you don’t want is for the ad network to be able to change how the calendar works or the author of the calendar code to be able to interfere with delivering the ads.” Ur/Web automatically prohibits that kind of unauthorized access between page elements.
Unstructured programming with GOTO is the stuff of legend, as are calling subroutines by line number–GOSUB 1000–and setting global variables as a mechanism for passing parameters.
The little language that fueled the home computer revolution has been long buried beneath an avalanche of derision, or at least disregarded as a relic from primitive times. That’s too bad, because while the language itself has serious shortcomings, the overall 8-bit BASIC experience has high points that are worth remembering.
It’s hard to separate the language from the computers it ran on; flipping the power switch, even without a disk drive attached, resulted in a BASIC prompt. If nothing else, it could be treated as a calculator:
PRINT "seconds in a week: ",60*60*24*7
Notice how the cosine function is always available for use. No importing a library. No qualifying it with MATH.TRIG.
Or take advantage of this being a full programming language: