21 technologies transforming software development

A long time ago, developers wrote assembly code that ran fast and light. On good days, they had enough money in their budget to hire someone to toggle all those switches on the front of the machine to input their code. On bad days, they flipped the switches themselves. Life was simple: The software loaded data from memory, did some arithmetic, and sent it back. That was all.

Today, developers must work with teams spread across multiple continents where people speak different languages with different character sets and – this is the bad part – use different versions of the compiler. Some of the code is new, and some may be from decade-old libraries that may or may not come with source code. Building team spirit and slogging through the mess is only the beginning of what it means to be a programmer today.

Source: 21 technologies transforming software development | InfoWorld

The death of Ruby? Developers should learn these languages instead

Once the darling of the developer community, Ruby’s popularity has plummeted in the past few years, leading some tech leaders to wonder if the language may eventually die out completely.

The evidence is in the jobs: Java, JavaScript, .Net, HTML, and Python topped the list of languages found most often in tech job postings in the past year, according to Indeed, while Ruby came in far down the list, at No. 9.

In IEEE Spectrum’s ranking of the top programming languages, Ruby comes in at No. 12—down from No. 8 in 2014.

The lack of job prospects led coding bootcamp Coding Dojo to drop Ruby courses from all six of its campuses across the US by the end of the year, while adding a full-stack course in Java.

“We looked at local markets to see the most relevant technologies, and we found that Java was at the top of the charts, and Ruby on Rails seemed to rank much lower in demand in terms of startup positions, and general demand and interest,” said Speros Misirlakis, head of curriculum at Coding Dojo.

Source: The death of Ruby? Developers should learn these languages instead – TechRepublic

Vintage Programming Languages


For the last 30 years, C has been my programming language of choice. As you probably know, C was created in the early 1970s by Dennis M. Ritchie at Bell Labs and was used to rewrite the UNIX kernel on a DEC PDP-11 computer. I am probably a bit old-fashioned. Yes, C is outdated, but I’m simply addicted to it, like plenty of other embedded system programmers. For me, C is a low-level but portable language that’s adequate for all my professional and personal projects, ranging from optimized code on microcontrollers to signal processing or even PC software. I know that there are many powerful alternatives like Java and C++, but, well, I’m used to C.

C is not the only vintage programming language, and playing with some of the others is definitely fun. This month, I’ll present several vintage languages and show you that each has its pros and cons. Maybe you’ll find one of them helpful for a future project? I’m sure you won’t use COBOL in your next device, but what about FORTH or LISP? As you’ll see, thanks to web-based compilers and simulators, playing with programming languages is simple. And after you’ve finished with this review of 1970s-era computing technology, give one or two a try!

BASIC

Like many teenagers in the 1970s, I learned to program with Beginner’s All-purpose Symbolic Instruction Code (BASIC). In 1980, after some early experiments with programmable calculators, a friend let me try a Rockwell AIM-65 computer. An expanded version of the KIM-1, it had an impressive 1 KB of RAM and a BASIC interpreter in ROM. It was my first contact with a high-level programming language. I was really astonished. This computer seemed to understand me! “Print 1+1.” “Ok, that’s 2.” One year later, I bought my first computer, an Apple II. It came with a much more powerful BASIC interpreter in ROM, Applesoft BASIC. (This interpreter was developed for Apple by a small company named Microsoft, but that’s another story.)

PHOTO 1: An online emulator for my old Apple II

Now let’s launch an Apple II emulator and write some software for it. Look at Photo 1. Nice, isn’t it? This pretty emulator, developed in JavaScript by Will Scullin, is available online. Just launch it, enter this 10-line program, and then type “RUN”. It will calculate for you the factorial of eight: 8! = 1 × 2 × 3 × 4 × 5 × 6 × 7 × 8, which is 40,320.

Since its invention in 1964 at Dartmouth College, BASIC has been more of a concept than a well-specified language. Plenty of variants exist, all the way up to Microsoft’s Visual Basic. It has plenty of disadvantages, especially in its early versions: a lack of structured data and control constructs, mandatory line numbering, a lack of type checking, low speed, and so on. Nevertheless, it is ultra-simple to learn and to understand. Even if you have never used BASIC, you’ll understand the code shown in Photo 1 without any problem. The main program starts by initializing a variable N with the value 8. It then calls a subprogram that starts at line 100, displays the result F, and stops. The subprogram initializes F to 1 and multiplies it by each integer up to N. Straightforward.
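If you don’t have the photo in front of you, here is a minimal Applesoft-style reconstruction of the ten-line program just described (my own sketch; the exact listing in Photo 1 may differ in its details):

5 REM FACTORIAL OF N
10 N = 8
20 GOSUB 100
30 PRINT F
40 END
100 F = 1
110 FOR I = 1 TO N
120 F = F * I
130 NEXT I
140 RETURN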

C LANGUAGE

Let’s compare this BASIC with a C version of the same algorithm. For this article, I looked for online compilers and simulators, and I found a great option at www.ideone.com. Developed by Sphere Research Labs, it supports more than 60 programming languages. You can edit a program in any of them, compile it, and test it without having to install anything on your PC. This is great for experimenting.

PHOTO 2: At Ideone.com, you can enter, compile, and simulate numerous programming languages. Here you see the C version.

The C variant of the factorial algorithm is depicted in Photo 2. I could have used plenty of different approaches, but I tried to stay as close as possible to the “spirit” of C. So, how does it compare with BASIC? The code is significantly more structured, but a little harder to read. C aficionados love short forms like f*=i++ (which multiplies f by i and then increments i) even when they can be avoided. While this makes the code shorter and helps the compiler with optimization, it is probably cryptic to someone new to the language.
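Photo 2’s exact listing isn’t reproduced here, but a sketch in the same spirit, built around that compact f *= i++ idiom, might look like this:

#include <stdio.h>

int main(void)
{
    int n = 8;              /* compute n! */
    int f = 1, i = 1;
    while (i <= n)
        f *= i++;           /* multiply f by i, then increment i */
    printf("%d! = %d\n", n, f);
    return 0;
}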

Of course, C also has great strengths. In particular, it offers precise control over data types and memory representation, which helps for low-level programming. That’s probably why it has been so widely used for nearly 50 years.

FORTRAN & COBOL

Let’s stay in the 1970s. BASIC and assembly language were for hobbyists and experimenters. C was used by early UNIX programmers. The rest of the programming world was divided into two camps. Scientists used FORTRAN. The business world used COBOL.

FORTRAN (from FORmula TRANslation) was actually the first high-level programming language. Developed by an IBM team led by John Backus, the first version of FORTRAN was released in 1957 for the IBM 704 computer. It was followed by several incremental improvements: Fortran 66 (1966), Fortran 77, and Fortran 90, all the way up to Fortran 2008. Refer to Listing 1 for the factorial program using FORTRAN 77.

LISTING 1: This is the factorial program using FORTRAN 77.
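The original listing is an image; a FORTRAN 77 sketch along the lines described (my reconstruction, not the magazine’s exact code) could read:

C     FACTORIAL OF N IN FORTRAN 77 (FIXED-FORM SOURCE)
      PROGRAM FACT
      INTEGER N, F, I
      N = 8
      F = 1
      DO 10 I = 1, N
         F = F * I
   10 CONTINUE
      PRINT *, 'FACTORIAL OF', N, 'IS', F
      END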

It seems close to BASIC, right? That’s no surprise, as BASIC was in fact based on concepts from FORTRAN and from another vanished language, ALGOL. I’m sure you can read and understand the FORTRAN in Listing 1, but its equivalent in COBOL is a bit stranger (see Listing 2). I must admit that it took me some time to make it work, even after reading some COBOL tutorials on the web. COBOL is an acronym for Common Business-Oriented Language, so it is not exactly targeting applications like factorial calculation. It was developed in 1959 by a consortium named CODASYL, based on work by Grace Hopper. Even though its popularity is fading, COBOL is still alive. I even read that an object-oriented version was released in 2002 (COBOL 2002) and updated again in 2014.

LISTING 2: The COBOL version looks a little stranger, right?
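Again, the listing itself is an image; a plausible COBOL rendering of the same algorithm (my sketch, using the inline PERFORM of more recent COBOL dialects) is:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. FACTORIAL.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01 N PIC 99   VALUE 8.
       01 I PIC 99   VALUE 1.
       01 F PIC 9(9) VALUE 1.
       PROCEDURE DIVISION.
           PERFORM UNTIL I > N
               MULTIPLY I BY F
               ADD 1 TO I
           END-PERFORM
           DISPLAY "FACTORIAL OF " N " IS " F
           STOP RUN.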

PASCAL & FRIENDS

I never actually used FORTRAN or COBOL, but I developed software on my Apple II using PASCAL. Released in 1970 by Niklaus Wirth (ETH Zurich, Switzerland), PASCAL was probably one of the earliest efforts to encourage structured and typed programming. Based on ALGOL-W (also invented by Wirth), it was followed by MODULA-2 and OBERON, which were less well known but still influential.

Do you want to calculate a factorial in PASCAL? See Listing 3. It may look similar to FORTRAN or BASIC, but its advantages are in the details. PASCAL is a so-called strongly typed language. (You can’t add a tomato and a donut, unlike in C.) It also forbids unstructured programming, and it is very easy to read. PASCAL was a limited, but real, success. It was used in particular by Apple for the development of the Lisa computer as well as the first versions of the Macintosh. It is still in use today through one of its object-oriented descendants, DELPHI.

LISTING 3: This is the PASCAL version. Easy to read.
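For reference, here is a PASCAL sketch of the factorial described above (mine, not necessarily the exact code of Listing 3):

program Factorial;
{ computes 8! with a simple counted loop }
var
  n, i, f: integer;
begin
  n := 8;
  f := 1;
  for i := 1 to n do
    f := f * i;
  writeln(n, '! = ', f)
end.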

THE ADA STORY

In the 1970s, the United States Department of Defense (DoD) conducted a survey and found that it was using no fewer than 450 different programming languages. So it decided to define and develop yet another one: a new language to replace all of them. After long specification and selection phases, a proposal from Jean Ichbiah (CII Honeywell Bull, France) was selected. The result was ADA. The name ADA, and its military standard reference (MIL-STD-1815), honor Augusta Ada, Countess of Lovelace (1815–1852), who created what is often considered the first algorithm intended for a machine.

While ADA is, well, strongly typed and very powerful, it’s complex and quite boring to use (see Listing 4). The key advantage of ADA is that it is well standardized and supports constructs like concurrency. Thanks to its very formal syntax and type checking, it is nearly bug-proof. Based on my minimal experience, it is so strict that the first version of the code usually works, at least after you correct hundreds of compilation errors. That’s probably why it is still widely used for critical applications ranging from airplanes to military systems, even if it failed as a general-purpose language.

LISTING 4: ADA is more verbose.
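A minimal ADA sketch in the same vein (my reconstruction; note the mandatory declarations and the explicit image conversions for output):

with Ada.Text_IO;

procedure Factorial is
   N : constant Integer := 8;
   F : Integer := 1;
begin
   for I in 1 .. N loop
      F := F * I;  -- multiply the running product by each index
   end loop;
   Ada.Text_IO.Put_Line (Integer'Image (N) & "! =" & Integer'Image (F));
end Factorial;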

LISP & FORTH

ADA is a difficult language. In my opinion, LISP (LISt Processing) is far more interesting. It is an old story too. Designed around 1958 by John McCarthy (then at MIT), its concepts are still interesting to learn. McCarthy’s goal was to develop a simple language with full capabilities, which is quite the opposite of ADA. The result was LISP. The syntax can be frightening, but you must try it. Listing 5 is a version of the factorial calculation in LISP.

LISTING 5: LISP is definitely fun!
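The listing itself is an image; in Common Lisp, the recursive definition explained below might be written like this (my sketch):

; factorial by recursion: 0! = 1, otherwise n! = n * (n - 1)!
(defun fact (n)
  (if (= n 0)
      1
      (* n (fact (- n 1)))))

(print (fact 8)) ; prints 40320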

In LISP, everything is a list, and a list is enclosed between parentheses. To execute a function, you create a list with a pointer to the function as its first element, followed by the parameters. For example, (- n 1) is a list that calculates n – 1. (if A B C) is a structure that evaluates A and then evaluates either B or C based on the value of A. If you read this program, you will see that it is not based on a loop like all the other versions I’ve presented, but on a concept called recursion. The factorial of a number is calculated as 1 if the number is 0, and as N times the factorial of (N – 1) otherwise. LISP was in fact the first language to support recursion, meaning the ability of a function to call itself again and again. It was also the first language to manage storage automatically, using garbage collection. Even more interesting, in LISP everything is a list, even a program. So in LISP, it is possible to develop a program that generates another program and executes it!

Another of my favorites is FORTH. Designed by Charles Moore in 1968, FORTH supports self-modifying programs like LISP, and it is probably even more minimalist. FORTH is based on the concept of a stack, and operators push and pop data on this stack. It uses a postfix syntax, also named Reverse Polish Notation, like vintage Hewlett-Packard calculators. For example, 1 2 + . means “push 1 on the stack,” “push 2 on the stack,” “pop two values from the stack, add them, and push the result back on the stack,” and “pop a value from the stack and display it.”

Here is our factorial program in FORTH:

: fact dup 1 do I * loop ; 8 fact .

The first line defines a new word (FORTH’s term for a function) named fact, and the second line executes it after pushing the value 8 on the stack. The syntax is of course a bit strange because of the postfix notation, but it becomes clear after a while. Let’s start with 8 on the stack. The word dup duplicates the top of the stack. The do…loop structure takes the loop limit and the starting index from the stack, so it executes I * with I varying from 1 to 7, and each iteration multiplies the top of the stack by the index I. That’s it. You can try it using another web-based programming and simulation host: https://repl.it. Look at the result in Photo 3.

PHOTO 3: This is an example of FORTH in the Repl.it online compiler and simulator.

FUN WITH PROLOG & APL

LISP and FORTH are fun, but PROLOG is stranger still. Developed by Alain Colmerauer and his team in 1972, PROLOG is the first of the so-called declarative languages. Rather than specifying an algorithm, such a declarative language defines facts and rules. It then lets the system determine whether another fact can be deduced from them. An example will make this clearer.

LISTING 6: The PROLOG version is based on a completely different paradigm.
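Since the listing is an image, here is a sketch of the two clauses and the query as described below (my reconstruction; the variable name FM1 follows the article’s description):

% factorial(X, F): F is the factorial of X
factorial(X, 1) :- X < 2.
factorial(X, F) :-
    X >= 2,
    XM1 is X - 1,
    factorial(XM1, FM1),   % FM1 is the factorial of X - 1
    F is X * FM1.          % F is the product of X and FM1

% query:  ?- factorial(8, X), write(X).   prints 40320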

Listing 6 is our factorial in PROLOG. The first clause states that the factorial of any number lower than 2 is 1. The second states that the factorial of any number X is F only if F is the product of X and another number, named here FM1, and if FM1 is the factorial of X – 1. This looks like recursion, and it is recursion, but expressed differently. The last line then states that X is the factorial of 8 and asks PROLOG to display X, and you have the result. It is a disconcerting approach, but it is close to the needs of artificial intelligence algorithms.

Lastly, I can’t resist the pleasure of showing you another exotic vintage programming language: A Programming Language (APL). Refer to the factorial example in APL in Photo 4. I can’t even write it in the text of this article, because APL uses nonstandard characters.

PHOTO 4: APL looks great, right? Its unique keyboard alone is fun!

In fact, APL-enabled computers had APL-specific keyboards! Published in 1962 by Kenneth Iverson (Harvard University, then IBM), APL was first a mathematical notation and only later a programming language. Based largely on data arrays, APL targets numerical calculations, so it isn’t a surprise that our factorial example is so compact in this language. Let’s understand it by reading the first line from right to left. The Greek letter omega is the parameter of the function (that is, 8 in this case). The small symbol just before the omega, called “iota”, generates a vector from 0 to N – 1, so here it generates 0 1 2 3 4 5 6 7. The 1+ adds one to each element of the array, which gives 1 2 3 4 5 6 7 8. Lastly, the ×/ multiplies all the values of the vector together, which is the factorial!
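With today’s Unicode fonts, the expression can at least be typed here. Assuming index origin 0 (⎕IO←0), the line described above applied to 8 is essentially:

      ×/1+⍳8

Read right to left: ⍳8 yields 0 through 7, 1+ shifts that to 1 through 8, and ×/ multiplies the elements together, giving 40320.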

GET STARTED

After finishing this article, I searched the web for other interesting languages and found a more than impressive website. Launch your browser right now and enter http://rosettacode.org. These crazy guys have listed 837 programming tasks and let the community implement each of them in every programming language. Yes, all of them, and no fewer than 648 different languages are referenced! Of course, I searched for a factorial calculation algorithm and found it. Versions of the factorial code are provided for 220 different languages! There you can find versions similar to the ones in this article, as well as versions for more recent languages (Java, Python, Perl, etc.). You will also find plenty of obscure languages.

My goal with this article was to show you that languages other than C and Java can be fun and even helpful for specific projects. Vintage languages are not dead. For example, it seems that FORTH was used on the Rosetta space mission. Moreover, innovation in computing languages goes on, and new and exciting alternatives are proposed every month!

Don’t hesitate to play with and test programming languages. The web is an invaluable tool for discovering new tools, so have fun!

This article appears in Circuit Cellar 323.

Robert Lacoste lives in France, between Paris and Versailles. He has 30 years of experience in RF systems, analog design, and high-speed electronics. Robert has won prizes in more than 15 international design contests. In 2003 he started a consulting company, ALCIOM, to share his passion for innovative mixed-signal designs. Robert’s bimonthly Darker Side column has been published in Circuit Cellar since 2007.

Source: Vintage Programming Languages | Circuit Cellar

Taking the grunt work out of Web development

A Web page today is the result of a number of interacting components — like cascading style sheets, XML code, ad hoc database queries, and JavaScript functions. For all but the most rudimentary sites, keeping track of how these different elements interact, refer to each other, and pass data back and forth can be a time-consuming chore.

In a paper being presented at the Association for Computing Machinery’s Symposium on Principles of Programming Languages, Adam Chlipala, the Douglas Ross Career Development Professor of Software Technology, describes a new programming language, called Ur/Web, that lets developers write Web applications as self-contained programs. The language’s compiler — the program that turns high-level instructions into machine-executable code — then automatically generates the corresponding XML code and style-sheet specifications and embeds the JavaScript and database code in the right places.

In addition to making Web applications easier to write, Ur/Web also makes them more secure. “Let’s say you want to have a calendar widget on your Web page, and you’re going to use a library that provides the calendar widget, and on the same page there’s also an advertisement box that’s based on code that’s provided by the ad network,” Chlipala says. “What you don’t want is for the ad network to be able to change how the calendar works or the author of the calendar code to be able to interfere with delivering the ads.” Ur/Web automatically prohibits that kind of unauthorized access between page elements.

via Taking the grunt work out of Web development | MIT News.

Lost Lessons from 8-Bit BASIC

Unstructured programming with GOTO is the stuff of legend, as are calling subroutines by line number–GOSUB 1000–and setting global variables as a mechanism for passing parameters.

The little language that fueled the home computer revolution has been long buried beneath an avalanche of derision, or at least disregarded as a relic from primitive times. That’s too bad, because while the language itself has serious shortcomings, the overall 8-bit BASIC experience has high points that are worth remembering.

It’s hard to separate the language from the computers it ran on; flipping the power switch, even without a disk drive attached, resulted in a BASIC prompt. If nothing else, it could be treated as a calculator:

PRINT "seconds in a week: ",60*60*24*7

or

PRINT COS(2)/2

Notice how the cosine function is always available for use. No importing a library. No qualifying it with MATH.TRIG.

Or take advantage of this being a full programming language:

T = 0
FOR I=1 TO 10:T=T+I*I:NEXT I
PRINT T

via Lost Lessons from 8-Bit BASIC.

Stroustrup: Why the 35-year-old C++ still dominates ‘real’ dev

In an interview, C++ designer Bjarne Stroustrup says the programming language remains vital and relevant 35 years after he first designed it in 1979 because of its ability to handle complexity, making it the go-to solution for telecom, financial, and embedded applications and online systems such as Amazon and Google. Stroustrup says Google’s Go language, which has been receiving a great deal of attention, can “do a few things elegantly,” but loses “the edge in performance.” Stroustrup says he used C++ for projects that “required a real programming language and real performance,” by way of noting the language is more suitable for large-scale projects than small apps or hobbyists. Stroustrup says he is continuing to build out the capabilities of C++ with the release of a new minor edition, C++14, this year. The update offers several improvements, including new templates and better memory initialization. Asked what role security should play in software development, Stroustrup says, “security is a systems issue.” He also calls for greater professionalism among software programmers. “There are things in our society that mustn’t break, and most of them depend on software,” he says.

Stroustrup: Why the 35-year-old C++ still dominates ‘real’ dev | Application development – InfoWorld.

PHP gets its own formal language specification

Although the PHP scripting language has been around since 1995 and is a staple of Web development, it does not actually have a formal language specification — just extensive user documentation. But that is all set to change.

Led by Facebook, a draft specification has been posted on GitHub to provide a complete definition of PHP language semantics and syntax.

via PHP gets its own formal language specification | Php web – InfoWorld.

Why Apple’s Swift Language Will Instantly Remake Computer Programming

The language is called Swift, and on June 2, Apple released a test version to coders outside the company, billing it as a faster and more effective means of building software apps for iPhones, iPads, and Macs.

via Why Apple’s Swift Language Will Instantly Remake Computer Programming | Enterprise | WIRED.

Python bumps off Java as top learning language

I think that Python is a great learning language. The constructs are easy and natural, and students end up “fighting” more with the overall algorithm implementation (which is good) than with the details of system management.
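To make that concrete (my example, not the survey’s): the factorial that takes ten lines of BASIC in the vintage-languages piece above is a three-line loop in Python, and the standard library even provides it directly.

# the same factorial example as in the vintage-languages article above
f = 1
for i in range(1, 9):   # i runs from 1 through 8
    f *= i
print(f)                # 40320

(The one-liner import math; print(math.factorial(8)) gives the same result.)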

Python has surpassed Java as the top language used to introduce U.S. students to programming and computer science, according to a recent survey posted by the Association for Computing Machinery (ACM).

Eight of the top 10 computer science departments now use Python to teach coding, as well as 27 of the top 39 schools, indicating that it is the most popular language for teaching introductory computer science courses, according to Philip Guo, a computer science researcher who compiled the survey for ACM.

The three largest, most popular online class providers — Coursera, edX and Udacity — also offer introductory programming courses in Python, Guo found.

Python has been growing in popularity in the educational realm for at least the past few years, though this survey is the first to show it has eclipsed Java, which has been the dominant teaching language for the past decade, Guo said in a blog post about his survey.

via Python bumps off Java as top learning language – Computerworld.