21 technologies transforming software development

A long time ago, developers wrote assembly code that ran fast and light. On good days, they had enough money in their budget to hire someone to toggle all those switches on the front of the machine to input their code. On bad days, they flipped the switches themselves. Life was simple: The software loaded data from memory, did some arithmetic, and sent it back. That was all.

Today, developers must work with teams spread across multiple continents where people speak different languages with different character sets and – this is the bad part – use different versions of the compiler. Some of the code is new, and some may be from decade-old libraries that may or may not come with source code. Building team spirit and slogging through the mess is only the beginning of what it means to be a programmer today.

Source: 21 technologies transforming software development | InfoWorld

The death of Ruby? Developers should learn these languages instead

Once the darling of the developer community, Ruby’s popularity has plummeted in the past few years, leading some tech leaders to wonder if the language may eventually die out completely.

The evidence is in the jobs: Java, JavaScript, .Net, HTML, and Python topped the list of languages found most often in tech job postings in the past year, according to Indeed, while Ruby came in far down the list, at No. 9.

In IEEE Spectrum’s ranking of the top programming languages, Ruby comes in at No. 12—down from No. 8 in 2014.

The lack of job prospects led coding bootcamp Coding Dojo to drop Ruby courses from all six of its campuses across the US by the end of the year, while adding a full-stack course in Java.

“We looked at local markets to see the most relevant technologies, and we found that Java was at the top of the charts, and Ruby on Rails seemed to rank much lower in demand in terms of startup positions, and general demand and interest,” said Speros Misirlakis, head of curriculum at Coding Dojo.

Source: The death of Ruby? Developers should learn these languages instead – TechRepublic

Larger than proof

David Hilbert, the great German mathematician (he died in 1943), had a stupendous, dazzling vision. He hoped and believed that some day mathematicians would construct one vast formal deductive system with axioms so powerful that every possible theorem in all of mathematics could be proved true or false. Such a system would have to be both consistent and complete. Consistent means it is impossible to prove both a statement and its negation. Complete means that every statement in the system can be proved true or false.

In 1931, to the astonishment of mathematicians, a shy, reclusive Austrian, Kurt Gödel, aged twenty-five, shattered Hilbert’s magnificent dream. Gödel showed that any formal system rich enough to include arithmetic and elementary logic could not be both consistent and complete. If consistent, it would contain an infinity of true statements that could not be proved by the system’s axioms. What is worse, even the consistency of such a system cannot be established by reasoning within the system. “God exists,” a mathematician remarked, “because mathematics is consistent, and the devil exists because we can never prove it.”

I recall a cartoon by Robert Mankoff which shows a man in a restaurant examining his bill. He is saying to the puzzled waiter: “The arithmetic seems correct, yet I find myself haunted by the idea that the basic axioms on which arithmetic is based might give rise to contradictions that would then invalidate these computations.”

Fortunately, arithmetic can be shown consistent, but only by going outside it to a larger system. Alas, the larger system can’t be proved consistent without going to a still larger system. Many formal systems less complex than arithmetic, such as simple logic and even arithmetic without multiplication and division, can be proved consistent and complete without going beyond the system. But on levels that include all of arithmetic, the need for meta-systems to prove completeness and consistency never ends. There is no final system, such as Hilbert longed for, that captures all of mathematics. “Truth,” as the authors of this new book encapsulate it, “is larger than proof.”

Source: Larger than proof | The New Criterion

Ransomware Attack Sweeps Globe

A major global cyber-attack disrupted computers at Russia’s biggest oil company, Ukrainian banks and multinational firms with a virus similar to the ransomware that infected more than 300,000 computers last month.

The rapidly spreading cyber extortion campaign, which began on Tuesday, underscored growing concerns that businesses have failed to secure their networks from increasingly aggressive hackers, who have shown they are capable of shutting down critical infrastructure and crippling corporate and government networks.

Source: Ransomware Attack Sweeps Globe, Researchers See WannaCry Link | Technology News

Global ransomware attack causes chaos – BBC News

Companies across the globe are reporting that they have been struck by a major ransomware cyber-attack. British advertising agency WPP is among those to say its IT systems have been disrupted as a consequence.

Ukrainian firms, including the state power distributor and Kiev’s main airport, were among the first to report issues.

Experts suggest the malware is taking advantage of the same weaknesses used by the WannaCry attack last month.

Source: Global ransomware attack causes chaos – BBC News

Personal details of nearly 200 million US citizens exposed

From: http://www.bbc.com/news/technology-40331215

Sensitive personal details relating to almost 200 million US citizens have been accidentally exposed by a marketing firm contracted by the Republican National Committee.

The 1.1 terabytes of data includes birthdates, home addresses, telephone numbers and political views of nearly 62% of the entire US population.

The data was available on a publicly accessible Amazon cloud server.

Anyone could access the data as long as they had a link to it.

Political biases exposed

The huge cache of data was discovered last week by Chris Vickery, a cyber-risk analyst with security firm UpGuard. The information seems to have been collected from a wide range of sources – from posts on controversial banned threads on the social network Reddit, to committees that raised funds for the Republican Party.

The information was stored in spreadsheets uploaded to a server owned by Deep Root Analytics. It had last been updated in January when President Donald Trump was inaugurated and had been online for an unknown period of time.

“We take full responsibility for this situation. Based on the information we have gathered thus far, we do not believe that our systems have been hacked,” Deep Root Analytics’ founder Alex Lundry told technology website Gizmodo.

“Since this event has come to our attention, we have updated the access settings and put protocols in place to prevent further access.”

Apart from personal details, the data also contained citizens’ suspected religious affiliations, ethnicities and political biases, such as where they stood on controversial topics like gun control, the right to abortion and stem cell research.

The file names and directories indicated that the data was meant to be used by influential Republican political organisations. The idea was to try to create a profile on as many voters as possible using all available data, so some of the fields in the spreadsheets were left empty if an answer could not be found.

“That such an enormous national database could be created and hosted online, missing even the simplest of protections against the data being publicly accessible, is troubling,” Dan O’Sullivan wrote in a blog post on UpGuard’s website.

“The ability to collect such information and store it insecurely further calls into question the responsibilities owed by private corporations and political campaigns to those citizens targeted by increasingly high-powered data analytics operations.”

Privacy concerns

Although it is known that political parties routinely gather data on voters, this is the largest breach of electoral data in the US to date, and privacy experts are concerned about the sheer scale of the data gathered.

“This is deeply troubling. This is not just sensitive, it’s intimate information, predictions about people’s behaviour, opinions and beliefs that people have never decided to disclose to anyone,” Privacy International’s policy officer Frederike Kaltheuner told the BBC News website.

However, the issue of data collection and using computer models to predict voter behaviour is not just limited to marketing firms – Privacy International says that the entire online advertising ecosystem operates in the same way.

“It is a threat to the way democracy works. The GOP [Republican Party] relied on publicly-collected, commercially-provided information. Nobody would have realised that the data they entrusted to one organisation would end up in a database used to target them politically.

“You should be in charge of what is happening to your data, who can use it and for what purposes,” Ms Kaltheuner added.

There are fears that leaked data can easily be used for nefarious purposes, from identity fraud to harassment of people under protection orders, or to intimidate people who hold an opposing political view.

“The potential for this type of data being made available publicly and on the dark web is extremely high,” Paul Fletcher, a cyber-security evangelist at security firm Alert Logic, told the BBC.

Interesting article about Lie Groups

Math Has No God Particle

Ten years ago, Jeffrey Adams, a mathematician at the University of Maryland, made an appearance in The New York Times that prompted a series of angry emails. His correspondents all wanted to know one thing: “Who the hell do you think you are?”

Who Adams is: the leader of a cutting-edge mathematical research project called the Atlas of Lie Groups and Representations. Lie groups are named after Norwegian mathematician Sophus Lie (rhymes with “free,” not “fry”), who studied these crucial mathematical objects. Lie groups are used to map the inner machinery of multidimensional symmetrical objects, and they’re important because symmetry underpins far-flung mathematical concepts, from a third-grade number line to many-dimensional string theory. The Atlas project is a bona fide atlas of these objects, an exhaustive compendium of Lie group information, including tables of data about what they “look” like and what makes them tick. You’d think that cracking the code on these fundamental mathematical ideas would be a big deal. It is, but Adams would rather not dwell on it.

The success of the atlas project poses a tough math problem of a different kind: What should math’s relationship be with the broader, non-expert public? On the one hand, mathematicians in particular and scientists in general relish publicity. It allows them to trumpet good work, lobby for funding and inspire the next generation. On the other, in an ultra-specialized field such as math, publicity can twist finely constructed theorems, proofs and calculations beyond recognition.

In 2007, just before the angry emails started to roll in, the atlas group cleared an early hurdle in its quest, mapping an exotic and supersymmetric Lie group known as E8. They still had years of work before they could declare the atlas complete, yet the milestone was celebrated with a splashy press release from the American Institute of Mathematics explaining that the calculation, “if written out in tiny print, would cover an area the size of Manhattan.” It also provided a pretty picture of the “root system” of E8.

The combination of a superlative calculation and an eye-popping visualization was viral mathematical fuel. The New York Times wrote excitedly that the E8 calculation “may underlie the Theory of Everything that physicists seek to describe the universe.” (Everything! The universe!) E8 news bounced around the internet for months. “All hell broke loose,” Adams said. “We got this incredible tsunami of publicity, and it was only a sort of preliminary, intermediate result. Some people thought it was distasteful.”

Who the hell did he think he was?

“Mathematicians are extremely reluctant to publicize what they do,” Adams said. “The immediate reaction from 90 percent of mathematicians is, ‘It’s too hard, there’s no point in trying to write about this in the popular press.’” (Yet here we are.)

The atlas work, far from complete even amid the tsunami, continued apace. About two months ago — 15 years after it began — the project was finally completed. Adams and his colleagues released Version 1.0 of their atlas software.

This time around, however, there’s been no press release, no pretty picture, no city-size braggadocio, no New York Times story. Adams and his team haven’t trumpeted this latest accomplishment at all. When I reached him at his home, he summarized the milestone plainly, but proudly, in the jargon of his field: “We can now compute the Hermitian form on any irreducible representation.”

Raphaël Rouquier, a mathematician and Lie theorist at UCLA, echoed the ticklish relationship between mathematicians and the press. “There is a general feeling in the pure math community that popularizing mathematics is betraying mathematics,” Rouquier said. But he also argued for the importance of getting the word out. “I think there’s a need for mathematics to be represented in the press,” he said. “And I think we live in a society where people need to be more exposed to science. It’s good for politicians and readers.” The last few decades, up to and including the atlas, have been “an amazing chapter of mathematics,” he said.

Still, for those who do want to bullhorn their research, the difficulty of translation remains, especially compared to the other hard sciences. “We’re not trying to describe the real world,” Rouquier said.

Ah, but then should those of us in the real world care? The hope may be that other scientists, and the rest of us who don’t care about 248-dimensional objects, may profit from this math, but there’s no guarantee. Pure mathematicians do their work with no expectation of concrete application, although applications do have a way of presenting themselves when one least expects it — and often after the mathematician is long dead. In the case of the atlas, symmetry plays an important role in math, but also in physics and biology and astronomy. “There’s always symmetry underlying various systems,” Adams said. “Generally, mathematicians can’t say that what we’re working on is going to be good for society or something,” he said. “Our strong belief is that over time, as we learn these things, we wind up finding applications.”

David Vogan, who’s a mathematician at MIT and was involved with the atlas project, described academic mathematics as a garden. There are showy, flowery fields like number theory. Its beautiful problems and elegant results, such as the prime gap or Fermat’s last theorem, are math’s orchids. There are also the tomatoes — the things you can eat out of the garden, the practical yield. These disciplines, like Fourier analysis with its concrete applications to signal processing of audio, radio and light waves, are businesslike. And then there are the disciplines, often unheralded, that keep the rest of the garden growing — the hoes, the sprinklers. Lie groups, their representations and the atlas project are an example.

“Representation theory,” the field of the atlas group’s research, “is the fertilizer or the rose trellis, depending on the day of the week,” Vogan said.

Even when researchers do want their work shared widely, why don’t we read more about the fuel that makes math grow? “The physicists tell exciting stories,” Vogan said. “In some ways, this is a failure of mathematicians to tell exciting stories.” The physicists also have better names. Black hole and God particle quicken the pulse somewhat more than “irreducible unitary representation.”

The asymmetry in storytelling between math and the other sciences may also be because the research has different start-up costs. You need billions of dollars to build an enormous tunnel to house a particle accelerator to discover evidence of the God particle, also known as the Higgs boson. A good story may secure you coverage, enthusiasm and, if you’re lucky, lots of cash. To map Lie groups, Vogan said, you just need a teaching load light enough to put in extra work on the weekends: “We can do these things with small amounts of money.”

Even after the release of Version 1.0, and even in relative silence, the group has no intention of stopping, and it’ll continue to unravel the symmetric secrets of the mathematical universe. “It’s never complete,” Adams said. “There’s much more to do. I’ll die before I’m happy with everything.”

Source: Math Has No God Particle | FiveThirtyEight

WikiLeaks Dumps CIA Patient Zero Windows Implant

WikiLeaks on Thursday made public a CIA implant that is used to turn a Windows file server into a malware distribution point on the local network.

The documents describing the tool, Pandemic, explain how remote machines on the local network trying to download and/or execute documents from the file server over SMB are infected with “replacement” documents on the fly. The implant swaps out the document with a Trojanized version while it’s in transit, never touching the original document on the file server.

The documentation that was leaked yesterday spans from January 2014 to April 2014 and is for versions 1.0 and 1.1.

The leak is just the latest batch of CIA tools to be dumped on the internet by the polarizing whistleblower outfit, which has every Friday since March—save last week—put CIA documents and attacks online for public consumption.

In between are the ShadowBrokers, pouring more gasoline on this information-based firestorm by promising monthly leaks of not only NSA-built exploits targeting browsers, handsets and Windows 10 computers, but also stolen data allegedly from China, Iran, Russia and North Korea’s nuclear and missile programs.

The ShadowBrokers have already leaked their share of Windows-based exploits and vulnerabilities, the most worrisome being an April disclosure of SMB flaws and attacks that had been patched by Microsoft in March after it was allegedly tipped off by the NSA. One of those SMB exploits, EternalBlue, was of course used to launch and spread the WannaCry ransomware attacks three weeks ago today.

The ShadowBrokers also had their turn in the spotlight this week, announcing a pricing structure and delivery schedule for their so-called Monthly Dump Service.

The Pandemic leak does not explain what the CIA’s initial infection vector is, but does describe it as a persistent implant.

“As the name suggests, a single computer on a local network with shared drives that is infected with the ‘Pandemic’ implant will act like a ‘Patient Zero’ in the spread of a disease,” WikiLeaks said in its summary description. “‘Pandemic’ targets remote users by replacing application code on-the-fly with a Trojaned version if the program is retrieved from the infected machine.”

The key to evading detection is its ability to modify or replace requested files in transit, hiding its activity by never touching the original file. The new attack then executes only on the machine requesting the file.

Version 1.1 of Pandemic, according to the CIA’s documentation, can target and replace up to 20 different files with a maximum size of 800MB for a single replacement file.

“It will infect remote computers if the user executes programs stored on the pandemic file server,” WikiLeaks said. “Although not explicitly stated in the documents, it seems technically feasible that remote computers that provide file shares themselves become new pandemic file servers on the local network to reach new targets.”

The CIA describes Pandemic as a tool that runs as kernel shellcode that installs a file system filter driver. The driver is used to replace a file with a payload when a user on the local network accesses the file over SMB.

“The goal of Pandemic is to be installed on a machine where the remote users use SMB to download/execute PE (portable executable) files,” the documentation says. “Users that are targeted by Pandemic, and use SMB to download the targeted file, will receive the ‘replacement’ file.”

Source: WikiLeaks Dumps CIA Patient Zero Windows Implant | Threatpost | The first stop for security news

Vintage Programming Languages

For the last 30 years, C has been my programming language of choice. As you probably know, C was invented in the early 1970s by Dennis M. Ritchie for the first UNIX kernel and ran on a DEC PDP-11 computer. I am probably a bit old-fashioned. Yes, C is outdated, but I’m simply addicted to it, like plenty of other embedded system programmers. For me, C is a low level but portable language that’s adequate for all my professional and personal projects ranging from optimized code on microcontrollers to signal processing or even PC software. I know that there are many powerful alternatives like Java and C++, but, well, I’m used to C.

C is not the only vintage programming language, and playing with some others is definitely fun. This month, I’ll present several vintage languages and show you that each language has its pros and cons. Maybe you’ll find one of them helpful for a future project? I’m sure you won’t use COBOL in your next device, but what about FORTH or LISP? As you’ll see, thanks to web-based compilers and simulators, playing with programming languages is simple. And after you’re finished with this review of 1970s-era computing technology, give one or two a try!

BASIC

Like many teenagers in the 1970s, I learned to program with Beginner’s All-purpose Symbolic Instruction Code (BASIC). In 1980, after some early tests with programming calculators, a friend let me try a Rockwell AIM-65 computer. An expanded version of the KIM-1, it had an impressive 1 KB of RAM and a BASIC interpreter in ROM. It was my first contact with a high-level programming language. I was really astonished. This computer seemed to understand me! “Print 1+1.” “Ok, that’s 2.” One year later, I bought my first computer, an Apple II. It came with a much more powerful BASIC interpreter in ROM, Applesoft BASIC. (This interpreter was developed for Apple by a small company named Microsoft, but that’s another story.)

PHOTO 1: An online emulator for my old Apple II

Now let’s launch an Apple II emulator and write some software for it. Look at Photo 1. Nice, isn’t it? This pretty emulator, developed in JavaScript by Will Scullin, is available online. Just launch it, enter this 10-line program, and then type “RUN”. It will calculate for you the factorial of eight: 8! = 1 × 2 × 3 × 4 × 5 × 6 × 7 × 8, which is 40,320.

Since its invention in 1964 at Dartmouth College, BASIC has been more of a concept than a well-specified language. Plenty of variants exist, up to Microsoft’s Visual Basic. But it has plenty of disadvantages, especially in its early versions: a lack of structured data and controls, mandatory line numbering, a lack of type checking, low speed, and so on. Nevertheless, it is ultra-simple to learn and to understand. Even if you have never used BASIC, you’ll understand the code shown in Photo 1 without any problem. The main program starts by initializing a variable N with the value 8. It then calls a subprogram that starts at line 100, displays the result F, and stops. The subprogram initializes F to 1 and multiplies the result by each integer up to N. Straightforward.
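
Photo 1 shows the listing only as a screenshot, so here is a plausible Applesoft-style reconstruction of the ten-line program described above (the exact line numbers and layout are my guess):

10 REM FACTORIAL OF N
20 N = 8
30 GOSUB 100
40 PRINT F
50 END
100 F = 1
110 FOR I = 1 TO N
120 F = F * I
130 NEXT I
140 RETURN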

C LANGUAGE

Let’s compare this BASIC with a C version of the same algorithm. For this article, I looked for online compilers and simulators. I found a great option at www.ideone.com, which was developed by Sphere Research Labs and supports more than 60 programming languages. You can edit a program in any of them, compile it, and test it without having to install anything on your PC. This is great for experimenting.

PHOTO 2: At Ideone.com, you can enter, compile, and simulate numerous programming languages. Here you see C language.

The C variant of the factorial algorithm is depicted in Photo 2. I could have used plenty of different approaches, but I tried to stay as close as possible to the “spirit” of C. So, how does it compare with BASIC? The code is significantly more structured, but a little harder to read. C aficionados love short forms like f*=i++ (which multiplies f by i and then increments i) even when they can be avoided. While this makes the code shorter and helps the compiler with optimization, it is probably cryptic to someone new to the language.
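
Photo 2 is also just a screenshot; a minimal sketch in the same spirit, built around the f*=i++ idiom, might look like this (the variable names are my guess):

#include <stdio.h>

int main(void)
{
    int n = 8;              /* compute n! */
    long f = 1;
    int i = 1;
    while (i <= n)
        f *= i++;           /* multiply f by i, then increment i */
    printf("%ld\n", f);     /* prints 40320 */
    return 0;
}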

Of course, C also has great strengths. In particular, it offers you precise control of data types and memory representation, which helps for low-level programming. That’s probably why it has been used so widely for nearly 50 years.

FORTRAN & COBOL

Let’s stay in the 1970s. BASIC or assembly language was for hobbyists and experimenters. C was used by early UNIX programmers. The rest of the programming world was divided into two camps. Scientists used FORTRAN. Business leaders used COBOL.

FORTRAN (from FORmula TRANslation) was actually the first high-level programming language. Developed by an IBM team led by John Backus, the first version of FORTRAN was released in 1957 for the IBM 704 computer. It was followed by several incremental improvements: Fortran 66 (1966), Fortran 77, and Fortran 90, all the way up to Fortran 2008. Refer to Listing 1 for the factorial program using FORTRAN 77.

LISTING 1: This is the factorial program using FORTRAN 77.
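
The listing appears only as an image here, so what follows is a plausible FORTRAN 77 rendering of the same algorithm (fixed-form layout; the names are my guess):

      PROGRAM FACT
      INTEGER N, I, F
      N = 8
      F = 1
      DO 10 I = 1, N
         F = F * I
   10 CONTINUE
      WRITE (*,*) F
      END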

It seems close to BASIC, right? That’s no surprise, as BASIC was in fact based on concepts from FORTRAN and from another language that has since disappeared, ALGOL. I’m sure that you are able to read and understand the FORTRAN in Listing 1, but its equivalent in COBOL is a bit stranger (see Listing 2). I must admit that it took me some time to make it work, even after reading some COBOL tutorials on the web. COBOL is an acronym for Common Business-Oriented Language, so it is not exactly targeting an application like a factorial calculation. It was developed in 1959 by a consortium named CODASYL, based on work by Grace Hopper. Even though its popularity is fading, COBOL is still alive. I even read that an object-oriented version was released in 2002 (COBOL 2002) and upgraded again in 2014.

LISTING 2: The COBOL version looks a little stranger, right?
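
Again, the listing itself is an image; a rough COBOL sketch of the same factorial might read (the data names are my guess):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. FACTORIAL.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01 N PIC 9(2) VALUE 8.
       01 I PIC 9(2) VALUE 1.
       01 F PIC 9(9) VALUE 1.
       PROCEDURE DIVISION.
           PERFORM UNTIL I > N
               MULTIPLY I BY F
               ADD 1 TO I
           END-PERFORM
           DISPLAY F
           STOP RUN.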

PASCAL & FRIENDS

I never actually used FORTRAN or COBOL, but I developed software on my Apple II using PASCAL. Released in 1970 by Niklaus Wirth (ETH Zurich, Switzerland), PASCAL was probably one of the earliest efforts to encourage structured and typed programming. Based on ALGOL-W (also invented by Wirth), it was followed by MODULA-2 and OBERON, which were less known but still influential.

Do you want to calculate a factorial in PASCAL? Here it is in Listing 3. It may look similar to FORTRAN or BASIC, but its advantages are in the details. PASCAL is a so-called strongly typed language. (You can’t add a tomato and a donut, contrary to C.) It also forbids unstructured programming, and it is very easy to read. PASCAL was a limited, but true, success. It was used in particular by Apple for the development of the Lisa computer as well as the first versions of the Macintosh. It is still in use today through one of its object-oriented descendants, DELPHI.

LISTING 3: This is the PASCAL version. Easy to read.
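
For the same reason, here is a plausible PASCAL reconstruction of Listing 3:

program Factorial;
var
  n, i, f: integer;
begin
  n := 8;
  f := 1;
  for i := 1 to n do
    f := f * i;
  writeln(f)  { prints 40320 }
end.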

THE ADA STORY

In the 1970s, the United States Department of Defense (DoD) conducted a survey and found that it was using no fewer than 450 different programming languages. So, it decided to define and develop yet another one—that is, a new language to replace all of them. After long specification and selection phases, a proposal from Jean Ichbiah (CII Honeywell Bull, France) was selected. The result was ADA. The name ADA, and its military standard reference (MIL-STD-1815), are in memory of Augusta Ada, Countess of Lovelace (1815–1852), who created one of the first actual algorithms intended for a machine.

While ADA is, well, strongly typed and very powerful, it’s complex and quite boring to use (see Listing 4). The key advantage of ADA is that it is well standardized and supports constructs like concurrency. Thanks to its very formal syntax and type checking, it is nearly bug-proof. Based on my minimal experience, it is so strict that the first version of the code usually works, at least after you correct hundreds of compilation errors. That’s probably why it is still largely used for critical applications ranging from airplanes to military systems, even if it failed as a generic language.

LISTING 4: ADA is more verbose.
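
Listing 4 is an image as well; an ADA sketch of the same computation could look like this (the names are my choice):

with Ada.Text_IO;

procedure Factorial is
   N : constant Integer := 8;
   F : Integer := 1;
begin
   for I in 1 .. N loop
      F := F * I;
   end loop;
   Ada.Text_IO.Put_Line (Integer'Image (F));  -- prints 40320
end Factorial;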

LISP & FORTH

ADA is a difficult language. In my opinion, LISP (List Processing) is far more interesting. It is an old story too. Designed in 1960 by John McCarthy (then at MIT), its concepts are still interesting to learn. McCarthy’s goal was to develop a simple language with full capabilities. That’s quite the opposite of ADA. The result was LISP. The syntax can be frightening, but you must try it. Listing 5 is a version of the factorial calculation in LISP.

LISTING 5: LISP is definitely fun!
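
The listing is an image here too; based on the description below, the recursive definition presumably resembles this Common Lisp sketch:

(defun fact (n)
  (if (= n 0)
      1
      (* n (fact (- n 1)))))

(print (fact 8)) ; prints 40320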

In LISP, everything is a list, and a list is enclosed between parentheses. To execute a function, you have to create a list with a pointer to the function as a first element and then the parameters. For example, (- n 1) is a list that calculates n – 1. (if A B C) is a structure which evaluates A, and then evaluates either B or C based on the value of A. If you read this program, you will see that it is not based on a loop like all other versions I’ve presented, but on a concept called recursion. A factorial of a number is calculated as 1 if the number is 0, and as N times the factorial of (N – 1) otherwise. LISP was in fact the first language to support recursion—meaning, the possibility for a function to call itself again and again. It is also the first language to manage storage automatically, using garbage collection. Even more interesting, in LISP everything is a list, even a program. So in LISP, it is possible to develop a program that generates a program and executes it!

Another of my favorites is FORTH. Designed by Charles Moore in 1968, FORTH also supports self-modifying programs like LISP, and it is probably even more minimalist. FORTH is based on the concept of a stack, and operators push and pop data from this stack. It uses a postfix syntax, also named Reverse Polish Notation, like vintage Hewlett-Packard calculators. For example, 1 2 + . means “push 1 on the stack,” “push 2 on the stack,” “get two numbers from the stack, add them and put the result back on the stack,” and “get a number from the stack and display it.”

Here is our factorial program in FORTH:

: fact dup 1 do I * loop ; 8 fact .

The first line defines a new function named fact, and the second line executes it after pushing the value 8 on the stack. The syntax is of course a bit strange due to the postfixing, but it becomes clear after a while. Let’s start with 8 on the stack. The command dup duplicates the top of the stack. The do…loop structure takes the limit and the starting index from the stack, so it executes I * with I varying from 1 to 7, and each iteration multiplies the top of the stack by the index I. That’s it. You can try it using another web-based programming and simulation host: https://repl.it. Look at the result in Photo 3.
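
Spread over a few lines with a stack-effect comment, the same definition may be easier to follow (the comments are mine):

\ factorial: consumes n, leaves n! (for the 8 used here)
: fact ( n -- n! )
  dup 1 do  I *  loop ;

8 fact .    \ prints 40320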

PHOTO 3: This is an example of FORTH in the Repl.it online compiler and simulator.

FUN WITH PROLOG & APL

LISP and FORTH are fun, but PROLOG is stranger. Developed by Alain Colmerauer and his team in 1972, PROLOG is the first of the so-called declarative languages. Rather than specifying an algorithm, such a declarative language defines facts and rules. It then lets the system determine if another fact can be deduced from them. An example is welcome.

LISTING 6: The PROLOG version is based on a completely different paradigm.
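
Since Listing 6 is also an image, here is a sketch in standard Prolog of the two facts described below (FM1 follows the description; the rest of the naming is mine):

fact(X, 1) :- X < 2.
fact(X, F) :- X >= 2, Y is X - 1, fact(Y, FM1), F is X * FM1.

:- fact(8, X), write(X), nl.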

Listing 6 is our factorial in PROLOG. The first fact states that the factorial of any number lower than 2 is 1. The second fact states that the factorial of any number X is F only if F is the product of X and another number, named here FM1, and if FM1 is the factorial of X – 1. This looks like recursion, and it is recursion, but expressed differently. The last line then states that X is the factorial of 8 and asks PROLOG to display X, and you will have the result. This is a confusing approach at first, but it is close to the needs of artificial intelligence algorithms.

Lastly, I can’t resist the pleasure of showing you another exotic vintage programming language, A Programming Language (APL). Refer to the factorial example in APL in Photo 4. I can’t even write it in the text of this article, because APL uses nonstandard characters.

PHOTO 4: APL looks great, right? Its unique keyboard alone is fun!

In fact, APL-enabled computers had APL-specific keyboards! First published in 1962 by Kenneth Iverson (Harvard University and then IBM), it was initially a mathematical notation and only later a programming language. Based largely on data arrays, APL targets numerical calculations, so it isn’t a surprise that our factorial example is so compact in this language. Let’s understand it by reading the first line from right to left. The omega Greek symbol is the parameter of the function (that is, 8 in this case). The small symbol just before the omega, called “iota,” generates a vector from 0 to N – 1, so here it generates 0 1 2 3 4 5 6 7. The 1+ adds one to each element of the array, which gives 1 2 3 4 5 6 7 8. Lastly, the ×/ multiplies all the values of the vector together, which is the factorial!
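
With modern Unicode APL glyphs the expression from Photo 4 can in fact be typed; assuming index origin 0, as in the description above, a Dyalog-style sketch would be:

fact ← {×/1+⍳⍵}
fact 8    ⍝ prints 40320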

GET STARTED

After finishing this article, I searched the web for other interesting languages and found, well, a more than impressive website. Launch your browser right now and enter http://rosettacode.org. These crazy guys have simply listed 837 programming tasks and let the community program each of them in all programming languages. Yes, all of them: no fewer than 648 different languages are referenced! Of course, I searched for a factorial calculation algorithm and found it. Versions of the factorial code for 220 different languages are provided! So you can find versions similar to the ones I provided in this article, as well as versions for more recent languages (Java, Python, Perl, etc.). You will also find obscure languages.

My goal with this article was to show you that languages other than C and JAVA can be fun and even helpful for specific projects. Vintage languages are not dead. For example, it seems that FORTH was used for ESA’s Rosetta mission. Moreover, innovation in computing languages goes on, and new and exciting alternatives are proposed every month!

Don’t hesitate to play with and test programming languages. The web is an invaluable tool for discovering new tools, so have fun!

This article appears in Circuit Cellar 323.

Robert Lacoste lives in France, between Paris and Versailles. He has 30 years of experience in RF systems, analog design, and high-speed electronics. Robert has won prizes in more than 15 international design contests. In 2003 he started a consulting company, ALCIOM, to share his passion for innovative mixed-signal designs. Robert’s bimonthly Darker Side column has been published in Circuit Cellar since 2007.

Source: Vintage Programming Languages | Circuit Cellar

Adylkuzz Cryptocurrency Mining Malware Spreading for Weeks Via EternalBlue/DoublePulsar

On Friday, May 12, attackers spread a massive ransomware attack worldwide using the EternalBlue exploit to rapidly propagate the malware over corporate LANs and wireless networks. EternalBlue, originally exposed on April 14 as part of the Shadow Brokers dump of NSA hacking tools, leverages a vulnerability (MS17-010) in Microsoft Server Message Block (SMB) on TCP port 445 to discover vulnerable computers on a network and laterally spread malicious payloads of the attacker’s choice. This particular attack also appeared to use an NSA backdoor called DoublePulsar to actually install the ransomware known as WannaCry.

Over the subsequent weekend, however, we discovered another very large-scale attack using both EternalBlue and DoublePulsar to install the cryptocurrency miner Adylkuzz. Initial statistics suggest that this attack may be larger in scale than WannaCry: because this attack shuts down SMB networking to prevent further infections with other malware (including the WannaCry worm) via that same vulnerability, it may have in fact limited the spread of last week’s WannaCry infection.

Symptoms of this attack include loss of access to shared Windows resources and degradation of PC and server performance. Several large organizations reported network issues this morning that were originally attributed to the WannaCry campaign. However, because of the lack of ransom notices, we now believe that these problems might be associated with Adylkuzz activity. However, it should be noted that the Adylkuzz campaign significantly predates the WannaCry attack, beginning at least on May 2 and possibly as early as April 24. This attack is ongoing and, while less flashy than WannaCry, is nonetheless quite large and potentially quite disruptive.

Source: Adylkuzz Cryptocurrency Mining Malware Spreading for Weeks Via EternalBlue/DoublePulsar | Proofpoint