In this article, we take a brief look at ten popular programming languages and what they’re used for.
What we would recognise as the first computer programming languages date back to the 1940s; they were highly specialised and based on mathematical notation. The 1950s saw the development of one of the first compiled programming languages, ‘Autocode’, at the University of Manchester. Most of the major language paradigms that we now use, however, have their roots in the 1960s and 1970s. The 1980s brought advances in programming language implementation, and from then on through the 90s and 2000s there were huge advances in IT: faster hardware and processors, the growth of the Internet, the IoT and more. These developments drove the further evolution of existing programming languages and the introduction of new ones.
Here are some examples of popular programming languages and what they are used for:
C, which dates back to the early 1970s, is an imperative language originally developed at Bell Labs to write the Unix operating system, and it is still used in systems development (e.g. operating systems, embedded devices, and firmware). Writing in C is now a more specialised skill, used mostly for low-level systems programming.
C++ essentially extends C with object-oriented features such as classes, and was developed to make it easier to build larger, faster and more powerful software. Like C, however, this language is specialised and used for systems programming and low-level, performance-critical development.
C# (pronounced ‘C sharp’) is a language similar to Java, developed by Microsoft for its .NET platform and used, for example, to develop Microsoft applications.
Java, which borrows much of its syntax from C and C++, was introduced by Sun Microsystems in the mid-1990s. Java's cross-platform compatibility has made it popular for business, Web, and mobile apps, and it has long been the core language for app development on Google’s Android OS.
PHP is a popular language that began as a set of CGI programs extended to support HTML forms and database access. It grew into a general-purpose scripting language that works particularly well for server-side scripting in web applications, and it can interact with a range of database systems, including MySQL.
Although not a general-purpose programming language, Structured Query Language (SQL) is a domain-specific query language used for managing data held in a relational database management system. As such, it is very helpful for retrieving specific information from databases.
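To illustrate how SQL retrieves specific information, here is a minimal sketch using Python's built-in sqlite3 module; the table, columns and data are invented for the example.

```python
import sqlite3

# Build a small, hypothetical table in an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, department TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [("Alice", "Engineering", 75000),
     ("Bob", "Sales", 55000),
     ("Carol", "Engineering", 82000)],
)

# A typical SQL query: ask for exactly the rows and columns we want.
rows = conn.execute(
    "SELECT name FROM employees WHERE department = ? ORDER BY name",
    ("Engineering",),
).fetchall()
print([name for (name,) in rows])  # → ['Alice', 'Carol']
conn.close()
```

The same SELECT/WHERE pattern works against any relational database management system, which is what makes SQL so widely useful.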
Introduced in the late 1980s, Python (named after the comedy group Monty Python) could be regarded as relatively new. It is a good general-purpose language that is regarded as relatively easy to learn thanks to its simple, straightforward syntax. Python is now used, for example, in creating web applications and artificial intelligence applications, and is the language behind platforms like Pinterest and Instagram.
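Python's reputation for simple, readable syntax shows even in a few lines. This made-up snippet counts the most common words in a piece of text using only the standard library.

```python
from collections import Counter

def top_words(text, n=2):
    """Return the n most frequent words in a piece of text."""
    words = text.lower().split()
    return Counter(words).most_common(n)

print(top_words("the cat and the hat and the bat"))  # → [('the', 3), ('and', 2)]
```

Even without knowing Python, most readers can follow what this code does, which is a large part of the language's appeal.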
Ruby, best known through the Ruby on Rails web framework, is a dynamically typed, high-level, general-purpose programming language. It is also a relatively new language (mid-1990s) and, like Python, is regarded as relatively easy to learn; it is used mainly in the development of web apps.
Visual Basic is a third-generation, event-driven programming language from Microsoft that was introduced in the early 1990s but declared ‘legacy’ in 2008. Visual Basic .NET (VB.NET) is Microsoft’s successor implementation of the Visual Basic language, which allows developers to write .NET applications using Visual Basic.
Looking ahead, some tech commentators have noted that although general-purpose, imperative languages are good for building apps and scripts, the need to match a language to its purpose means that special-purpose declarative languages are a likely way forward. There are a large number of different languages now, but the likelihood is that some will fall away, leaving a set of preferred, standard declarative languages.
There has also been research into and development of AI that can ‘advise’ on how to improve programs. For example, researchers from Intel, the Georgia Institute of Technology, the University of Pennsylvania, and MIT developed a machine learning system called machine inferred code similarity (MISIM) that can look at what a piece of code is supposed to do and, based on its learning from code found on the Web, make suggestions about how to improve it. This idea points to the likelihood that, in the not-too-distant future, human programmers will have AI-powered helpers, and may eventually rely on machine programming to do the majority of their programming work.