The Evolution of Programming Languages
The term “programming languages” evokes thoughts of C, C++, Java, SQL, and other computer languages with complicated syntaxes and endless lines of code.
But this isn’t where programming languages started out. That particular origin story begins a lot earlier.
In the 1940s, the first electronic computers came into existence. But there was a problem: they had to be programmed in raw machine code and, later, low-level assembly languages, which were horribly tedious and error-prone. Programming with them was like trying to carve “David” with a toothpick.
There had to be a better way.
A breakthrough came with the invention of FORTRAN by computing giant IBM in 1957. It wasn’t just a laboratory experiment; it was the first high-level language to see widespread practical use, and that made it revolutionary and incredibly popular. FORTRAN was also built for high-performance numerical computing. Much of the code running on the world’s modern supercomputers is still written in Fortran, and classic benchmark programs written in it are still used to push these behemoths to see how far they can go.
But FORTRAN still had a problem. It was designed for scientific and numeric computing, it wasn’t as intuitive as most people needed it to be, and outside that niche it was limited in what it could do.
And that’s where COBOL came in to pick up the slack.
COBOL, The Next Step Forward
The problem with coding in FORTRAN was that people had to know, and be comfortable with, mathematical formulae and scientific notation.
Most people weren’t. They needed something closer to English.
COBOL arrived with that solution in 1959. It wasn’t targeted at scientists and mathematicians looking to unlock the secrets of the universe; it helped businesses solve everyday data-processing tasks in a syntax that read almost like plain English. (Object-oriented programming, something we take for granted in languages like C++ and Java, only reached COBOL decades later, with the COBOL 2002 standard.)
This readability meant that computer programs could get much more complex, handle more demanding business tasks, and be useful to everyday users, instead of being just another microscope in a laboratory.
Inspired by COBOL, other languages popped up, offering incremental improvements and better ease of use. But it wasn’t until the 1970s that we’d see something that truly changed the computing world into what we know today.
C was a radical departure from the likes of COBOL, FORTRAN, and other languages of its time. It was structured, close to the hardware yet portable, and usable for a wide variety of applications, from operating systems to everyday software.
Developed in 1972 by Dennis Ritchie at Bell Labs, it remains one of the most widely used programming languages in existence. It’s still taught in many curricula.
Over the years, it’s been followed by C++, which added object-oriented programming concepts such as inheritance, encapsulation, and polymorphism, as the sketch below shows.
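To make those three terms concrete, here’s a minimal, illustrative C++ sketch. The Account and SavingsAccount classes are invented for this example rather than drawn from any real library: the private balance is encapsulation, SavingsAccount reusing Account is inheritance, and the virtual interestRate() call resolving differently per object is polymorphism.

```cpp
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// Encapsulation: the balance is private and can only change
// through the class's public interface.
class Account {
public:
    explicit Account(std::string owner) : owner_(std::move(owner)) {}
    virtual ~Account() = default;

    void deposit(double amount) { balance_ += amount; }

    // Polymorphism: derived classes may override this behavior.
    virtual double interestRate() const { return 0.01; }

    double projectedBalance() const {
        return balance_ * (1.0 + interestRate());
    }

    const std::string& owner() const { return owner_; }

private:
    std::string owner_;   // hidden from callers
    double balance_ = 0.0;
};

// Inheritance: SavingsAccount reuses Account and overrides one method.
class SavingsAccount : public Account {
public:
    using Account::Account;
    double interestRate() const override { return 0.05; }
};

int main() {
    std::vector<std::unique_ptr<Account>> accounts;
    accounts.push_back(std::make_unique<Account>("Ada"));
    accounts.push_back(std::make_unique<SavingsAccount>("Grace"));

    for (const auto& a : accounts) {
        a->deposit(100.0);
        // The same call dispatches to different implementations at runtime.
        std::cout << a->owner() << ": " << a->projectedBalance() << "\n";
    }
}
```

Both objects receive identical deposit() and projectedBalance() calls, but each resolves interestRate() to its own implementation at runtime; that dynamic dispatch is the heart of polymorphism.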
The biggest change after C++ came by way of C#, which shipped with Microsoft’s .NET platform and was better equipped for creating web applications. By then, the explosive growth of the Internet had become the biggest motivator for the advancement of programming languages like Java, Python, PHP, and more.
But Why The Internet?
The Internet is a huge platform where countless systems, built on different platforms, all have to work well with each other. Because of that, programming languages had to evolve to support that need.
Web applications became more popular and browsers became more complex. We started using smaller, simpler scripts for simpler tasks, and instead of demanding a complete general-purpose language, the focus shifted to function. If a language could do one thing, and do it well, it was useful; otherwise, it was thrown on the waste pile. Applications needed to be developed at a faster pace, and languages had to be easy enough to support that.
This is when concepts like rapid application development, and eventually low-code, took root. Modern languages were focused on helping developers speed up the development process, not spend hours searching for that one semicolon they missed.
Forrester, Gartner, and the Low-code Love Affair
Forrester coined the term, and Gartner helped introduce it to the masses. Both have very similar definitions of what low-code is. Here is Forrester’s:
“Low-code platforms enable rapid delivery of business applications with a minimum of hand-coding and minimal upfront investment in setup, training, and deployment.”
Gartner has a similar definition:
“Low-code development describes platforms that abstract away from code and offer an integrated set of tools to accelerate app delivery.”
But what does “abstract away from code”, “an integrated set of tools to accelerate app delivery”, and “rapid delivery of business applications” mean when you strip away the tech speak?
What does the business user have to gain from all of this?
Why Low-code Platforms are Inevitable
At the end of the day, business users care about a very simple list of things. They want to spend less money, improve productivity, waste less time, and make more money.
And that’s why the success of low-code platforms is inevitable. They provide a way to do all of that.
You don’t need to invest in expensive training programs for your employees. You can have them build apps faster, with less training. And all of this ultimately earns your business more revenue.
But keep in mind that low-code doesn’t mean there’s no code involved at all. It just makes it a lot easier for your existing developers to create applications without spending arduous hours on coding. Much of the development becomes visual, using modules and templates that can be freely sourced from the Internet.