11 Cool FACTS About Programming and Coding You NEED to Know

Cars, computers, and Columbus, Ohio. What do they have in common? Coding! For every technology you rely on in your daily life, code is the language that powers it. Cars run on it, computers won't work without it, and some cities are using it to improve their infrastructure. Sounds like it's time for coding trivia.

In fact, coding is so essential that many schools are teaching it in kindergarten. And as students progress through their studies, real-world problem-solving opportunities are increasingly centered on coding and STEM, like coding a virtual robot to fill orders in a warehouse for Amazon. Programming and coding are now such an integral part of everyday life, it’s no longer just up to the nerds.

Take Karlie Kloss, for example, a supermodel who leads Kode with Klossy, a coding camp she started in 2015 to empower girls to learn to code and become leaders in tech. If you are a teen with aspirations to build an app, publish a website, or dive into data science, coding is for you! Get your feet wet with these 11 things you didn't know you NEED to know about coding.

1. Coding has over 700 languages

In the United States, about 350 languages are spoken. But coding has us beat, with over 700 coding languages in use today! Only two countries speak more languages: Papua New Guinea (836) and Indonesia (710).

Some languages, like Java and Python (and markup languages like HTML), are more common, while others, like Rust and Kotlin, are used in more specific situations. The good news for coders? Once you learn the big ones, the more niche languages come easily.

2. Coding bugs were NOT named after an actual bug

Have you ever encountered a computer bug? How about a real bug in your computer? In 1947, technicians at Harvard were troubleshooting the performance of their Mark II computer. When they investigated, they discovered that a moth had gotten into a relay – an actual, real live bug.

In the logbook, it was noted as "First actual case of bug being found." While it is oft-repeated that this is where the term "bug" came to refer to errors that impact the performance of programs, this is not the case. The term was already in fairly widespread use in technical circles by 1947; Thomas Edison used it as early as 1878 to describe problems in his own inventions.

Even if the origin story isn't quite true, debugging is still an essential part of programming. If bugs aren't discovered, the results can be disastrous! In 1983, the Soviet early-warning system registered five incoming nuclear missiles from the United States. Lt. Col. Stanislav Petrov reasoned that if the U.S. really wanted to attack the Soviet Union, it would hardly launch only five missiles. He ordered his men to stand down, and 15 minutes later, radar outposts confirmed that there were no incoming missiles. The false alarm was due to a bug in the system.

3. Coding will soon be as important as reading

In the future, coding and technical literacy may be nearly as essential to daily life as literacy is now. The United States has a literacy rate of 99%. Imagine 99% of the population knowing how to code.

While it may sound difficult, coding can come more easily than writing for students who struggle with language mechanics. In 2020, MIT neuroscientists found that reading code activates the brain's general-purpose "multiple demand" network rather than its language-processing centers.

Regardless of which part of your brain is in charge, the best way to learn both? Practice!

4. The first programmer was the daughter of a mad poet

Coding and STEM fields may seem like they were built for boys, but the first person to write what we now recognize as a computer program was Ada Lovelace.

Ada was the only legitimate daughter of the poet Lord Byron, and her mother feared her daughter would suffer the same madness as her father. To stave off the madness as long as possible, she dedicated her daughter to the study of math and science.

While working with Charles Babbage on his design for a mechanical general-purpose computer known as the Analytical Engine, Lovelace recognized that the machine could go far beyond pure calculation, and she published the first algorithm intended to be carried out by such a machine.

5. The first computer virus was a Creeper

Just like a virus infects a human body by replicating versions of itself to pass on to other hosts, a computer virus spreads by inserting its own code and spreading to new computers via networks.

The idea of a computer virus dates to lectures John von Neumann gave in 1949, later published as "Theory of Self-Reproducing Automata," but the first self-replicating computer program, Creeper, was not written until 1971. It was not actively malicious software, as it caused no damage to data; its only effect was a message output to the teletype reading "I'M THE CREEPER; CATCH ME IF YOU CAN".
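The core trick behind a self-reproducing program can be shown harmlessly with a quine – a program whose output is exactly its own source code. Here is a minimal Python sketch of that idea (an illustration of self-reproduction in general, not Creeper's actual code):

```python
# A quine: running this two-line program prints its own source code.
# %r inserts the string's own repr (quotes and escapes included),
# and %% becomes a literal % after formatting.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Copy what it prints into a new file, run that, and you get the same text again – the essence of code that reproduces itself.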

The virus was not created to cause harm, but it did not take long for the idea of self-replicating software to turn to the dark side. The good news: there are "ethical hackers" out there who work for the good guys. In fact, it's a great career field!

6. NASA still operates some projects on programming from the 1970s

You may be fluent in JavaScript or C++, but what NASA engineers really need to know is Ada and HAL/S. Up through 2005, NASA was still using a computer language from 1973 designed specifically for its needs: HAL/S (High-order Assembly Language/Shuttle).

Although HAL/S was designed primarily for programming on-board computers, it is general enough for almost any application and has been used widely across NASA's projects. Newer projects, such as the International Space Station, use a programming language called Ada, developed in 1980 and accepted as an international standard in 1995.

7. There is BIG money in coding

In the mid-1970s, Steve Wozniak and Steve Jobs collaborated on an arcade game, Breakout, for Atari. In 2018, Apple Inc. became the first U.S. company valued at one trillion dollars. There's no doubt: there is big money in coding. And by big money, we mean billions.

Data scientists earn an average salary of around $100,000. Enjoy computer games? Markus Persson, a Swedish programmer, created and launched the computer game Minecraft in 2009. In 2014, Microsoft bought it for $2.5 billion.

8. It’s all 0’s and 1’s

Computers operate on what is called "binary code." All of the software that runs them is ultimately represented using only 0s and 1s, and there is no limit to how those two digits can be combined. That's why new software can be written all the time.
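To see what "all 0s and 1s" means in practice, here is a small Python sketch (the helper name `to_bits` is just for illustration) showing how numbers and text characters map to binary patterns:

```python
def to_bits(n: int, width: int = 8) -> str:
    """Render a non-negative integer as a fixed-width binary string."""
    return format(n, f"0{width}b")

# A number is stored as a pattern of bits.
print(to_bits(42))                     # 00101010

# Text is stored as numbers (character codes), which are also bits.
for ch in "Hi":
    print(ch, "->", to_bits(ord(ch)))  # H -> 01001000, i -> 01101001
```

Every file on your computer – programs, photos, music – boils down to long sequences of exactly these kinds of patterns.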

9. You don’t have to work in tech to use coding

As of the end of 2020, roughly 70% of coding jobs were in career fields outside of technology. Those who learn to code early and well will have their pick of careers in almost every industry imaginable.

10. Computer was a job title, and the first programmers were women

In 1945, the ENIAC (Electronic Numerical Integrator and Computer) was switched on and put to work on the task it had been designed for during World War II: computing ballistic trajectories. It was the first programmable, electronic, general-purpose digital computer, and it was operated by six women.

The women studied the machine's logic, physical structure, operation, and circuitry in order to understand not only the mathematics of computing but the machine itself. To use it, they had to manipulate switches and cables, working from the machine's blueprints, because programming languages did not yet exist.

Though contemporaries considered programming a clerical task and did not publicly recognize the female programmers' effect on the successful operation and announcement of ENIAC, the six women (McNulty, Jennings, Snyder, Wescoff, Bilas, and Lichterman) have since been recognized for their contributions to computing.

11. Coding can “power up” your brain

Learning to code has definite cognitive benefits: creative problem-solving, critical thinking, and teamwork skills. Research dating back to 1991 suggests that coders develop stronger cognitive skills on average, and that coding and other intellectually stimulating activities may reduce the risk of degenerative diseases such as Alzheimer's.

Today, soft and hard skills are equally important, and those who know how to work in teams, solve problems, pay attention to detail, and treat mistakes as learning opportunities will be far better positioned to become the leaders of tomorrow.