11 Cool FACTS About Programming and Coding You NEED to Know

K-12 coding and STEM, Creative thinking & learning skills
Updated: August 2022
7 minute read

Cars, computers, and Columbus, Ohio. What do they have in common? Coding! For every technology you rely on in your daily life, code is the language that powers it. Cars run on it, computers won’t work without it, and some cities are using it to improve their infrastructure. Sounds like it’s time for coding trivia.

In fact, coding is so essential that many schools are teaching it in kindergarten. And as students progress through their studies, real-world problem-solving opportunities are increasingly centered on coding and STEM, like coding a virtual robot to fill orders in an Amazon warehouse. Programming and coding are now such an integral part of everyday life that it’s no longer just up to the nerds.

Take Karlie Kloss, for example: a supermodel who leads Kode with Klossy, a coding camp she started in 2015 to empower girls to learn to code and become leaders in tech. If you are a teen with aspirations to build an app, publish a website, or dive into data science, coding is for you! Get your feet wet with these 11 things you didn’t know you NEED to know about coding.

In the United States, there are about 350 spoken languages. But coding has us beat, with over 700 programming languages in use today! Only two countries speak more languages than that: Papua New Guinea (836) and Indonesia (710).

Some programming languages, like Java, Python, and HTML, are more common, but others, like Rust and Kotlin, are used in very specific situations. The good news for coders? Once you learn the big ones, the more niche languages come easily.

Have you ever encountered a computer bug? How about a real bug in your computer? In 1947, a technician at Harvard had an issue with the performance of the Mark II computer. Once they investigated, they discovered that a moth had gotten into a relay – an actual, real, live bug.

In the logbook, it was noted as “First actual case of bug being found.” While it is oft-repeated that this is where the term “bug” came to refer to errors that impacted the performance of programs, this is not the case. The term “bug” was already in fairly widespread use in technical circles in 1947. Thomas Edison used it in 1869 to describe problems in his own inventions.

Even if the origin story isn’t quite true, debugging is still an essential part of programming. If bugs aren’t discovered, the results can be disastrous! In 1983, the Soviet early-warning system registered five incoming nuclear missiles from the United States. Lt. Col. Stanislav Petrov reasoned that if the U.S. wanted to attack the Soviet Union, it would hardly launch only five missiles. He ordered his men to stand down, and 15 minutes later, radar outposts confirmed that there were no incoming missiles. The false alarm was due to a bug in the system.

In the future, coding and technical literacy may be nearly as essential to daily life as literacy is now. The United States has a literacy rate of 99%. Imagine 99% of the population knowing how to code.

While it may sound difficult, coding can be easier than writing for students who struggle with language mechanics. In 2020, MIT neuroscientists found that interpreting code activates a general-purpose brain network, but not language-processing centers.

Regardless of which part of your brain is in charge, the best way to learn both? Practice!

Coding and STEM fields may seem like they were built for boys, but the first person to write what we now recognize as a computer program was Ada Lovelace.

Ada was the only legitimate daughter of the poet Lord Byron, and her mother feared her daughter would suffer the same madness as her father. To stave it off as long as possible, she dedicated her daughter to the study of math and science.

While working with Charles Babbage on his proposed mechanical general-purpose computer, the Analytical Engine, she recognized that the machine could go far beyond pure calculation, and she published the first algorithm intended to be carried out by such a machine.
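
Her notes walked through how the Engine could compute Bernoulli numbers. As a loose modern nod to that idea – not a reconstruction of her actual table of operations – here is a short Python sketch that computes the same numbers using the standard recurrence:

```python
# A modern sketch of the calculation Lovelace's published algorithm targeted:
# the Bernoulli numbers, via the recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0.
# (Illustrative only; this is not her original method.)
from fractions import Fraction
from math import comb  # requires Python 3.8+


def bernoulli(n):
    """Return B_0 .. B_{n-1} as exact fractions."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n):
        total = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-total / (m + 1))  # solve the recurrence for B_m
    return B


for i, b in enumerate(bernoulli(8)):
    print(f"B_{i} = {b}")
# B_0 = 1, B_1 = -1/2, B_2 = 1/6, B_3 = 0, B_4 = -1/30, ...
```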

Just as a virus infects a human body by replicating itself and passing copies on to other hosts, a computer virus spreads by inserting its own code into other programs and hopping to new computers via networks.

John von Neumann laid out the idea of a self-reproducing program in 1949, in work later published as “Theory of Self-Reproducing Automata,” but the first self-replicating computer program, Creeper, was not written until 1971. It was not actively malicious and caused no damage to data; its only effect was a message it printed to the teletype: “I’M THE CREEPER; CATCH ME IF YOU CAN.”

Creeper was not created to do harm, but it did not take long for the idea of self-replicating software to turn to the dark side. The good news: there are “ethical hackers” out there who work for the good guys. In fact, it’s a great career field!
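
Self-replication doesn’t have to be sinister, though. A classic, harmless classroom cousin of a self-reproducing program is the quine: a program whose only output is its own source code. Here is a minimal Python sketch of the idea (a toy, nothing like Creeper’s network-hopping):

```python
# A minimal quine: running this program prints its own source code.
s = '# A minimal quine: running this program prints its own source code.\ns = {!r}\nprint(s.format(s))'
print(s.format(s))
```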

You may be fluent in JavaScript or C++, but what NASA engineers really need to know are Ada and HAL/S. Up through 2005, NASA was still using HAL/S (High-order Assembly Language/Shuttle), a language created in 1973 specifically for its needs.

Although HAL/S was designed primarily for programming on-board computers, it is general enough for almost any application and was used widely across NASA’s projects. Newer projects, such as the International Space Station, run on a language called Ada, developed in 1980 and accepted as an international standard in 1995.

In the mid-1970s, Steve Wozniak and Steve Jobs collaborated on an arcade game, Breakout, for Atari. In 2018, Apple became the first U.S. company valued at a trillion dollars. There’s no doubt: there is big money to be had in coding. And by big money, we mean billions.

The average salary of a data scientist approaches $100,000. Enjoy computer games? Markus Persson, a Swedish programmer, created and launched the computer game Minecraft in 2009. In 2014, Microsoft bought it for $2.5 billion.

Computers operate on what is called binary code. At the lowest level, all of the software that runs them is stored as nothing but 0s and 1s, and those two digits can be combined in endless ways. That’s why new software can be written all the time.
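
You can peek at those 0s and 1s yourself. Here is a small illustrative Python sketch (any language would do) showing what ordinary text and numbers look like at the bit level:

```python
# Peek at the binary behind everyday data.
message = "Hi"

# Each character is stored as a byte: eight 0s and 1s.
bits = " ".join(format(byte, "08b") for byte in message.encode("utf-8"))
print(bits)              # 01001000 01101001

# Numbers are binary under the hood too, and convert back just as easily.
print(format(42, "b"))   # 101010
print(int("101010", 2))  # 42
```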

As of the end of 2020, 70% of coding jobs were in career fields not connected with technology. Those who learn to code early and well will have their choice of careers in almost every industry imaginable.

In 1945, the ENIAC (Electronic Numerical Integrator and Computer), built during World War II to compute ballistic trajectories for the U.S. Army, was switched on. It was the first programmable, electronic, general-purpose digital computer, and it was operated by six women.

The women studied the machine’s logic, physical structure, operation, and circuitry in order to understand not only the mathematics of computing but also the machine itself. To use it, they had to manipulate switches and cables while working from the machine’s blueprints, as programming languages did not yet exist.

Though contemporaries considered programming a clerical task and did not publicly recognize the female programmers’ effect on the successful operation and announcement of ENIAC, the six women (McNulty, Jennings, Snyder, Wescoff, Bilas, and Lichterman) have since been recognized for their contributions to computing.

Learning to code has definite cognitive benefits: creative problem-solving, critical thinking, and teamwork skills. Research dating back to 1991 has shown that coders develop higher cognitive skills on average, and that coding and other intellectually stimulating activities dramatically reduce the chances of degenerative diseases such as Alzheimer’s.

Today, soft and hard skills are equally important, but those who know how to work in teams, solve problems, pay attention to detail, and treat mistakes as learning experiences will have a far better chance of becoming the leaders of tomorrow.

Written by:
CoderZ Team
