The History of the Internet of Things


Mark Dabbs

28 May 2019 - 5 min read

According to Intel, the Internet of Things market will reach $6.2 trillion by 2025. That's amazing considering that the formal history of the Internet of Things only really started in 1999, when Kevin Ashton of Procter & Gamble became the first to use the term "Internet of Things," in a presentation linking RFID technology to the Internet. It's worth taking a look at IoT history to understand where we are with these technologies and where they're going.

We're on the cusp of important changes. Those who'd like to trace every little step in IoT's history should check out Postscapes, which shows how figures like Samuel Morse and Nikola Tesla fit in. A more practical history, however, focuses on the technologies that made the Internet of Things possible.

What is the Internet of Things?

The Internet of Things expands the Internet by making it possible for devices (not just people) to communicate and interact with each other. While computers, smartphones and tablets can be considered part of the Internet of Things, they are the means whereby all other devices can be controlled – lights, cameras, security systems, stereos, televisions, drones and robots, etc.
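For a concrete, if oversimplified, picture of that relationship, here is a minimal sketch in Python of a "thing" reporting a reading and a controlling device reacting to it. The class names, device names, and threshold are purely illustrative, not any particular product's API.

```python
# Toy model of the IoT idea: "things" report readings, and a controlling
# device (the smartphone or PC mentioned above) decides what to do.
class Thing:
    def __init__(self, name, controller):
        self.name = name
        self.controller = controller

    def report(self, reading):
        # In a real deployment this would travel over Wi-Fi, Bluetooth,
        # or a messaging protocol; here it is just a method call.
        self.controller.receive(self.name, reading)


class PhoneApp:
    """Stands in for the smartphone that controls the other devices."""

    def receive(self, device, reading):
        if device == "thermostat" and reading > 25:
            print("Living room is warm - turning on the air conditioner")


hub = PhoneApp()
thermostat = Thing("thermostat", hub)
thermostat.report(27)  # -> Living room is warm - turning on the air conditioner
```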

IoT in the Stone Ages (up to 1990)

True or False? “When we set the upper limit of PC-DOS at 640K, we thought nobody would ever need that much memory.” — William Gates, Chairman of Microsoft

The Internet of Things, or IoT, didn't really begin to take off until about 2009, despite being intertwined with the advent of the Internet and two centuries of science fiction. Wireless communication is a fundamental IoT requirement, and it began humbly: the first analog cellular networks, the 1G standard, launched in 1979, and the first cellphone, Motorola's DynaTAC, went on sale in 1984 for $4,000. For everyone else, getting online meant America Online and a dial-up modem on a regular phone line, at a whopping maximum download speed of 56 kilobits per second (kbps). It felt magical at the time, but the early days of the Internet were exciting, frustratingly slow, and 99.9999% dependent upon landlines.

IoT Upgrades in the Iron Age (1991 – 1998)

The first 2G wireless network was rolled out in Finland in 1991, bringing with it the ability to send SMS messages from cellphones. Some 2G networks are still operating internationally, offering transfer speeds of up to 1 Megabit per second (Mbps). The first Pentium processors hit the market in 1993. By 1995, a whopping 16 million people were using the Internet globally, though some 60,000 Bulletin Board Systems (BBSes) were serving 17 million users in the United States alone. Overall, though, the world spent the last half of the '90s busy upgrading to Windows 95 and 98 (38 floppy disks, anyone?).

IoT in The .com Era (1999 – 2007)

Around the turn of the millennium, the first 3G (and later 3.5G and 3.75G) networks appeared alongside broadband connections, which proliferated like rabbits across most of the United States. The Y2K panic hit, and while the world didn't end, a lot of Internet startups did. For a time, the market had an excess of barely used office furniture… which I used to decorate my apartment for a grand total of $3.

All the “experts” on television were talking about how the Internet was just a passing fad… which it would’ve been if we were all still stuck with dial-up modems and AOL. The total number of Internet users reached 1 billion in December of 2005.

Broadband cable connections pushed "real world" download speeds to between 1 and 4 Mbps, depending on the carrier (AT&T vs. Verizon, etc.). Gradually, the Internet was transformed from a mostly text-and-small-pictures format into something approaching what we know today, albeit still short on video. The first 64-bit processors for PCs started rolling out in 2003.

Still, for the Internet of Things, wireless networks were not yet reliable enough. Verizon's notorious "Can you hear me now?" commercials remain an icon of the era.

IoT in the Early Mobile Era (2008 – 2015)

The first iPhone was released in 2007, and the Internet ceased to be a "passing fad." Starting around 2008-2009, 4G (and later 4.5G) networks – what most of us have today – began rolling out, providing real-world download speeds of roughly 20 Mbps. Nearly simultaneously, Apple's App Store and the Android Market (later Google Play) came into existence. Amazon followed suit with its own app store, having already launched Amazon Web Services, which made the "cloud" a thing.

Mobile as we know it came into being, with full web access, gaming services, HD mobile TV, cloud computing, and more. Smartphones began using 64-bit processors, putting more computing power into the hands of everyday people than was needed to put a man on the moon. But, yeah, you probably remember the experts on television insisting that "mobile" would be a passing fad – despite more people owning mobile phones than toilets. Incidentally, the iPhone 5 came out in 2012… which could be why we all survived the end of the world.

The Internet of Things began generating momentum of its own. By this stage, it had everything it needed to work reliably at scale: dependable wireless connections, sufficient bandwidth, processing power, improvements in sensor technology, cost-effective production, and enough controlling devices (smartphones, tablets, and PCs) to be viable almost anywhere. The real bottleneck was how to make intelligent use of all the data those devices could collect.

IoT in the Mobile Era (2016 – 2020)

One of the biggest advancements for the Internet of Things has been smart home assistants, such as Amazon's Alexa-enabled systems. Over a hundred million Alexa devices have been sold, and Google and others are fighting for their share of the market. These voice-first gadgets can connect with nearly every smart device in your home: you simply talk to your Echo Dot or other inconspicuous-looking piece of electronics.

Mobile usage surpassed desktop usage for the first time in late 2016, proving the mainstream-media experts wrong yet again. That has even led some networks to create their own AI news anchors, like Xinhua's Qiu Hao and Xin Xiaomeng. Okay, the two probably aren't connected – nevertheless, whether we call it machine learning, deep learning, AI, or neural networks, the experts keep telling us what it will "never be able to" do. Almost anyone able to extrapolate these technologies a few years forward stopped listening to those experts years ago. We favor the likes of Stanley Kubrick, the Wachowskis, and Michael Crichton.

We started off citing Intel's projection of an IoT market worth over $6 trillion by 2025; data monetization alone is projected to exceed $700 billion over the same period. Check out the specific types of IoT devices ranked by popularity and projected market share (ADD LINK). There, we give many more concrete examples of how IoT devices can be used than simply saying "everything" – including diapers and toilets.

The IoT of Tomorrow

By 2025, there are expected to be over 75 billion IoT devices. Over the years, the experts have warned that we'll run out of bandwidth. That was a real and valid concern back in the 1990s – but every time we merely start approaching bandwidth limits, a new technology comes along. The next one is 5G. Where 4G networks have a peak speed of about 1 gigabit per second, 5G multiplies that by a factor of 20. As for 6G, it's only theoretical at this point, but Finland's University of Oulu and others are already working on it and predict we may see it by 2030.

Whenever the experts say "can't," that's reason enough to stop treating them as experts. There are so many new and exciting technologies under development, separately and simultaneously, in AI, quantum computing, nanotechnology, energy, and more. Just because we can't do something now doesn't mean we never will. Civilization has been around for several thousand years; computers, as we use them today, have been around for perhaps 40. Today's microwave oven is getting smarter, and we even have toilets connected to the Internet of Things.

Internet of Things vs Internet of Everything

The difference between the Internet of Things and what is sometimes called the "Internet of Everything" is pretty simple. In IoT, devices are connected. In IoE, devices are "intelligently connected." The complex algorithms behind machine learning are what make that intelligence possible, helping make sense of all the data billions of devices can generate. When you go to the doctor, the nurse records your blood pressure, pulse, temperature, and so on – each just a snapshot of your vitals.

With wearables, the Internet of Things could capture those same readings every minute over the course of a month. It could flag health anomalies and give your doctor far more specific data about when they occur, how frequent they are, and how intense they are, which could improve the effectiveness of any treatment. The Internet of Everything would combine that data with data from other devices: weather conditions, when you ate, whether you took your medications on time, how well you slept the night before, and more.
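As a rough illustration of the kind of screening involved (not any specific vendor's method), here's a minimal Python sketch that flags minute-level heart-rate readings deviating sharply from a rolling baseline. The data, window size, and threshold are hypothetical.

```python
from statistics import mean, stdev

def find_anomalies(readings, window=60, threshold=3.0):
    """Flag readings that deviate sharply from the recent baseline.

    readings:  list of (minute_index, heart_rate) pairs, one per minute.
    window:    how many prior minutes form the rolling baseline.
    threshold: how many standard deviations count as "anomalous".
    """
    anomalies = []
    for i in range(window, len(readings)):
        baseline = [hr for _, hr in readings[i - window:i]]
        mu, sigma = mean(baseline), stdev(baseline)
        minute, hr = readings[i]
        # Skip the check if the baseline is perfectly flat (sigma == 0).
        if sigma > 0 and abs(hr - mu) / sigma > threshold:
            anomalies.append((minute, hr))
    return anomalies

# A hypothetical month of minute-by-minute heart-rate data from a wearable,
# with one injected spike standing in for an anomaly.
month = [(m, 70 + 2 * (m % 2)) for m in range(60 * 24 * 30)]
month[5000] = (5000, 160)

print(find_anomalies(month))  # -> [(5000, 160)]
```

In practice, a system like this would also factor in context from other devices (the IoE angle above), but the core idea is the same: compare each new reading against a recent baseline and surface the outliers for a clinician to review.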

The Internet of Everything is more complex, so we'll explore everything it brings to the table in the next post.
