How am I Reading This? A Brief History of the Information Age
Some people reading this blog have never known a world without the internet. But for many of us, there was such a thing as a completely offline world. In the 1970s, Robert Kahn and Vinton Cerf invented TCP/IP, which was integrated into the Department of Defense’s ARPANET, and the internet was born. However, it wasn’t until the 1990s that it started becoming a widespread phenomenon.
In fact, the internet remained largely under government control until 1984, when AT&T underwent divestiture in order to get involved in the computer industry, and things started to change. While Bill Gates promoted the idea of a personal computer on every desktop, it was the investment of the telecommunications giant that drove the internet forward.
Back then, phone lines could only transmit signals at a snail’s pace, and an incoming call would interrupt the connection, heralding the end of a download for many of the nerdy hobbyists of the time. The development of fiber optics and the increased speed of microprocessors meant that signals moved faster, and before long, practically every house on the block was buzzing with that oh-so-familiar dial-up sound.
In 1991, the internet changed from simply a way to send and receive files into the World Wide Web of information that we know today. However, it wasn’t until Mosaic was developed by students and researchers at the University of Illinois two years later that people had a user-friendly way to access that web. In 1993, Doom from id Software popularized online gaming, and in 1997, SixDegrees.com (the first website that could be considered a social networking site) launched, and electronics companies agreed to make Wi-Fi wireless internet an industry standard.
What’s interesting about the Information Age is just how quickly technology has developed. The Industrial Age began somewhere around 300 years ago and lasted right up until about 30 years ago, when the internet was in its infancy. While there was a notable acceleration of progress over the Agricultural Age, it still took lifetimes before real change in technology took hold. In the early stages of the Information Age, it took decades, and now, technological advancement has accelerated even further, to the point where new technology is released multiple times a year.
With this accelerated pace of progress and wealth of easily accessed information, many experts are starting to call for the end of the Information Age. Why? Well, there are a number of different reasons out there. Some experts feel that the amount of information available to the average individual and the rise of Big Data are causing us to stagnate, and it’s only a matter of time before something new crops up.
This timeline from awesomescience.us shows how the rate of innovation has really taken off in the last 300 years.
Some experts believe we’ve developed all the information technology that we possibly can, and that our focus has shifted towards creating better infrastructure instead. With the amount of research going into self-driving cars, the development of the Internet of Things, and the push for clean energy, it’s easy to see where this idea comes from. While devices dedicated to information technology are still advancing, at this point, the technology is mostly being refined and improved, and there aren’t many genuinely new developments.
Maybe Boomers complain about Millennials and their selfies because they’re jealous they couldn’t share their experiences as quickly and openly as we do now?
Others believe we’ve moved on to an “experience age.” They feel that the focus of technological advancement has moved from sharing information to sharing our experiences, and that our technology now revolves around better ways to share those experiences. When you consider the improvements made in cell phones, the vast strides being made in virtual reality technology, and the popularity of social networking, the idea is not without merit. The way we interact as humans and share our experiences is definitely becoming a blend of technology and reality; however, there’s a really important concept that this “experience age” theory is missing.
Sharing experiences is still a form of sharing information, and the sharing of information is still a major part of today’s society. It’s such an important and fundamental ideal that it’s been the focus of major scandals, and it currently influences America’s politics. The fact that information is so readily available even frightens major nations, like China, which has developed an extensive firewall intended to block negative information about the government from spreading among citizens. Earlier this week, Turkey blocked access to Dropbox, Google Drive, and other major cloud services.
Another thing to consider is that the wealth of information available could very well be a driving force behind the development of artificial intelligence. Right now, many businesses handle more data than they know what to do with (the infamous Big Data), which in turn has created the emerging IT position of data scientist.
But the human mind cannot process all the data being generated by businesses quickly or efficiently. Computers, however, can, through machine learning and neural networks. It’s technology that’s still finding its feet, and right now its greatest achievement is beating humans at Go and Jeopardy!, but someday, it may be making big marketing decisions for us.
Or it could be sending Arnold Schwarzenegger back in time to kill Sarah Connor.
I’ll stop making Terminator references when Arnold comes back in time and forces me to stop.
Either way, I think the Information Age still has a few surprises left in store for us.