Tim Berners-Lee wins the Turing Award. It’s About Time

Last week, we released our shiny new video on the Internet Protocol. This week, Tim Berners-Lee, the inventor of the World Wide Web, wins the Turing Award. Is it a coincidence? Probably, but we like to think we had a hand in influencing the universe.

The Turing Award, for those who might not have heard of it, is like the Nobel Prize of computer science. It’s named after Alan Turing, who is probably best known for his cipher and code-breaking work during World War II, but who is also widely recognized as the father of modern computing. In 1936, he described the Universal Machine, a theoretical device that could read a set of instructions from a strip of cells known as a “tape” and carry them out one step at a time. These days, such devices are known as Turing Machines, and they’re mostly used to model how computing algorithms operate.

Recently, it was discovered that PowerPoint can be used to make a simple Turing Machine.
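If PowerPoint can do it, so can a few lines of Python. Here’s a minimal sketch of the idea described above: a tape of cells, a read/write head, and a transition table mapping (state, symbol) to (new symbol, head move, new state). The example “program” is a toy of our own invention that inverts a string of binary digits.

```python
def run_turing_machine(tape, rules, state="start", accept="halt"):
    """Run the machine until it reaches the accepting state."""
    tape = list(tape)
    head = 0
    while state != accept:
        symbol = tape[head]
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol                 # write
        head += 1 if move == "R" else -1        # move the head
        if head == len(tape):                   # grow the tape on demand
            tape.append("_")
    return "".join(tape).rstrip("_")

# Transition table: scan right, flipping 0s and 1s, halt on a blank cell.
invert_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("1011_", invert_rules))  # prints "0100"
```

Despite its simplicity, this tape-and-table scheme is enough to compute anything a modern computer can, which is exactly why it remains the standard model for reasoning about algorithms.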

The first Turing Award was presented to Alan J. Perlis in 1966 for his work on advanced programming techniques and compiler construction. In 1955, Perlis and his team designed a mathematical compiler called “Internal Translator,” or IT. IT (not to be confused with Information Technology, AKA the reason Deep Core Data exists) was unusual for its time because, although it was created on a Datatron 205 – one of the most advanced computers of the day – it could run on basically any computer with only minor modifications.

When I talked a little bit about the development of the internet while discussing the history of the Information Age, I mentioned that Mosaic was one of the first web browsers. What I failed to mention was that without Berners-Lee, the National Center for Supercomputing Applications (NCSA) would not have been able to develop it.

Tim Berners-Lee didn’t invent the internet, but he did make it more accessible to the average user.

Berners-Lee is credited with developing HTML (the language most webpages are written in), URLs (the addresses that identify each webpage), and HTTP (the protocol browsers use to fetch pages and navigate between them). Even outside the tech industry, these terms are commonplace. We encounter them so often that we don’t really think about them anymore. It’s taken for granted that these concepts have always been part of our existence, which is probably why it took over 25 years for Berners-Lee to be recognized.
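Those three inventions fit together in every page load: the URL names the page, HTTP asks for it, and HTML is what comes back. Here’s a rough sketch, using only Python’s standard library and a hypothetical address, of how a browser turns a URL into the HTTP request it sends to a server:

```python
from urllib.parse import urlparse

# A hypothetical address, used purely for illustration.
url = "http://example.com/index.html"

parts = urlparse(url)  # the URL names the host and the page on that host

# HTTP is plain text: a request line, some headers, and a blank line.
request = (
    f"GET {parts.path} HTTP/1.1\r\n"
    f"Host: {parts.netloc}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

print(request)
```

A server answering this request would reply with an HTML document, which the browser then renders as the page you see.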

And while he feels the web has lived up to his vision of an open platform that facilitates the exchange of ideas, he is a little worried about its future. He’s not particularly keen on all the data mining companies do, or the fact that some information sources will post false information in order to generate clicks. When Berners-Lee released the web into the wild back in 1991, it was free and open-source because he believed the technology should be an equalizing force that benefits everyone.

To combat some of the worrisome trends on the internet, Berners-Lee and his organization, the Web Foundation, have developed a five-year plan to clean things up a bit. Proposed solutions include encouraging “gatekeepers” like Google and Facebook to continue vetting the accuracy of their news feeds, developing new technology he refers to as “data pods” that will decouple user data from the applications that use it, and enforcing algorithmic transparency to prevent the development of an “internet blind spot” when it comes to political campaigns.

It’s almost refreshing that, in a time when many of our tech geniuses are warning us about the dangers of the very technology they create, at least one is concerned about the behavior of the humans using it. While all of the problems Berners-Lee is worried about are facilitated by technology, human choices are the forces driving them. His belief seems to be that technology can help us as humans treat each other better.

And speaking for all of us here, that’s something we think deserves an award.

Published April 6, 2017

About the Author:

Andrew is a technical writer for Deep Core Data. He has been writing creatively for 10 years, and has a strong background in graphic design. He enjoys reading blogs about the quirks and foibles of technology, gadgetry, and writing tips.
