Moore’s law (an observation made by Gordon Moore in 1965) predicts that the number of transistors on a silicon integrated circuit will double roughly every two years. While it’s not exactly a direct relationship, it is often interpreted to mean that computers double in processing power every two years, and it has been called the greatest technological prediction of the last half-century. That means that between 2010 and 2050, computer processing power will double 20 times if Moore’s law holds true. Moore’s prediction has fueled many of today’s breakthroughs in artificial intelligence, giving machine-learning techniques the ability to chew through massive amounts of data to find answers.
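To put that doubling in perspective, here is a minimal sketch (the function name and the two-year doubling period are assumptions for illustration) of the growth factor Moore’s law projects over a span of years:

```python
# A minimal sketch of Moore's law growth, assuming the commonly
# cited doubling period of two years.

def moores_law_factor(start_year: int, end_year: int, doubling_years: int = 2) -> int:
    """Projected growth factor in processing power between two years."""
    doublings = (end_year - start_year) // doubling_years
    return 2 ** doublings

# 2010 to 2050 spans 40 years, i.e. 20 doublings:
print(moores_law_factor(2010, 2050))  # 2**20 = 1048576
```

Twenty doublings works out to a factor of over a million, which is why the prediction matters so much for data-hungry fields like machine learning.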
In breaking news, Apple is warning of a serious security flaw in iPhones, iPads and Macs that allows hackers to access those devices. According to Apple’s description of the vulnerability, a hacker could gain “full admin access” to a device, allowing intruders to impersonate the device’s owner and subsequently run any software in the owner’s name.
When the internet was in its infancy, the word ‘cloud’ was used as a metaphor to describe how the complex telephone networks connected. This made sense, since the only way to connect was with a telephone modem, and all of the telephone wires were up in the air at the time. Now, many people and organizations refer to it as ‘THE cloud’, but it’s not a single entity, and it doesn’t exist in just one place. The cloud is a model of computing in which servers, networks, storage, development tools, and even applications (apps) are delivered over the internet. Cloud computing is widely traced back to Joseph Carl Robnett Licklider, an American psychologist and computer scientist, whose work in the 1960s on ARPANET (the Advanced Research Projects Agency Network) aimed to connect people and data from anywhere at any time. This project was trying to develop a technology that allowed…