Friday, May 17, 2013

Beyond If, Then

Usually I am not one for bandwagon jumping. When a news story breaks on a topic I care about and everyone is talking about it, I'll mull it over and deny myself the opportunity to comment on it through this, my personal soapbox, my modest little blog. But this time I can't help myself. I have to think out loud.

I am talking about the quiet little story that is, most likely, going to make all the difference. In an article about Google's I/O conference, PC Magazine, along with a bunch of other media outlets, reported that Google is buying a quantum computer from D-Wave (the same company that sold one to Lockheed Martin) and is going to make it available for academic research.

Here's the quote that caught my eye: "This new approach toward computing allows us to put bits of information into their 0 and 1 states at the same time, essentially allowing researchers to view a number of possibilities simultaneously." Simultaneously?! For serious?!

Okay, so from what I remember of that ancient branch of my knowledge that I haven't touched in approaching a decade and a half (computer programming), things go like this: the bunches of code that make up a program respond to user input. The user has a finite number of possible actions, such as clicking a button or rolling the cursor over a certain part of the screen. The user action is the "if," the cause. So the program tells the machine, at a very high level, that IF the user clicks that particular button, THEN something happens. The effect. Another window opens, or a bunch of text is deleted, or the whole picture turns purple, or any of an infinite number of other possibilities that can be programmed into the software to make it more functional.
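Just to make that concrete for myself, here is a toy sketch of that IF, THEN model in Python. The event names and responses are invented, obviously; any real program would be fleshed out far beyond this, but the shape is the same: wait for a known cause, run the matching effect.

    # A minimal sketch of the IF, THEN model: the program waits for a
    # known user action and runs the matching response. The event names
    # here are made up for illustration.
    def handle_event(event):
        if event == "click_purple_button":    # IF the user clicks that button...
            return "turn the picture purple"  # ...THEN this effect happens
        elif event == "press_delete_key":
            return "delete the selected text"
        else:
            return "do nothing"               # anything unanticipated falls through

    print(handle_event("click_purple_button"))  # -> turn the picture purple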

That's all a pretty simplistic explanation, but the point is this: it is a fairly static model. The computer you are reading this on, be it tablet, laptop, smartphone, what have you, cannot predict. Not really. Sure, Google and Apple have all those predictive text algorithms, but the reason there are so many blogs devoted to their foibles is that they are not truly predictive. They are still based on the IF, THEN model. (There are other programming statements, I know, but for simplicity's sake...)
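To show what I mean by "not truly predictive," here is a little Python toy that "predicts" the next word purely by counting which word most often followed it before. The sample sentence is made up, and real predictive text is far more elaborate, but the suggestion is still a deterministic lookup: IF you just typed this word, THEN suggest that one.

    # A toy "predictive text" sketch: count which word most often followed
    # the current one in past input, then suggest that word. Still IF, THEN
    # under the hood -- no real foresight involved.
    from collections import Counter, defaultdict

    history = "the cat sat on the mat the cat ran".split()  # invented sample text
    followers = defaultdict(Counter)
    for prev, nxt in zip(history, history[1:]):
        followers[prev][nxt] += 1

    def suggest(word):
        counts = followers.get(word)
        return counts.most_common(1)[0][0] if counts else None

    print(suggest("the"))  # -> "cat" (the word that most often followed "the")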

What, then, is the implication for computer programming if a piece of software can be in two states at once? The PC Mag article only refers to 0s and 1s. But just as our bodies are not only subatomic particles but also atoms and cells and organs, the 0s and 1s of a computer build up into its machine language, its programming languages like C++, and ultimately the GUI (the stuff that we, the users, actually see). So what does this mean for software? For the users of software?
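Here is my back-of-the-envelope way of picturing the "both states at once" idea, as a toy Python simulation of a single qubit. The state is just two amplitudes, one for 0 and one for 1, and only measuring it forces a single answer. To be clear, this is my own rough sketch of the concept, not how the actual D-Wave machine is programmed.

    # A toy picture of a qubit: two amplitudes, one for 0 and one for 1.
    # Measurement forces a single answer, with probabilities given by the
    # squared amplitudes. A conceptual sketch only, not real quantum hardware.
    import random

    amp0, amp1 = 1 / 2**0.5, 1 / 2**0.5   # equal superposition of 0 and 1

    def measure():
        return 0 if random.random() < amp0**2 else 1

    print([measure() for _ in range(10)])  # roughly half 0s, half 1s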

The article then goes on to point out that this quantum effect means patterns in weather or stocks might be better examined. Knowing only the basic, simplistic version of how computers work, is it far-fetched to assume that our interactions with operating systems will soon be ones where the machine knows what we want and need before we even ask for it? Or if that's not quite on the horizon yet, maybe quantum computing can at least make AI programs such as Siri a little less frustrating to use? Good god, she knows little to nothing. Am I right?

Another possibility I can think of for quantum computing is, as Google itself points out, machine learning. Perhaps this breakthrough could herald the ability of computers to make decisions: be in two states at once and thereby play out scenarios in order to learn which one is best. I know this idea sounds far-fetched, and some people might read into it that the machines are just that much closer to taking over, but here I am thinking more along the lines of a robot vacuum that is a little more flexible in its movements than the current Roombas are.
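And here is a purely classical toy, in Python again, of what I mean by "play out scenarios in order to learn which is best," dressed up as a vacuum choosing where to go next (the dirt numbers are invented). The loose quantum promise, as I understand it, is exploring many such candidates at once instead of checking them one loop iteration at a time.

    # A toy, purely classical sketch of "try the scenarios, keep the winner":
    # score each candidate move and pick the best one. The values are invented.
    dirt_by_direction = {"left": 2, "right": 5, "forward": 3, "back": 0}

    def best_move(options):
        # play out each scenario, remember the one with the best outcome
        return max(options, key=options.get)

    print(best_move(dirt_by_direction))  # -> "right"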

What do you think about all this anyways? Am I just way off course? Do I know too little about the way in which computer software really works to make any real sense? I'd love to know how it all actually works.

  
