Technology is advancing at a rapid rate. We live in an age of robots that behave like humans and can mimic human emotions, and we have already heard of robots handling some jobs better than people do. AI is dominating realms once thought accessible only to human intelligence. First it was small retail jobs; now AI is involved in surgery, medical applications, deliveries, and even playing video games better than humans. You read that right!
Sony has created an AI that has beaten humans at their own game, quite literally. Sony's AI has defeated top human gamers at the smash-hit racing game Gran Turismo. It seems board games weren't the only domain AI was good at conquering.
Though we are recognizing this AI's victory only now, Sony has been working on its development for over two years. The AI is called GT Sophy, and it is one of the company's most ambitious projects. The electronics giant trained GT Sophy to play Polyphony Digital's Gran Turismo Sport, one of the most popular and entertaining racing games around. All the hard work Sony put into developing the AI bore fruit: it defeated top gamers who had practically mastered Gran Turismo.
If we dig a little deeper, we can learn a great deal about how this all works. An AI system is fed data drawn from real-life examples. Using neural networks, complex structures loosely inspired by the human brain, it can learn how humans behave or which patterns occur most often in a phenomenon. Feeding this data into the network is a process called training, and the quality of the training examples largely determines whether the resulting system is a real game-changer. Based on how GT Sophy performed, it is clear that Sony fed it high-quality data: the trained system now beats humans in a game of open-ended tactical choices. In some respects, this is not an impossible feat for an AI that had been training for two years. We humans have weaknesses that machines lack; it is hard, for example, to control our frustration when a game drags on or the competition turns against us. Machines, by contrast, make fine-grained decisions with a consistency humans usually cannot match. Taking this into account, it is understandable that a machine can handle a tactics-based game better than humans can.
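To make the idea of "training" concrete, here is a minimal sketch of reinforcement-learning-style training, the general family of techniques used to teach agents to play games through trial and error. This is not Sony's actual method: the five-cell "track", the rewards, and all hyperparameters are invented purely for illustration.

```python
import random

N_STATES = 5          # positions 0..4 on a tiny one-dimensional "track"
ACTIONS = [0, 1]      # 0 = stay put, 1 = move forward
GOAL = N_STATES - 1   # the finish line

def train(episodes=500, alpha=0.5, gamma=0.9, epsilon=0.1, seed=0):
    """Tabular Q-learning: learn which action is best in each state."""
    rng = random.Random(seed)
    # Q-table: one row per state, one value per action, all zero at first.
    q = [[0.0, 0.0] for _ in range(N_STATES)]
    for _ in range(episodes):
        state = 0
        while state != GOAL:
            # Epsilon-greedy: mostly exploit what we know, sometimes explore.
            if rng.random() < epsilon:
                action = rng.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[state][a])
            next_state = min(state + action, GOAL)
            # Small penalty per step pushes the agent to finish quickly.
            reward = 1.0 if next_state == GOAL else -0.1
            # The Q-learning update rule.
            q[state][action] += alpha * (
                reward + gamma * max(q[next_state]) - q[state][action]
            )
            state = next_state
    return q

if __name__ == "__main__":
    q = train()
    # After training, "move forward" should be preferred in every state.
    policy = [max(ACTIONS, key=lambda a: q[s][a]) for s in range(GOAL)]
    print(policy)
```

The agent starts with no knowledge and, over many simulated episodes, learns a policy that always drives toward the goal. Scaled up enormously (deep neural networks instead of a table, a racing simulator instead of a five-cell track), this trial-and-error loop is the kind of process that produces game-playing agents.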
GT Sophy is a prime example of ongoing development that continues to inspire further research in this domain. Though beating humans at a game may not seem like a formidable feat, if this approach is extended, we could have remarkable robots that make powerful tactical decisions, whether to save lives or to advance the progress of humankind.