Monsters, superheroes and video games: is AI real?

Much like Sesame Street’s Cookie Monster, AI is a data monster, one that can devour data (and process it) at breakneck speed. It takes thousands of images of cats for a deep learning algorithm to understand what a CAT is, or, to paraphrase the Greek philosopher Plato, the “felinity” of the cat. And even after ingesting those thousands of images, our data monster is still confused (I’m sure Cookie Monster would never be confused about recognizing Oreos).

Search online for images of “Blueberry Muffin or Chihuahua” or “Sheepdog or Mop” and you’ll get a clue. Come to think of it, and I’ve never done this before, a Chihuahua’s face often looks oddly like a blueberry muffin! But even then, if a blueberry muffin and a Chihuahua were in a police line-up and you had to identify the suspect, I suspect you would be able to tell the difference. AI might not, and yet governments are asking us to trust AI in policing!

Before we get to what gives, let’s briefly think about the poor soul who had to sift through 10,000 photos of cats and label them “cat.” Think of the utter boredom of the man or woman doing this for a living… labeling photo after photo as “cat” or “dog” or really anything else. Repeating “meow” and “bow wow” for your child is one thing; labeling shot after shot for a machine is quite another. These are the secret superheroes who keep the next generation of AI running.

Let’s think of something worse. We don’t want extreme violence on video streaming services. We don’t want terrorist ideology propagated through microblogging sites. We don’t want the electorate brainwashed with fake news on social media. But how would the microblogging platform, video-sharing service or social media site know that it is being used for something so nefarious? And no, the answer is not necessarily AI. It is, again, the work of human beings. When users flag videos or posts often enough, human reviewers step in to decide whether that content should be removed (decisions which, in turn, can train AI in the future). Let’s briefly consider the mental health of the highly paid Silicon Valley FAANG employee (or, indeed, the extremely poorly paid FAANG contractor sitting in Gurgaon) whose job it is to tag these videos, and who is continually forced to watch content too vile even for the cesspool that is the World Wide Web. Can AI help them? Or, on a more positive note, can AI start driving cars, if these unsung superheroes tag enough street videos to help it tell the difference between a human and a rock, or between a cat crossing the street and a blueberry muffin that some hooligan kid threw out of a car window? In short, can AI help create the data that supports the future of AI?

The answer to that might be video games. Think of a gaming platform where you can create near-perfect racing games. Let’s say a big pothole has been placed in your path, or a cat is crossing the street, and you want to avoid them. The signal that tells your brain to swerve sharply to the right can also be fed to the algorithm of a self-driving car. Of course, the computer knows the pothole in the game is a pothole, because it is already tagged in the background (unlike in real life, where it has to be tagged manually). Likewise, the cat in the game has been built in 3D, with the different aspects of its felinity already captured. So when you train an AI algorithm to drive on these streets, you won’t need a human to label anything. Imagine a virtual reality flight simulator used to teach AI to fly. This is synthetic data, and it is now the next big thing in the endless search for ever more data to feed the AI monster. A rough sketch of the idea follows below.
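For the technically curious, here is a minimal sketch in Python of that idea. The `Simulator` class and its methods are entirely hypothetical stand-ins for a real game engine; the point is only that the engine already knows where every pothole and cat is, so the “labels” are simply read out of its scene description rather than typed in by a human.

```python
# Minimal sketch: harvesting auto-labeled training data from a game engine.
# "Simulator" and its methods are hypothetical placeholders, not a real API.

import json
import random
from dataclasses import dataclass, asdict


@dataclass
class LabeledFrame:
    image_path: str   # rendered frame from the in-game camera
    boxes: list       # [(label, x, y, width, height), ...] in pixels


class Simulator:
    """Hypothetical game-engine wrapper: it placed every object itself,
    so 'labeling' is just reading back what it already knows."""

    def spawn_scene(self, seed: int) -> None:
        random.seed(seed)   # procedurally place potholes, cats, pedestrians

    def render_frame(self, path: str) -> None:
        ...                 # render the camera image and save it to disk

    def scene_objects(self):
        # The engine knows the 2D bounding box of every object it drew.
        return [("pothole", 412, 300, 60, 40), ("cat", 180, 280, 50, 45)]


def generate_dataset(n_frames: int) -> list[LabeledFrame]:
    sim = Simulator()
    dataset = []
    for i in range(n_frames):
        sim.spawn_scene(seed=i)
        image_path = f"frames/frame_{i:05d}.png"
        sim.render_frame(image_path)
        # No human labeler needed: labels come straight from the engine.
        dataset.append(LabeledFrame(image_path, sim.scene_objects()))
    return dataset


if __name__ == "__main__":
    frames = generate_dataset(n_frames=3)
    print(json.dumps([asdict(f) for f in frames], indent=2))
```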

And remarkably, the creation of this synthetic data is now automated. In early 2022, NVIDIA (the company that makes graphics cards for gamers, now used so heavily in deep learning) unveiled Instant NeRF, a fast take on Neural Radiance Fields. Instant NeRF can take a small set of photographs of a cat (or anything else, really) and create a realistic three-dimensional synthetic cat. So when training AI algorithms, you might have a cat running towards or away from a car, a cat that looks like a blueberry muffin, or the side profile of a cat, and all of them will be labeled correctly out of the box, because they were created by an algorithm in the first place, and the AI will know the cat is to be avoided. Once it learns this on the streets of Need for Speed, it can replicate that effectiveness in real life as well.
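To make that last step concrete, here is another small Python sketch, this time of turning a single reconstructed 3D asset into many labeled training images. The functions `load_reconstructed_asset` and `render_view` are hypothetical placeholders for whatever NeRF-style tool produced the 3D cat; the takeaway is that every rendered view inherits its label for free.

```python
# Minimal sketch: one reconstructed 3D asset -> many auto-labeled images.
# The loading and rendering functions below are hypothetical placeholders.


def load_reconstructed_asset(name: str):
    # Placeholder for a NeRF-style 3D reconstruction loaded from disk.
    return {"name": name}


def render_view(asset, azimuth_deg: float, out_path: str) -> None:
    ...  # render the asset from the given camera angle and save the image


def synthesize_labeled_views(asset_name: str, n_views: int):
    asset = load_reconstructed_asset(asset_name)
    samples = []
    for i in range(n_views):
        azimuth = 360.0 * i / n_views          # walk the camera around the cat
        path = f"synthetic/{asset_name}_{i:03d}.png"
        render_view(asset, azimuth, path)
        samples.append((path, asset_name))     # the label comes from the asset itself
    return samples


if __name__ == "__main__":
    for path, label in synthesize_labeled_views("cat", n_views=8):
        print(label, path)
```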

Sure enough, the monster is fed, our secret superheroes can find more useful missions (and ways to earn a living), and our cats can cross the street in relative safety thanks to video games.

Disclaimer

The opinions expressed above are those of the author.
