Today in “Absurd Tech Stories”: Burger King vs Google

“OK, Google: What is the Whopper burger?”

The internet is all over a story today involving burger giant Burger King and tech giant Google, in which Burger King released a new ad that takes advantage of Google Home, the in-home personal assistant created by Google. The device is similar to Amazon's Echo and its Alexa assistant.

Google Home.

The short commercial, titled “BURGER KING® | Connected Whopper®” (shown below), features a Burger King employee using the phrase “OK, Google” to deliberately trigger in-home devices or mobile phones with Google voice search, making them run a Google search for the Whopper. On the surface, this comes across as a pretty clever marketing ploy by BK, taking advantage of current tech trends to make the commercial more relatable.

However, in true internet fashion, those who wanted to have a little fun caught wind of the ad and quickly turned this innocent commercial into something a little more ridiculous.

Asking Google Home “OK, Google: What is the Whopper burger?” gives the user a description read from the opening of the current Wikipedia article, as it does for anything searched in this fashion. Users who wanted to mess around started editing that first line, making it claim things like that the Whopper’s main ingredient is cyanide or that the burger is “cancer-causing,” which the device would then read aloud whenever someone ran the voice command.
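How does the device come up with its answer? Google hasn’t published the exact pipeline, but the part the pranksters could reach is just the lead section of a Wikipedia article, which anyone can fetch through Wikipedia’s public REST summary endpoint. As a rough sketch (in Python, assuming the third-party requests library is installed), something like this returns the same opening text that was being edited:

    import requests

    def wikipedia_lead(title: str) -> str:
        # Wikipedia's REST summary endpoint returns the article's lead
        # section as plain text in the "extract" field.
        url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title}"
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
        return resp.json()["extract"]

    # Whatever the first lines of the "Whopper" article say right now
    # is what gets printed -- and, in April 2017, read aloud.
    print(wikipedia_lead("Whopper"))

Because the answer tracks the live article, a quick Wikipedia edit was all it took to change what Google Home said out loud.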

Within three hours, Google had modified its voice detection so that it no longer responded to the Burger King commercial at all. Users could still ask the device the same question themselves, but it seemed that Google didn’t take too kindly to the small disturbance the commercial was causing and shut it down as fast as it started.

Stories of internet trolls taking advantage of AI programs have become more and more prevalent in recent years. In March of 2016, Twitter users managed to manipulate Tay, Microsoft’s Twitter chatbot (tay.ai), into making remarkably inflammatory and inappropriate comments.


The commercial can be viewed here:

https://www.youtube.com/watch?v=U_O54le4__I