Steamboxed.com explores Valve’s Biofeedback patent, and shows us how the gaming industry giant will tap into your emotions and alter your games forever.
‘Biometrics’. It’s a word that somehow reaches into the future and tells us that we are going to be more closely tied to the technology we create. But before it permeates our everyday lives, Gabe Newell will make sure it’s a part of our video game experience. Before we start, let’s try to define biometrics:
a branch of biology that studies biological phenomena and observations by means of statistical analysis.
Well, that sums it up pretty well, but the etymology of the word is much more graceful: bios, Greek for life, and metron, Greek for measure. Life-measurement.
So just how does Valve Software intend to deliver biometric capabilities in their new console? As it turns out, on January 13th, 2011, a patent titled Player Biofeedback for Dynamically Controlling a Video Game State was published detailing exactly that. The beginning of the abstract states the following:
Various embodiments are directed towards employing one or more physical sensors arranged on or in proximity to a video game player to obtain biofeedback measures that are then useable to dynamically modify a state of play of a video game.
That sounds great on paper, but it doesn’t explain anything in detail. Let’s explore this patent and see what we find.
Altering the State of the Game
If we take a look at an emotionally tense game, like Silent Hill for instance, and we’ve just had the living daylights scared out of us, this would be the perfect moment for Valve’s biometric calculations to take it one step further and scare us to death. As another example, let’s imagine the game catches you zoning out while farming for materials in Far Cry 3: why not spawn a few sabotage-inspired pirates to keep you engaged? This is the basic premise of the Biofeedback API (BAPI). Check out the flow chart below for a visual explanation:
Even though the figure above shows only a limited version of the capabilities, it does show that decisions can be made to change the outcome of a game based on your level of “arousal”. So what is it that Valve will capture from our bodies that gives away our emotions and physical state? In Claim 3 of the published patent we find this:
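To make that decision logic concrete, here’s a minimal sketch in Python. The patent doesn’t define any API surface, so the function name, the 0.0–1.0 arousal scale, and the thresholds here are all invented for illustration:

```python
# Hypothetical sketch: adjusting game state from a biofeedback "arousal" score.
# The names, thresholds, and 0.0-1.0 scale are invented; the patent describes
# the idea of arousal-driven state changes, not a concrete interface.

def adjust_game_state(arousal, game):
    """Pick a game-state change based on the player's arousal level (0.0-1.0)."""
    if arousal < 0.2:
        # Player is zoning out: spawn a few enemies to re-engage them.
        game["enemies"] += 3
        return "spawn_enemies"
    elif arousal > 0.8:
        # Player is already terrified or overwhelmed: ease off the pressure.
        game["enemies"] = max(0, game["enemies"] - 2)
        return "ease_off"
    # Arousal is in a comfortable band: leave the game alone.
    return "no_change"

game = {"enemies": 4}
print(adjust_game_state(0.1, game))  # bored player -> spawn_enemies
print(game["enemies"])               # 4 + 3 -> 7
```

The interesting design question is the middle band: a game that reacts to every blip in heart rate would feel twitchy, so some dead zone between “bored” and “terrified” seems inevitable.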
…the biofeedback measures includes at least one of a heart rate, a heart rate variability, a blood oxygen level, a skin conductance level, a respiration rate, a skin tension, a voice stress level, a blood pressure, facial expressions, body temperature, pupil dilation, an eye movement, gestural motion, an Electroencephalography (EEG) or other neural imaging measure, or an Electromyography (EMG) measure.
Some of these data points even suggest that we would be wearing some kind of apparatus on our heads (I suggest they call it the Headcrab if it goes to market). With Gabe showing interest in wearable computing, it may be possible.
But hold on, that is a lot of data collection. There needs to be some sort of interface to process all of these data points, and that’s where the aforementioned Biofeedback Application Programming Interface (BAPI) comes into play. To give us an idea of what kind of data can be pulled from all of these measurements, we can take a look at some example queries provided in the patent:
Those are some very revealing queries that show us how Valve will integrate with our future games. It is now very clear that biometrics could affect so many facets of gaming that it’s limited only by the creativity of game developers. The most interesting to me is the IsPlayerLying() query, made possible by the data points collected. How cool would it be to try to lie to a character in a game, or to have multiplayer elements where you can discover that you are being lied to?
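For a feel of how a developer might consume such queries, here is a hedged sketch. IsPlayerLying is the only query name taken from the patent; the class name, constructor, sensor fields, and the toy heuristic are my own invention:

```python
# Hypothetical BAPI-style interface fed with canned sensor data.
# Only IsPlayerLying comes from the patent; everything else is invented.

class FakeBAPI:
    """Stand-in for a biofeedback interface, for illustration only."""

    def __init__(self, heart_rate, skin_conductance):
        self.heart_rate = heart_rate            # beats per minute
        self.skin_conductance = skin_conductance  # normalized 0.0-1.0

    def IsPlayerLying(self):
        # Toy heuristic: elevated heart rate plus elevated skin conductance,
        # roughly the signals a polygraph leans on.
        return self.heart_rate > 100 and self.skin_conductance > 0.7

bapi = FakeBAPI(heart_rate=112, skin_conductance=0.85)
if bapi.IsPlayerLying():
    print("NPC: 'You're lying to me, aren't you?'")
```

A real implementation would presumably baseline each player first, since resting heart rate varies wildly from person to person, but the shape of the call from the game’s side would look much like this.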
I wish we could see screenshots of the prototype controller mentioned in the Nerdist Podcast with Gabe Newell, but for now we can take a look at the illustration below and see the obvious components that make up the biofeedback “package”.
We can see a head apparatus, a controller, and a camera, which appear to be the main peripherals of the biofeedback collection and which could support all of the data points listed earlier. Here is a wild guess as to what each peripheral might collect:
| Peripheral | Possible measurements |
|---|---|
| Head apparatus | Electroencephalography (EEG) or other neural imaging, Electromyography (EMG) |
| Controller | heart rate variability, blood oxygen level, skin conductance level, respiration rate, skin tension, blood pressure |
| Camera | voice stress level, facial expressions, pupil dilation, eye movement, gestural motion |
Sooner Than Later
With Gabe Newell making several statements about the potential of biometrics, I would dare to say that some of the ideas in this patent are coming to your living room much sooner than we may realize. The technologies have been here for years, but no one has taken the time to innovate with them until now. Games that know our emotional state and keep us engaged will bring us closer to the stories they convey, a change that is welcome in my view.
What are your thoughts on biofeedback? What are some biometric scenarios that may end up in our games in the future? I would like to hear what you have to say in the comments below.
This article only scratches the surface; to view the entire patent, hit this link: http://www.faqs.org/patents/imgfull/20110009193_02