Immersion - Games influence our emotions most when we are immersed in them. One thing that can take a player out of an experience is the presence of the HUD (heads-up display). The HUD is usually vital for gameplay, conveying important information about direction and resources (like health and ammunition). Some games, like Dead Space, use a "diegetic interface," meaning the interface exists within the game world and is visible to the characters themselves.
In Dead Space, protagonist Isaac's health and kinesis levels are represented by meters on his in-game suit, and his inventory is displayed as a hologram projected by the suit. One way physiological input could aid immersion is to use eye tracking as a biofeedback modality, detecting when a player's eyes search the corners of the screen that usually contain HUD elements. While the player is focused on the action, the HUD would disappear; when they glance at a certain corner, the game would present that information unobtrusively.
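The gaze-driven HUD idea above can be sketched as a simple screen-region test. This is a hypothetical illustration, not any real engine's API; the function names and the size of the corner regions are assumptions.

```python
# Hypothetical sketch: show the HUD only when the player's gaze enters a
# screen corner that normally holds HUD elements. The 15% corner size is
# an illustrative assumption.

CORNER_FRACTION = 0.15  # each corner region covers 15% of width and height


def hud_visible(gaze_x, gaze_y, screen_w, screen_h):
    """Return True if the gaze point (in pixels) falls in any corner region."""
    near_left = gaze_x < screen_w * CORNER_FRACTION
    near_right = gaze_x > screen_w * (1 - CORNER_FRACTION)
    near_top = gaze_y < screen_h * CORNER_FRACTION
    near_bottom = gaze_y > screen_h * (1 - CORNER_FRACTION)
    # A corner is the intersection of a horizontal and a vertical edge band.
    return (near_left or near_right) and (near_top or near_bottom)
```

In practice the result would likely be smoothed over time (e.g., a fade-in with a short dwell requirement) so the HUD does not flicker as the gaze crosses region boundaries.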
Pacing - In his GDC talk Biofeedback in Gameplay: How Valve Measures Physiology to Enhance Gaming Experience, Mike Ambinder of Valve discusses implementing affective feedback in Left 4 Dead's "AI Director" system, which controls the pacing of its gameplay scenarios. Normally, the AI Director determines the distribution of resources through a level, and the placement and intensity of enemy encounters, according to approximated levels of stress and arousal. Valve has experimented with using players' actual, measured stress and arousal instead, and is optimistic about incorporating affective feedback into its games in the future.
Matchmaking - Online multiplayer games are notorious for frustrating encounters with toxic players. Perhaps users' physiological data could be interpreted as emotional responses to competition and used to build a profile for each player, who could then be matched with players of a similar temperament. Conversely, in a game that rewards playing specific roles on a team, players might instead be matched with teammates who have different but complementary profiles. This idea is adapted from Mike Ambinder's GDC talk.
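Matching by temperament could reduce to a nearest-neighbor search over stored profile vectors. The feature choices (e.g., average arousal, frequency of frustration spikes), the data shapes, and the distance threshold below are all illustrative assumptions.

```python
# Hypothetical sketch: match a player with the pool member whose
# "temperament" feature vector is closest, if it is close enough.
import math


def profile_distance(a, b):
    """Euclidean distance between two temperament feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def find_match(player, pool, max_distance=0.3):
    """Return the closest-profile player in the pool, or None if too far."""
    best = min(pool, key=lambda p: profile_distance(player["profile"], p["profile"]))
    if profile_distance(player["profile"], best["profile"]) <= max_distance:
        return best
    return None
```

For the complementary-roles variant, the same machinery would simply prefer a *larger* distance along the role-relevant features instead of a smaller one.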
Accessibility - In the same GDC talk, Mike Ambinder also demonstrates how Valve configured aiming in Portal 2 to be controlled with eye tracking. Although the Portal series requires dexterity and speed to solve puzzles, its gameplay is simple enough that eye tracking can be used effectively to point at where the player wants their portals to go. Valve has also used eye tracking in playtesting, to determine which aspects of visual design draw players' attention first. Often the environment is meant to subtly convey information to the player, so this can help with level design as well.
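One practical wrinkle with gaze-controlled aiming is that raw eye-tracker samples are noisy, so the aim point usually needs smoothing. A common, simple choice is an exponential moving average; the sketch below is a generic illustration (the alpha value is an assumption), not Valve's implementation.

```python
# Hypothetical sketch: exponentially smooth a stream of (x, y) gaze samples
# so a gaze-driven cursor does not jitter. alpha=0.3 is an illustrative choice:
# higher alpha tracks the eye faster but passes through more noise.

def smooth_gaze(samples, alpha=0.3):
    """Return the exponentially smoothed sequence of (x, y) gaze samples."""
    sx, sy = samples[0]
    out = [(sx, sy)]
    for x, y in samples[1:]:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out
```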
Heartbeat Sensors in Multiplayer - Some Call of Duty multiplayer titles include "heartbeat sensors" that show the location of enemy players within a certain radius. Affective feedback could be used here by detecting a player's actual heart rate and showing their location on the sensor only if it exceeds a certain threshold. This would reward players who keep their cool and stay focused.
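The rule above amounts to a threshold filter over measured heart rates. This is a hypothetical sketch; the 100 BPM threshold and the player data shape are illustrative assumptions, not anything from an actual Call of Duty title.

```python
# Hypothetical sketch: a heartbeat sensor that only reveals in-range enemies
# whose *measured* heart rate exceeds a calmness threshold.

CALM_THRESHOLD_BPM = 100  # illustrative cutoff


def visible_on_sensor(players, threshold=CALM_THRESHOLD_BPM):
    """Return names of players whose measured heart rate exceeds the threshold."""
    return [p["name"] for p in players if p["heart_rate"] > threshold]
```

A real implementation would presumably combine this with the existing radius check and some hysteresis, so a player hovering near the threshold does not blink in and out of view.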
Balance - Another way physiological input could be worked into competitive game design is by balancing the odds based on a metric of "fun." For example, in the Mario Kart series, the effectiveness of the items players pick up on the course depends on how well they are doing in the race. This "negative feedback loop" (a different kind of feedback!) is explained in the embedded video (at 1:54). The system assumes that players who are doing poorly are having less fun, and it balances the game so that less skilled players can still have fun and get a shot at winning. I would propose adding physiological input, interpreted as "fun," as a factor in the quality of items received. That way, players who are doing well could still receive better items if they are frustrated with how the system works against them, and rapid changes in a player's position relative to others would be accounted for.
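The proposal above can be sketched as blending the usual position-based rubber-banding with a measured frustration signal. This is not Mario Kart's actual item table; the weighting, the 0.0-1.0 frustration scale, and the linear blend are all assumptions made for illustration.

```python
# Hypothetical sketch: item quality (0.0 = weakest, 1.0 = strongest) blends
# race position with a measured "frustration" signal (0.0-1.0). The 40%
# frustration weight is an illustrative assumption.

def item_quality(position, total_racers, frustration, frustration_weight=0.4):
    """Return 0.0-1.0 item quality; higher for trailing or frustrated players."""
    position_factor = (position - 1) / (total_racers - 1)  # 0.0 in 1st, 1.0 in last
    quality = (1 - frustration_weight) * position_factor + frustration_weight * frustration
    return max(0.0, min(1.0, quality))
```

Under this blend, a frustrated front-runner gets somewhat better items than a content one, while the trailing players still receive the strongest items overall, preserving the original negative feedback loop.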