This article originally appeared October 23rd, 2007 in The Escapist
World of Germcraft
On September 13, 2005, a disease outbreak claimed the lives of uncounted thousands and rendered a number of cities uninhabitable. The disease was not SARS or Avian Influenza; it was Corrupted Blood, a debuff cast by Hakkar in Zul'Gurub, what was then World of Warcraft's latest instance.
It was supposed to be simple. Over time, Corrupted Blood would do damage to a player while simultaneously passing itself to other characters in close proximity. For a higher-level character (such as those capable of entering the instance and taking on Hakkar) the debuff was little more than an annoyance that tied up healing resources. For lower-level characters, it was downright deadly. What's more, it was never meant to leave the confines of Zul'Gurub, and yet it did. This is where the epidemiologists come in.
When Corrupted Blood ran rampant, its patterns of diffusion resembled those of real-life diseases, and scientists took interest. Not only did the disease spread to other users, but there were carriers who, both knowingly and unknowingly, were able to take the malady across great distances via their infected pets. The parallels between World of Warcraft's plague and potential outbreaks in the real world were so compelling that the CDC requested the simulation data.
But the data from the Corrupted Blood epidemic is only so useful. The face-to-face social interaction that World of Warcraft presents is extremely limited; between guild chats and global channels, there's little reason to come together in actual gatherings. Avatars may bump into each other on the bridge between the auction house and bank in Ironforge, but the main attraction still lies in the remote wilds and dungeons of Azeroth.
An alternative to studying diseases in World of Warcraft would be to look at some of the more "realistic," socially-engaging games, like Second Life, Active Worlds and There. Instead of a heavy focus on combat, these games are more freeform and place an emphasis on user-generated content and social interaction. Worlds like these bear a closer resemblance to our world and are seemingly better candidates for epidemiological research.
Second Life's scripting language offers epidemiologists the opportunity to personally engineer their own diseases. By developing a disease the way a programmer writes a virus, epidemiologists can exercise control over how a disease transmits itself, what symptoms (if any) it manifests and how it could potentially mutate. The real advantage of virtual epidemiology is that these games have a human influence that just can't be simulated by computers. People do illogical and unpredictable things, things that have a strong influence on when and where a disease can be transmitted. However, World of Warcraft trumps everything else on the market in terms of critical mass, and a large population may be a more important criterion of realism than gameplay.
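The kind of engineered disease described above can be sketched as a toy agent-based model. To be clear, this is not Second Life's actual scripting language (LSL) or any real game's API; the transmission, symptom and mutation parameters are illustrative assumptions, and the contact function is a stand-in for whatever proximity data a world actually exposes.

```python
import random

class Disease:
    """Toy pathogen with tunable transmission, symptoms and mutation."""
    def __init__(self, transmit_prob=0.3, symptomatic=True, mutate_prob=0.01):
        self.transmit_prob = transmit_prob  # chance per contact per tick
        self.symptomatic = symptomatic      # whether infection is visible
        self.mutate_prob = mutate_prob      # chance a transmission spawns a variant

    def maybe_mutate(self):
        # Each transmission has a small chance of producing a variant
        # with a scaled transmission rate.
        if random.random() < self.mutate_prob:
            return Disease(min(1.0, self.transmit_prob * random.uniform(0.5, 1.5)),
                           self.symptomatic, self.mutate_prob)
        return self

def step(avatars, contacts):
    """One tick: each infected avatar may pass its strain to its contacts."""
    newly = {}
    for a, strain in avatars.items():
        if strain is None:
            continue
        for b in contacts(a):
            if avatars.get(b) is None and random.random() < strain.transmit_prob:
                newly[b] = strain.maybe_mutate()
    avatars.update(newly)

# Usage: ten avatars milling in one crowd, avatar 0 seeded with the disease.
random.seed(42)
avatars = {i: None for i in range(10)}
avatars[0] = Disease()
for _ in range(5):
    step(avatars, contacts=lambda a: [b for b in avatars if b != a])
infected = sum(s is not None for s in avatars.values())
print(infected)  # number of infected avatars after five ticks
```

Because transmission, symptoms and mutation are just parameters here, a researcher could dial each one independently — exactly the kind of control the article describes.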
But if researchers really want to create a lasting, believable experiment, they'll need to work with a game's developers. A research team could create a disease in Second Life by itself, but it wouldn't have access to any tracking data Linden Lab may have. Instead, by working with developers, epidemiologists could enjoy more control over a simulation and its initial variables.
The chances of something going wrong would also be minimized if the developers were involved. Development teams could help bug-test experiments the same way they do for upcoming content. Even better, researchers could infect people without the player-base's knowledge; a behind-the-scenes "simulation layer" could keep track of infected characters without having any noticeable effect on the characters themselves. In this way, an infected character could progress through a disease's stages without actually dying in-game.
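A hidden layer like that could be implemented as a side table keyed by character ID, with a tick function that advances each character through stages without ever touching the character itself. This is a hypothetical sketch — no MMOG actually exposes an interface like this, and the stage names and durations are invented for illustration.

```python
from enum import Enum, auto

class Stage(Enum):
    INCUBATING = auto()
    CONTAGIOUS = auto()
    RECOVERED = auto()  # stands in for in-game "death" without killing anyone

# Hypothetical stage durations in ticks -- illustrative numbers only.
DURATIONS = {Stage.INCUBATING: 3, Stage.CONTAGIOUS: 5}

class SimulationLayer:
    """Tracks infections in a side table; never modifies game state."""
    def __init__(self):
        self._infected = {}  # character_id -> (Stage, ticks_in_stage)

    def infect(self, character_id):
        self._infected.setdefault(character_id, (Stage.INCUBATING, 0))

    def tick(self):
        # Advance every tracked character by one tick, promoting stages
        # once their duration has elapsed.
        for cid, (stage, t) in list(self._infected.items()):
            t += 1
            if stage in DURATIONS and t >= DURATIONS[stage]:
                stage = (Stage.CONTAGIOUS if stage is Stage.INCUBATING
                         else Stage.RECOVERED)
                t = 0
            self._infected[cid] = (stage, t)

    def stage_of(self, character_id):
        return self._infected.get(character_id, (None, 0))[0]

# Usage: infect one character and watch it incubate without dying.
layer = SimulationLayer()
layer.infect("Thrall")
for _ in range(3):
    layer.tick()
print(layer.stage_of("Thrall"))  # Stage.CONTAGIOUS after incubation
```

Because the layer only reads character IDs and proximity, the game itself never changes — which is the whole point of running the experiment invisibly at first.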
Eventually, though, players will need to be made aware of the diseases and the consequences infection brings. The threat of a detrimental effect can change people's behavior; these changes in patterns would have an obvious effect on a disease's transmission.
Of course, the anonymous nature of the internet can create atypical wrinkles where players purposefully infect others, and with more people come more Typhoid Marys. The Corrupted Blood outbreak may have been caused by players intentionally using their pets as containers to carry the disease into populated areas. However, this behavior, while more prevalent online, is not wholly incongruent with how people work. It's not rare for someone carrying a highly contagious disease to board an airplane and spread it far and wide; germ warfare is as old as conventional warfare, and bioterrorism is an increasing concern.
Regardless of their goals, it's important that scientists make getting sick fun. If players don't enjoy the experience, they'll just stop playing the game. In traditional MMOGs, where there is an emphasis on character advancement, scientists and developers could include the experiment in a content update as a special global event. Players could be rewarded with experience or items at different levels based on their participation in the study. In more open games, players could be given the opportunity to play the roles of first responders, doctors, even insurance carriers. Tailoring the experiment to the environment will encourage players to participate.
Experiments involving MMOGs don't just benefit researchers and developers, they benefit society and the videogame industry as well. Getting involved in the world's health makes games look good. By doing something to benefit others, much like the Child's Play charity and Folding@Home, gaming-based research can go a long way toward promoting games in the public eye.
MMOGs are more than mere distractions. They're social simulations, miniature economies and living worlds. They're legitimate scientific tools. Once researchers begin using them seriously, they could help make the world a better place.