
Mike Sugarbaker

Design for the user who's over your crap

3 min read

It’s happening again as I write this, with tilde.club: at first people were excited about the stripped-down, back-to-basics user experience of a plain UNIX server designed for serving web pages, and the fact that logged-in users could chat at the command line gave the place the feeling of an actual social network. But now the initial excitement is spinning down and people are updating their pages less often; whether the chat is still hopping, I couldn’t say – I don’t have an account – but I guarantee you it’s changing.

What do we need from the social network that’s next, the one that we actually own? (You could argue as to whether it’s coming, but no need for that right now.) I propose that the moment we get bored is the most important moment for the designer of an app to consider. Right? Because what’ll people do with whatever revolutionary new web thing you put in front of them? If my experience on both sides of the transaction is any guide, they’ll probably get sick of it, and fast.

There are so many kinds of boredom, though. There’s the smug disappointment of paddling your surfboard over to what looks like the next wave, only to find that it “never” crests. A more common pair, though: there’s the comedown – when something was legit exciting but then the magic leaves – and then there’s the letdown, when something seems exciting at first blush but you investigate and find the glamour was only skin deep. Most systems have more to fear from the latter. New systems that are any good, though, don’t often have a plan for the former. Distributed social networking needs one.

What do people need at first, and then what do they need later?

At first:

  • Co-presence (hanging out)
  • Discovery (more and more users!)
  • Things to play/show off with (hashtags, what have you)

Later:

  • Messaging (purpose-driven – I need to get hold of *that* person)
  • Defense (from spam, griefing, and attention drains of various kinds – generally, but not entirely, from the public)
  • Things to use and enjoy (tools and toys that aren’t purely social)

One’s needs from the first list never go away, exactly. You’ll always want to bring something up to the group now and then (where “the group” is whoever you’re actually personally invested in conversation with), and play and discovery don’t die. But we see so much more design for that first list – probably because a commercial social network needs to privilege user acquisition over user retention… or thinks it does. And as a whole culture we are only now coming around to the importance of designing for defense, despite the evidence having been here for 35 years.

It’s hard to keep coding when the bloom is off the rose of a project. One way to keep yourself motivated, when the work is unpaid, is to take the perspective of that un-jaded, excited new user, discovering and fooling around. This naturally leads to features that appeal to that mindset. A major obstacle we face in developing the decentralized, user-owned, permanent social network is making fast progress – which that excited mindset fuels – while also holding onto the later-stage mindset that will result in a grownup network for grownups.

Mike Sugarbaker

Conservatism and roleplaying

8 min read

There’s this story that you hear people tell, of a lost glorious age taken away by those with no right to it, and its last, struggling few defenders. This lost age is a time when there was no challenge to, by which I mean not even the smallest noticeable difference from, a standard hierarchy of power. All difference is challenge, you see, because this person, this storyteller who values this lost age, is so closely identified with their own power that any possible attack on it might be an attack on their very selves. It turns out that the most important job of conservatism is to protect “the private life of power”: the intimate insults, whether in the home or on the nightly news, that stop masters (or those who think of themselves as masters in training) from feeling like masters. “Every great political blast – the storming of the Bastille, the taking of the Winter Palace, the March on Washington – is set off by a private fuse: the contest for rights and standing in the family, the factory, and the field. […] That is why our political arguments – not only about the family but also the welfare state, civil rights, and much else – can be so explosive: they touch upon the most personal relations of power.”

This analysis, like the quotes above, comes from Corey Robin’s The Reactionary Mind, a polarizing book for people on both sides of the ideological fence. Lots of folks on the American left believe that the red-meat culture-war side of right-wing politics is just a cover story, a theatrical shell over their real, merely corporatist agenda. Robin proposes not only that the two conservative agendas are really one, but that the people who espouse them are not crazy; instead, they have a large and well-constructed body of philosophy behind them – they just see no problem with its being built on an idea as sick as “some are fit, and thus ought, to rule others.” This possibility frightens a lot of middle-class progressives, because it means that we will have to fight after all, and fight hard. The liberal middle class hates fighting. We hate the thought that we can’t all just get along if we finally explain the facts well enough.

This anxious aversion to conflict, I have to admit, is probably what has driven a lot of my online research into roleplaying. You’d think that when it comes to games, the stakes would be so low that there wouldn’t be much fighting, and certainly not much anxiety over it. But many people in online RPG-discussion circles seem to have a permanent hate-on for new-style gaming, to the point where some have made that hatred a banner of online identity. It’s confusing, at first blush; I mean, how can people not grasp that they can just go on playing whatever they like? Why react not just so strongly, but so persistently? It isn’t confined to the internet, either. Many folks in the real world who’ve tried to introduce new games or gaming techniques to traditional roleplayers have been rebuffed with accusations that seem out of all proportion.

My anxiousness has declined a great deal since I’ve realized what’s going on: roleplaying games, to date, have generally embodied a number of power relationships – between players and the fiction, and between players and the GM. For the last forty years, the roleplaying hobby has invested most of its hopes for any feeling of fairness – in the loosey-goosiest game ever invented – in the role of the game master, or most often the Dungeon Master. The GM/DM has been invested not only with the final say over any matter that comes up for adjudication, but with control over the game’s opposing forces. Players venerate the people who manage this conflict of interest well, while anyone who can’t – despite being given precious little systemic support for doing so – has, over the life of the hobby to date, mostly just been shamed.

It’s also been traditional for a long time for the GM to be the social host of the game, and to decide who is invited to be part of the game and who’s not. Since one incompatible player can ruin everyone’s fun and a lot of players regard play opportunities as a scarce, valuable commodity, the GM role can be a massive source of social power.

On top of these, there’s the power of the storyteller. In some RPG subcultures, the GM is expected to be the main driver of the narrative. If players want to do anything of great consequence to the plot, they can’t just up and do it – they either need to cooperate deliberately with the GM, or they simply understand that what they’re at the table to actively do is something else (perhaps fighting the monsters that have been placed in the encounter, perhaps just being a bystander to a good story). All fine, and all perhaps necessary when the rules don’t much help all players get a satisfying story simply through their play actions, but all certainly adding to the social power of the GM role. Great storytellers are respected across cultures.

The GM-and-players relationship is not the only power relationship in RPGs. A player who has mastered the rules, or other skills required to play well, successfully enough to get whatever he or she wants out of the game, gets many forms of power, including some social ones. In a collaborative game like a traditional RPG, do you help the other players when they struggle with rules? When, and on what terms? Do you help them with strategy, or do you deride them as dragging the group down? What if you don’t have that mastery and your contributions to the game are getting blocked by people who do – do you then build a relationship with the GM, such that you depend on her to keep that blocking player in check so you can contribute?

These are all power relationships that invite personal identification. How often do we hear GMs identify as such, almost like it’s an ethnicity? How often do they talk about “their players” in a vaguely or explicitly paternal way? And in the end, what identification could be more personal than one’s role in a game full of stuff made up by oneself and one’s friends? Especially a game that’s not essentially different from the game you played for countless hours in your childhood?

So, you have people who for whatever reason are closely, personally identified with their position of power at the gaming table – no matter whether that position is high or low. Non-RPG story games upset these positions. They become a threat.

I am not saying that people who defend traditional RPGs necessarily hold conservative politics in other arenas, although as I’ve said elsewhere, it shouldn’t be forgotten that D&D was born amongst Midwestern armchair generals who didn’t like hippies much. RPGs also quickly found cultural footing in the science fiction and fantasy fandoms, which have their own strong currents of conservatism to this day. But conservatism can also be quite compartmentalized; you might have no beliefs about a natural status order of economic roles, but strong ones about an order of genders, as one example. (Not to forget liberal activists who end up showing off, and defending, their privilege – nor people who identify destructively with a permanent role of outcast or spoiler.)

I’m also not trying to make the problems with our conversations about RPGs out to be bigger or more important than they really are. It’s enough, to me, that RPG conservatism poses problems for anyone who wants to work towards a better hobby-wide conversation, find players for new games, or even just search on Google for more information about them. Not even coming up with the new term “story games” can help us with that one forever.

(By the way, all of the above also explains why D&D edition wars will continue, despite almost every edition of D&D currently being back in print.)

Mike Sugarbaker

MisubaTwine CompetiFest 20B

1 min read

I am holding a Twine game design competition.

Entries are due by midnight PDT on Friday, April 19. Send me some mail and either attach the game or give me a link. Put [Twine] in the subject line of all entry emails.

I will be judging all entries and selecting a winner. Judgment criteria include innovation in use of Twine mechanics, replay value, and expressiveness/awesomeness/tendency to make milk come out of my nose.* Bonus points for incorporating something I’ll recognize from story gaming but not being too hammy about it.

There will be a prize, valued at approximately $40 and not very useful. I haven’t selected it yet.

I’ll be updating this post as needed with further news. Send email or come find me on G+ if you’re dying to discuss something.

* I don’t drink milk.

Mike Sugarbaker

Making your HTML5 and CSS internationalization-friendly

2 min read

Someday, one of these crazy web things you make will catch on – yes, it will! I believe in you! – and when you aren’t busy freaking out about scaling it up, you will maybe want to spend a couple minutes thinking about writing it so the rest of the world can read and use it. Here are some ways to do your future self a favor when you do your markup and styles.

Give descriptive IDs to all the things. Any JavaScript that’s gonna translate the non-core text on your page is gonna need to find it first. Even if you’re normally into using fewer IDs, consider the way you’d want to work if you had to write a translation tool (or a translation). No, lengthy CSS selectors are probably not that way. Nice, human-readable ID attributes that succinctly describe the text contents are the way to go.
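For instance – a quick sketch, with ID and class names I just made up:

    <!-- Friendly to a translation script: -->
    <p id="signup-blurb">Enter your email to get started.</p>

    <!-- Unfriendly: a translation tool has to lean on a brittle selector
         like "div.content div p" and hope the layout never changes. -->
    <div class="content"><div><p>Enter your email to get started.</p></div></div>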

Charsets, punk. UTF-8 declarations are part of HTML 5 Boilerplate and other templates more often than not, but just to make sure, check that you have <meta charset="utf-8"> in your <head>. If you’re rocking a proper HTML5 doctype, that should be how to write the <meta> tag.
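Just so we’re all picturing the same thing, the minimal skeleton looks like this (title text is obviously a placeholder):

    <!DOCTYPE html>
    <html>
    <head>
      <meta charset="utf-8">
      <title>Your crazy web thing</title>
    </head>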

Watch margin-left and margin-right. Either avoid calling out these specific properties in your styles, or make sure your special-case classes or alternate sheets for RTL will have no trouble overriding them. Don’t go around sticking !importants on things that won’t be important in Mandarin. Bear in mind that in the future, margin-start (currently -moz-margin-start and -webkit-margin-start, and similarly named -end properties) will automatically apply to the right thing in RTL or LTR situations. But right now it’s good for impressing people in job interviews and that’s about it.
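Here’s a sketch of what override-friendly looks like – the class name is made up, but the selector pattern is real:

    /* The LTR default calls out one side, but skips !important... */
    .pull-quote { margin-left: 1.5em; }

    /* ...so the RTL rules can cleanly flip it: */
    [dir="rtl"] .pull-quote { margin-left: 0; margin-right: 1.5em; }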

How will those fancy new CSS properties know when things are RTL, you ask? (I had to ask this, so don’t feel bad. Or feel bad, but do it for me.) CSS3 has a direction property that takes rtl but defaults to ltr. Also there’s the dir HTML attribute that takes the same values, which has been around a while but is now (in HTML5) kosher to use on absolutely any tag you like. Look ’em up for more.
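To make that concrete – both of these are bog-standard, no vendor prefixes required. The attribute version:

    <html dir="rtl" lang="ar">

And the CSS version (the selector here is hypothetical):

    blockquote.quoted-tweet { direction: rtl; }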

Mike Sugarbaker

Roleplaying and the brain: when you can't "just roleplay it out"

8 min read

There’s a thing psychologists call the fundamental attribution error. You could summarize it (when married with its cousin, self-serving bias) as “I Messed Up For A Good Reason, You Messed Up Because You Just Suck.” Specifically, the reasons we give when we mess up tend to be external factors, rather than some internal quality we identify with, whereas the reasons we assume for other people’s mistakes or offenses are internal rather than external – inherent to who they are. We make this mistake in part because we have access to our own subjective experience but not other people’s; if we did have that access, it would tell us a lot about what’s really going on with them.

Now, when you play a roleplaying game, there’s an other person: your character. To the degree that you aren’t just treating your character like a pawn, you have to do some thinking about the reasons for what they do, because you’re deciding what they do. But they don’t exist; they’re only in your head.

It’s okay; you can be just as wrong.

My theory, for which I have some backup, is that we don’t have nearly as much access to our roleplaying characters’ subjective experiences as we think we do. See, the external conditions that surface into our conscious awareness when we make decisions aren’t the only conditions that apply. There are dozens (this may be low by an order of magnitude or two) of involuntary chemical responses our brains have to things, especially stressful things. They are far enough outside of conscious control that you might as well call them external – they’re certainly external to the conscious volition of everyone but the very highly trained – and they are often the most salient conditions for the kinds of judgments we make when we make the FAE. I am talking especially about fear and embarrassment.

These chemical responses can turn into conscious feelings in unexpected, and unexpectedly changeable, ways. I’ve been seeing the story make the rounds lately of a 1974 study (the same year as the first publication of D&D!) in which men who had just walked across a famously frightening suspension bridge were asked a set of questions just afterwards by an attractive female interviewer who offered her phone number to the men for follow-up questions. A control group was given time to recover normal heart rate and such before being approached. Men in the control group were significantly less likely to call the number and ask for a date. The slight shift in context – hey, attractive woman! – took the neurological arousal of fear and put it to an entirely different conscious purpose. That’s just one example.

None of this would have any implications for roleplayers if it weren’t for the way we often check in with our characters and imagine what they would do: by stepping into our characters’ heads, and trying to see through their eyes, at least metaphorically. Now, you do find the occasional “immersive” roleplayer who claims to have a trance-like ability to feel what their characters feel. But these claims are unverifiable, and the idea that a master immersor’s brain chemistry would reproduce a natural response with the necessary neurochemical kind and depth strikes me as an extraordinary claim requiring extraordinary proof. At any rate, most of us keep a little more mental distance from our characters while we play.

Suppose you’re “roleplaying out” an in-character debate, in one of those free-form roleplaying sessions where vague, grandly scoped political debates take forever and never resolve. Apply our jokey summary this way: “I caved in the argument because my feelings ganged up on me. You caved in the argument because you just suck.” Except the “you” is your own character, right? They don’t suck! They’re pretty awesome, in fact! So why would they cave in? And so the argument goes on.

Well, actually, they’d cave for the same reason you would: involuntary emotional responses. Maybe not under the exact same terms or at the exact same time, but they won’t be free from those forces, unless their neurology is substantially non-human.

If you want to make realistic decisions on your character’s behalf when your character is in one of many kinds of stressful situations, you must either apply some kind of external constraint (e.g. system), or step away mentally from imagining your character’s conscious volition (that is, think more authorially).

Already, though, I should remind you that the first word in that there commandment is “if.” Realism isn’t the one true yardstick by which story, play, or our weird story/play amalgam called roleplaying must be judged. But (and here we go back to doing aesthetic theory), I think roleplaying needs more realism in this particular neurological arena, for two reasons.

The first is that content wherein the heroes never feel self-doubt, fear, or a single moment’s weakness is trite. It’s fine for kids and, I hasten to add, for games in which you aren’t there for the content – that is, in which the main point is clearly the gamey business of bashing monsters and thinking tactically. (However, the recent history of indie video games shows us that tactical, traditionally heroic gameplay need not conflict with other modes of gameplay that question or even undercut them.) But my primary interest is in those games that value story more highly.

The second is that lack of realistic judgment about our characters’ mental states contributes to many of the classic social problems in gaming that split up groups and drive gamers out, as well as stopping new participants from coming in. I don’t just mean the two-dimensional stories and interminable arguments; I mean things like the rampant sociopathy on characters’ part that tends to creep into many games, due to humans’ inability to feel involuntary shame on behalf of their made-up characters. (The other side of that coin is the ineffectiveness of peer pressure on a fictional character; RPG lore is also full of tales of the one player who insisted on making their character follow a rigid moral code, screwing up the other players’ fun. In reality, a holy knight who went around adventuring with a bunch of miscreants would find themselves acquiescing to all but the most horrible crimes in pretty short order.) Basically, besides all the other problems with saying “but that’s what my guy would do,” when someone says those words, you should take a hard look at whether it really is.

There’s no limit to the number of ways system could possibly deal with the problem. One is to break the one-to-one relationship of player decisions to character decisions, by allowing more than one real brain a shot at driving the fictional brain. If the incentives are lined up properly, this could do the trick, but I don’t know of a good specific example. Another possibility is to enforce a fiction change that lines up with the desired player experience – a recent conversation between Vincent Baker and E6 author Ryan Stoughton speculates on a game in which players’ characters are recast as robots who have certain rigid programs that take over for their free will. I’m unconvinced that particular game would put the players’ felt experience in the exact right place, but it’s an elegant attempt.

The most historically popular option, though, is so-called “social mechanics,” meant to handle things like mental stress and manipulation. Social mechanics have a reputation amongst RPGers for not doing these things all that well (often because they’re modeled closely on the main historical lineage of RPG mechanics, those derived from wargaming). The last ten or so years of design have produced systems that do the job better, as well as systems that dodge the question entirely by operating on a much less character-viewpoint-identified level – asking players to think from time to time like authors or directors, as well as like their characters. Entrenched roleplayers are famously resistant to either approach, often advocating instead that we “just roleplay it out.”

One thing’s for sure, though: these players are not wrong when they call social mechanics “mind control.” They’re just wrong that their own minds aren’t being involuntarily controlled all the time.

But if you don’t like the two choices in my boldfaced rule above, there’s actually a totally viable third one: simply accepting that some of the character decisions in your game are going to cause problems, and that it isn’t such a big deal to go on with your game knowing the problems are there. Being wrong isn’t the worst thing that can happen to you, and if your game is fun overall, then you should enjoy it. Accounting for taste is the job of aesthetic theory, but that doesn’t mean it’s, you know, possible.

Mike Sugarbaker

But what do the rules have to say about the story?

8 min read

It might not be clear to some readers of my series on defining story games (in three parts!) just how it is that the rules of a game, of all things, are supposed to interact with an ongoing fiction. I mean, what do you do? Do you just flip a coin and say, “heads, my guy beats your guy, and tails, your guy beats my guy”? And that doesn’t even answer anything, because when do you do that, and under what circumstances? Finally, just: what’s the point of using this rule, on this story, when we could just freely make stuff up instead?

Starting with the first question: every game does it differently, that’s part of the point of having different ones. And while not all games are structured in a way that makes this plain, you could think of a story game’s rules as a set of inputs and outputs – and indeed we already have, in our loop diagrams. In the fat-green-loop variant of the diagram, input comes from the fiction-y bits, into the rules, and the rules put some specific addition or restriction back out into the fiction. (This is leaving aside a certain level of rules, implicit for long-time roleplayers, that govern the way we make stuff up: players say what their characters do, one player per character in most games, et cetera. Those rules and other implicit rules are always on. When I talk about made-up stuff and rules being separate, assume for now that I mean the diegetic content of the game versus explicit, procedural mechanical interactions.)

Helpfully for the purpose of giving you an example, a recent design trend has been back towards rules interactions that are brief, focused, and very specific about when to apply them and what goes back into the story. This trend was crystallized neatly by Apocalypse World, a game by Vincent Baker, which puts the bulk of the rule interactions players make into what it calls Moves. Here’s a sample move, from the Gunlugger character’s playbook:

Fuck this shit: name your escape route and roll+hard. On a 10+, sweet, you’re gone. On a 7–9, you can go or stay, but if you go it costs you: leave something behind, or take something with you, the MC will tell you what. On a miss, you’re caught vulnerable, half in and half out.

Now, if your game’s loop is more black than green, you want rules that let made-up stuff change play-by-the-rules in such a way that your experience of play-by-the-rules is enhanced, not diminished. This opens all sorts of questions of balance and fairness that remain challenging for designers to this day. Our primary interest here, though, in case you haven’t noticed, is fat-green-loop games. In mostly-making-stuff-up games, you (predictably) want rules that let play-by-the-rules change made-up stuff in such a way that your experience of made-up stuff is enhanced, not diminished. That’s what the above is an example of. It triggers when the character is in a specific situation (in this case, wanting out of somewhere dangerous), and it complicates that situation in certain known but flexible ways.

However, in making-stuff-up-oriented games we face the challenge that our own process of collaboration, the just-talking-to-each-other part, is in competition with the rules. More rules-oriented games don’t have this problem; when they get more and more rules-y, they just trend towards not being story games anymore. They remain story games only so long as the green loop, however small it gets, is still there; the things you get from it can’t be gotten any other way. In a fat-green-loop game, you can similarly argue that at least one tacit rule will always remain (the one that says “we’re making up a story”), but every rule that actually makes one designed story game different from another could conceivably fall away. To put it Vincent Baker’s way, if a given rule doesn’t get a given group better results for their story than “vigorous creative agreement” does, then there’s no reason for that group to use that rule. Story games have to keep justifying their existence by bringing players things that they didn’t already know they wanted.

The trick to that – that is, the aesthetic value in a given piece of game design – lies in when you decide to make the input, what the rules put back out, and in how it feels to use the rules to make that transformation. All three of those things should support the goal of play – there’s that weaselly phrase again! – to the satisfaction of the designer and the players.

So can we put this together into a nice, concise package? Here’s Baker again, who along with fellow habitual-RPG-theorist Ben Lehman has lately been doing it like this:

  1. A rule is something, spoken or unspoken, that serves to bring about and structure your play.
  2. Good rules encourage players to make interesting changes to the game’s state.
  3. “Interesting” is locally defined by play groups.

This is a bit of a change to the way RPG theory is heading. Some of you may have heard, or read, about a little thing called “GNS,” and its birthplace, a web forum called The Forge. It’s hard to separate the two, perhaps because GNS stands for three different families of “creative agenda” in RPGs that have been “observed to conflict with one another at the table,” and throughout its recently-concluded life, The Forge tended to cause conflict.

For most of the last decade-and-change, GNS – never mind what it even stands for – has been the nearest thing story games have had to a theory of aesthetics. As it turns out, though, conflict isn’t a great basis for an aesthetic theory: conflict is complicated, divisive, and utterly subject to accumulated historical accident. When you try to make it a part of your answer to “what should rules have to say to the story?” you end up getting an argument about something else most of the time. On top of that, all this theoretical work was being done on web forums, which are notoriously poor at keeping arguments under control.

(It should be said, though, that when the seeds of GNS theory were planted, the fight was kinda necessary. It was 1995 or so, and the state of the roleplaying art was a muddle. There was a new 800-pound gorilla on the block, a game called Vampire: The Masquerade, that had a bell-clear stylistic vision and did the rare trick of actually, for-reals having an effect on the larger culture outside of gaming. It produced, and its progeny continue to produce, a ton of fun play. But… its actual rules-bits did little or nothing to reinforce its style and themes, and it came right out and admitted this, pushing and popularizing the notion that satisfying story-play and the use of rules were mutually exclusive things. To get away with this trick, it had to lend weight to some long-standing fallacies like the socially suspect notion of the gamemaster as master-Svengali-storyteller. All of that was theory that needed to be destroyed for the art to move forward, and GNS helped to destroy a lot of it. So, good. GNS served its purpose, and now we have a different job that needs different tools.)

The new orientation around “interesting” is a much better foundation. In some ways, it’s a cheat – not coincidentally, the same cheat that we made in our definition of story gaming. It allows the same necessary flexibility in terms. In the name of “interesting,” you can bring in anything that shapes human attention – and if you wanted to fully understand roleplaying, you might have to bring in everything.

But as of now, we have a definition and a basic aesthetic theory. Once you have those, what do you do? You might start by making a few more specific aesthetic sub-theories, such as the one I promised you last time. After that, though, there’s also some of the more structural stuff in the Big Model, the GNS-associated theory that we should be careful not to throw out with the bathwater. So we might talk about that next. And of course, you can go play.

Mike Sugarbaker

Defining roleplaying, part 3: "impactfully" is too a word

7 min read

To recap:

  • A story game is a game which explicitly allows for players to make things up about fictional characters and events, allows whatever is made up to have a meaningful impact on the point of play, and isn’t generally intended to produce instances of play for an external audience.
  • A roleplaying game is a story game which hews more or less to the traditions that produced Dungeons and Dragons.

That story-game definition is a wordier restatement of where we left off, “a game which sanctions players to make things up, impactfully with respect to the point of play, about fictional characters and events, usually not for theatrical purposes”; despite being longer, I think the new one’s clearer. “Not for theatrical purposes” was read by some as meaning “no speaking in character,” which certainly doesn’t define story gaming; the added phrase “instances of play” means single events of a group of people playing one game one time. (We have to distinguish that from game-as-product, as in “a game of Monopoly” versus “Monopoly is a game.” Story games as products are generally intended to be seen by someone other than their creators!)

So yeah. Miss those funny circle diagrams? Me too. Fortunately, they still have a use, not as much for defining role-playing games as for talking some more about the elephant in the room: “the point of play.”

[Three loop diagrams: an inverted open loop, a 50/50 open loop, and an open loop]

There we have three roleplaying games, let’s say – that is, three different instances of play, at different tables with different groups. In the one on the right, the players spend most of their time engaging with the gamey bits – the rules – and a little bit of time inserting details that the explicit rules might suggest, but don’t codify. (The black parts of the loop represent explicit rules, and the green parts represent fictive stuff.) These diagrams are really just meant to represent that, the time spent – they aren’t meant to say that every rules interaction (every trip around a feedback loop, that is) contains some made-up stuff in it.

The game on the left is a very loose game by comparison, perhaps touching on no rules at all other than the traditional RPG structures of individual characters being controlled by individual players, and of a gamemaster who adjudicates things. The game in the middle is, well, somewhere in the middle.

As we talked about when this whole thing started, all three of these ways of playing were present at or near the inception of Dungeons and Dragons. They all remain popular today. And they remain almost entirely separate – independent cultures of play which, when their members are even aware of the other cultures at all, are often at war with one another.

For those of you whose interest in nerdfights such as these is casual at best, let me try to summarize. When your main interest in a game is to interact with its rules – you know, to play it, thinks the gamer in the rightmost loop somewhat crossly – the folks who spend time emoting and doing story stuff can be more than just annoying. For instance, when they don’t know the rules well due to lack of interest, they can become liabilities to the success of the whole party. (Also, their increasing reach for creative control over the world outside their characters can be attempts to game the system to gain unearned advantages over others.)

In the game on the left, someone who’s impatiently asking when the fight starts isn’t just being gauche (although maybe he’s that too); his interest in competition can find a home in socially corrosive story-blocking of other players. In the middle are stable hybrid cultures of play, with roughly equal but implicitly divided domains of story-stuff and gamey bits; but that may just mean there are two fronts on which a group has to defend against newcomers who “do it wrong” and spoil the fun. All three types of games are roleplaying games, clearly, but the presence of someone who expects one type at a table full of people who expect another is a ticking time bomb. You’ve got to defend your group against it somehow, when the geek social fallacies may mean you’re stuck with a problem player for good. Let all this brew for 40 years or so, aided by poor critical language and lots of hurt feelings on all sides, and you have a full-blown culture war.

Definitions are powerful weapons in a culture war. Small wonder that people have reached quite innocently for “That’s not roleplaying!” when a tactic at the table or a proposal in conversation went against their long-unexamined assumptions about what they’d invested so much of themselves in, or made them feel threatened socially. But when the work of definition has gotten more deliberate, there has sometimes been nothing innocent about it. Game designer John Wick, long a fan of controversy, once deliberately put forth a definition of the term “roleplaying game” that didn’t include D&D – which at least tells you how seriously he took his chances of success at propagating a definition.

This is why it’s so important to have that morass of vagueness, “a meaningful impact on the point of play,” in our definition of story gaming. I want all three of those tables to see what they do reflected in that sentence and respected there. Plus I don’t just want to respect the current field; I want respect for new ideas to be built into the foundation of this emerging medium. If that means that “story game” as a category becomes a kind of catch-all for wildly different kinds of play, united only by the act of making up some fiction, that is okay by me. (I even waffled a little on the “fiction” part – early drafts of my definition didn’t have it and got kicked around on Twitter as covering games like the abstracted legal system Nomic, the great abstract puzzle game Zendo, and the notorious hidden-information exercise Mao. They seemed clearly not story games… at first. The more I thought about it, though, the more I thought they could be story games for telling very abstract non-fiction stories – in Mao’s case, stories about how much you hate your friends. Anyway, I decided not to go there explicitly.)

The danger of having such vague bits in a definition, though, is that people start making assumptions about what the vague bits mean. Those assumptions, if culturally powerful enough, can essentially replace the real definition. This has happened more than once right in front of me as I’ve watched people try to put forth critical language for story gaming – people weaponize the wiggle room to score status points, just as surely as some do with the open loop in RPGs.

Why all this ambient rancor and vindictiveness in the hobby? I believe it all begins with how games shape our brains early in life, the ways RPGs don’t fit into those shapes, and a few other facts of neurology for which RPG players rarely account (plus some lines of research that RPGs are actually out in front of). But that’s a whole other series of posts, even more arduous to write, and more shakily founded, than this one has been. As for definitions, though, we’re out in three. Whew!

Mike Sugarbaker

Comments must die

5 min read

I’ve been blogging more lately, and getting some response – not here on my own domain, but on the various social networks on which I’ve been posting notices. (That’s Twitter and, to my occasional chagrin, Google+.) Now, it’d be awesome if more people owned and relied upon their own web sites to host content, it’s true. And you can only trust the cloud so far, so I’m definitely going to strive to keep and host my own copies of any really awesome things that come out of discussions.

But I am not bothered by not having local comments on Gibberish. I don’t miss them at all.

Having comments on your domain means turning it into a nightclub that takes considerable management. Not helping matters is the inherent disembodiment of an anonymous, textual medium – imagine if the hecklers in your nightclub were ghosts. The natural inhibitors that come with being physically present aren’t there to moderate people, so the moderating tactics we’re left with are more blunt, therefore more damaging to the remaining human conversation. The worst of both worlds, fighting each other. (And again… lessons here for RPGs, where everyone’s acting in an ideaspace as a fictional self that isn’t limited by the things that happen to brains when people are physically present.)

As Neven Mrgan writes in the afore-linked, some of us just want to write. Others of us just want to pretend we live in a world without comments, and have access to the tools to make that possible by hiding them all. But is that enough? We can buy ourselves distance from each other with technical capital, just like how people with monetary capital can buy houses on far-off estates; but where’s our responsibility to each other in that picture?

Brian Eno said a while back that “a more connected world is a more vulnerable world,” and he predicted that soon our societies would start to shy away from the trend of more and more connection. I think we’re at a moment where we can and must choose – voluntarily, for a goddamn change – to pull back from a social extreme and therefore stay accessible to people who aren’t a certain flavor of social extremist (see also: grognard capture).

As web analyst Paul Ford writes, the central question of the web has become “Why wasn’t I consulted?” And sometimes that’s great. But I have come to believe that other times, this harms the larger culture. I’d like to see an online world that has a wider range of answers to that central question – not just always “oh, my bad, here’s your comments thread.” Would that online world have more elitism in it? Yes, it would. It would have more of all the things that are in the real social world. But it would also remain the online world: it would connect people who otherwise aren’t connected, and thus would retain more anti-elitism than the real world. Not every point on the web needs to be a customer-service desk for that to be true. Every domain should have a customer-service desk, probably, but not everything should be one.

For example, the web page for a New York Times article should not be conferring any kind of status on the comment of just anybody who wanders by. Whatever you think of the NYT’s ideas about its own status, it does need to have those ideas, and use them. That’s what being the Times is. Lots of newspapers, and other publishers per Ford’s piece, feel a bit put upon by web culture because they’ve reified comments rather than seeing them as an instrument that has a necessary but specific use. Someone convinced them it’s not a web site unless a thread of comments is trailing off everything, like drool; they need to snap out of the trance.

Another example is feminist and anti-racist blogs, where a measure of what might look like elitism – shutting out a voice that’s popular – would help protect voices that don’t have power in the world. (That said, I can understand someone whose blog gets them regular death threats via email wanting to provide comments as a kind of pressure valve for the hate.)

Must comments die everywhere? No – I was just trolling you (another often-harmful social fact of the world, just like elitism, that must be managed for the greater good, not abolished). But no matter where or how you put things on the web, I urge you to change your comment policy this year. You don’t have to block all comments (if your blog-host-or-whatever leaves you that as the only option, I encourage you to change hosts), nor even make your policy more restrictive – just try a new form.

Mike Sugarbaker

Hourly Comics Day, February 1, 2012

1 min read

I did them. Here goes:

I did some cheating (because that’s how you win) – you can see some Photoshop stitchery going on, and that all happened today rather than yesterday.

If you’re unfamiliar with hourly comics, here.

Mike Sugarbaker

Reconsidering the open loop: more on defining role-playing games

8 min read

I want to expand the definition of “story game” I settled on in our last episode. (For those just joining us, my definition of “story game” underpins a definition of “role-playing game,” one of which has been missing for 35 years and would be culturally advantageo– oh, just go read the post.) I’m rather pleased with the new one; it’s weaselly in all the right ways. Here goes:

  • A story game is a game which sanctions players to make things up, impactfully with respect to the point of play, about fictional characters and events, usually not for theatrical purposes.

Why the change? Well: when Scott McCloud put forth his definition of the medium of comics in Understanding Comics, he went to considerable trouble to, as he put it, “not be so broad as to include anything which is clearly not comics.” It may be that we’re always going to have a lot of trouble doing this with story games, because they’re simply a lot more complex than comics are. In a medium made as much (or more) out of people’s minds and interpretations as out of the artifact that’s been put down on paper, maybe there’s no such thing as “clearly” or “clearly not.” On top of that, the various ways that RPG culture has built fractiousness right in from the beginning have made it even harder to choose where to draw the line. Everyone goes with their own gut, based on their own gaming experience, when choosing what needs to be included in the definition; gamer guts tend to diverge (write your own medieval-combat joke); and taking the sum of people’s guts is as unproductive as it is unfeasible.

That said, my own gut is giving me some trouble with so-called “parlor narration games.” I’ve never been happy with the term (is there any usage left of the word “parlor” that isn’t pejorative?), but it refers to games wherein rules can insert things into made-up stuff, but not so much the other way around. The common belief amongst role-playing theorists is that a game with too many parlor-narration mechanics is not a role-playing game (or a story game, by my definitions). I find myself wondering whether this distinction is productive. It feels a bit like letting a piece of historical accident into what’s supposed to be a picture of the essence; like a failure to fully separate form from content.

The argument for it: without it, we have the Monopoly problem back. Recall our game of Monopoly where we make up stories about our pawns, never letting those stories affect the rules, represented by the “spiky closed loop”:

[Diagram: closed loop with offshoot]

When we play this way, are we playing a story game? If we, a lone group of players, are playing one, does that mean Monopoly is one, in the general case? If so, doesn’t our definition potentially include all games? If we’re playing a game that does have an open loop but we don’t happen to use that loop at all in our one instance of the game, is it still a story game? What if most groups who play the game don’t make any use of a provided open loop, but a few do? Requiring the open loop cleans up this mess.

The argument against the parlor-narration distinction, though, is that it creates a new category of game that isn’t useful for any purpose other than helping story gamers point at something they don’t like (and therefore becomes just another definition-as-rhetorical-stick, which roleplayers don’t need any help creating). There isn’t any significant group of people looking for new parlor-narration games to play, except possibly fans of long-form improv theater – and even then, you could say that the play of long improv games affects decisions made during the thin sliver of game time that interacts with explicit rules. (This gives our definition of story games a wee problem, which you can already see we patched.) And very few people set out to create a parlor-narration game in particular; most are intended as story games but dismissed as such by a critic.

Over in the neighboring geek culture of science fiction literature, there still isn’t quite an agreed-upon definition of what’s necessary and sufficient in a piece of fiction for it to be called SF. Consensus amongst the serious thinkers, though, seems to be that the best move is to punt, mostly not defining it at all except as whatever we all say it is, but intriguingly sometimes making noises about defining the medium as “a way of reading.” (I apologize for not being able to find my reference here.) What if we did the latter for role-playing, calling the play of Monopoly with lots of in-character dialogue a story game? What if we let the spiky closed loops into the tent?

Well… why not? Why not make the tent as big as possible? Geek cultures too often try to tighten and restrict their definition of “the real thing” when they start declining as a culture, and that strictness just leads to further decline, in a particular mode known as grognard capture. (Mainstream comics arguably succumbed to this, but the clearest example in geek culture is in heavy metal – and yes, that is a geek culture). So, instead of tightening things up, why not get downright imperialist? “Those people writing to each other on forums as Harry Potter characters? That’s us. All those people playing Mass Effect and Skyrim and World of WarCraft? That’s us too.” (I’ve borrowed this latter idea from somewhere, which I again can’t find; sorry.)

Of course, if we do this, we cross the line from building a definition as a tool for thinking, into the realm of building a definition for political purposes… which is unavoidable, to a point. And it’s a point we’re already past: I talked in the prior post about my political goals for having a definition at all, so hey. Just to cover some more political bases, I don’t actually want to embrace console RPGs and MMOs all the way; it strikes me as important that computers don’t know how to care what human players think is the point of play. This new cut of the definition could be seen as halfway letting CRPGs in, possibly, maybe. But I’m not interested in that direction.

So to recap, the full set:

  • A story game is a game which sanctions players to make things up, impactfully with respect to the point of play, about fictional characters and events, usually not for theatrical purposes.
  • A role-playing game is a story game which hews more or less to the traditions that stem from American skirmish wargaming.

There’s a ton of information, and wiggle room, encoded in “the point of play” (to say nothing of the word “game,” but I’ll leave that one for the philosophers). We’ll talk some more about that. And yes, I do think it’s basically a rephrasing of the previous definition, one that allows for games that don’t incorporate player input the way we might like. They might be bad story games, but they are still story games – we don’t need another category. The separate definition of RPGs is still mostly a political gambit, and I have more to say about that too. (Note also that those are “the traditions” – plural – “that stem from American skirmish wargaming.”)

Fewer diagrams are necessary now, which is also a plus in my book. (Fans of diagrams would do well to check out the work of Vincent Baker, ace designer and about 90% of the vanguard of role-playing theory for the last three years, on “dice and clouds” as a more detailed way of looking at the interaction of rules with the imagined space in story games.)
