Understanding Bounded Accuracy (5e Guideline)

What is Bounded Accuracy?

Bounded accuracy is a fundamental design philosophy underlying the mathematics behind attack roll hit probability in the core rules. It is notable in D&D history as one of the few times the developers were publicly vocal about their design standards, going so far as to give the approach a name and explain it in official statements. This is a big departure from the typically secretive or silent R&D of past editions.

The Developer's Own Words
Conventional D&D wisdom tells us that the maxim "the numbers go up" is an inherent part of the class and level progression in D&D. While that might be true, in the next iteration of the game we're experimenting with something we call the bounded accuracy system.

The basic premise behind the bounded accuracy system is simple: we make no assumptions on the DM's side of the game that the player's attack and spell accuracy, or their defenses, increase as a result of gaining levels. Instead, we represent the difference in characters of various levels primarily through their hit points, the amount of damage they deal, and the various new abilities they have gained. Characters can fight tougher monsters not because they can finally hit them, but because their damage is sufficient to take a significant chunk out of the monster's hit points; likewise, the character can now stand up to a few hits from that monster without being killed easily, thanks to the character's increased hit points. Furthermore, gaining levels grants the characters new capabilities, which go much farther toward making your character feel different than simple numerical increases.

Now, note that I said that we make no assumptions on the DM's side of the game about increased accuracy and defenses. This does not mean that the players do not gain bonuses to accuracy and defenses. It does mean, however, that we do not need to make sure that characters advance on a set schedule, and we can let each class advance at its own appropriate pace. Thus, wizards don't have to gain a +10 bonus to weapon attack rolls just for reaching a higher level in order to keep participating; if wizards never gain an accuracy bonus, they can still contribute just fine to the ongoing play experience.

This extends beyond simple attacks and damage. We also make the same assumptions about character ability modifiers and skill bonuses. Thus, our expected DCs do not scale automatically with level, and instead a DC is left to represent the fixed value of the difficulty of some task, not the difficulty of the task relative to level.

We think the bounded accuracy system is good for the game for a number of different reasons, including the following:

Getting better at something means actually getting better at something. Since target numbers (DCs for checks, AC, and so on) and monster accuracy don't scale with level, gaining a +1 bonus means you are actually 5% better at succeeding at that task, not simply hitting some basic competence level. When a fighter gets a +1 increase to his or her attack bonus, it means he or she hits monsters across the board 5% more often. This means that characters, as they gain levels, see a tangible increase in their competence, not just in being able to accomplish more amazing things, but also in how often they succeed at tasks they perform regularly.

Nonspecialized characters can more easily participate in many scenes. While it's true that increases in accuracy are real and tangible, it also means that characters can achieve a basic level of competence just through how players assign their ability bonuses. Although a character who gains a +6 bonus to checks made to hide might do so with incredible ease, the character with only a naked ability bonus still has a chance to participate. We want to use the system to make it so that specialized characters find tasks increasingly trivial, while other characters can still make attempts without feeling they are wasting their time.

The DM's monster roster expands, never contracts. Although low-level characters probably don't stack up well against higher-level monsters, thanks to the high hit points and high damage numbers of those monsters, as the characters gain levels, the lower-level monsters continue to be useful to the DM, just in greater numbers. While we might fight only four goblins at a time at 1st level, we might take on twelve of them at 5th level without breaking a sweat. Since the monsters don't lose the ability to hit the player characters—instead they take out a smaller percentage chunk of the characters' hit points—the DM can continue to increase the number of monsters instead of needing to design or find whole new monsters. Thus, the repertoire of monsters available for DMs to use in an adventure only increases over time, as new monsters become acceptable challenges and old monsters simply need to have their quantity increased.

Bounded accuracy makes it easier to DM and easier to adjudicate improvised scenes. After a short period of DMing, DMs should gain a clear sense of how to assign DCs to various tasks. If the DM knows that for most characters a DC of 15 is a mildly difficult check, then the DM starts to associate DC values with in-world difficulties. Thus, when it comes time to improvise, a link has been created between the difficulty of the challenge in the world (balancing as you run across this rickety bridge is pretty tough due to the breaking planks, especially if you're not a nimble character) and the target number. Since those target numbers don't change, the longer a DM runs his or her game, the easier it is going to be to set quick target numbers, improvise monster attack bonuses and AC, or determine just what kind of bonus a skilled NPC has to a particular check. The DM's understanding of how difficult tasks are ceases to be a moving target under a bounded accuracy system.

It opens up new possibilities of encounter and adventure design. A 1st-level character might not fight the black dragon plaguing the town in a face-to-face fight and expect to survive. But if they rally the town to their side, outfit the guards with bows and arrows, and whittle the dragon down with dozens of attacks instead of only four or five, the possibilities grow. With the bounded accuracy system, lower-level creatures banding together can erode a higher-level creature's hit points, which cuts both ways; now, fights involving hordes of orcs against the higher-level party can be threatening using only the basic orc stat block, and the city militia can still battle against the fire giants rampaging at the gates without having to inflate the statistics of the city guards to make that possible.

It is easier for players and DMs to understand the relative strength and difficulty of things. Under the bounded accuracy system, a DM can describe a hobgoblin wearing chainmail, and, no matter what the level of the characters, a player can reasonably guess that the hobgoblin's AC is around 15; the description of the world matches up to mechanical expectations, and eventually players will see chainmail, or leather armor, or plate mail in game and have an instinctive response to how tough things are. Likewise, a DM knows that he or she can reasonably expect players to understand the difficulty of things based purely on their in-world description, and so the DM can focus more on the details of the world rather than on setting player expectations.

It's good for verisimilitude. The bounded accuracy system lets us perpetually associate difficulty numbers with certain tasks based on what they are in the world, without the need to constantly escalate the story behind those tasks. For example, we can say that breaking down an iron-banded wooden door is a DC 17 check, and that can live in the game no matter what level the players are. There's no need to constantly escalate the in-world descriptions to match a growing DC; an iron-banded door is just as tough to break down at 20th level as it was at 1st, and it might still be a challenge for a party consisting of heroes without great Strength scores. There's no need to make it a solid adamantine door encrusted with ancient runes just to make it a moderate challenge for the high-level characters. Instead, we let that adamantine door encrusted with ancient runes have its own high DC as a reflection of its difficulty in the world. If players have the means of breaking down the super difficult adamantine door, it's because they pursued player options that make that so, and it is not simply a side effect of continuing to adventure.

This feeds in with the earlier point about DMs and players understanding the relative strengths and weaknesses of things, since it not only makes it easier to understand play expectations, but it also ties those expectations very firmly to what those things are in the world. Now, we want to avoid situations where DMs feel bound by the numbers. ("Hey," says the player, "you said it was an iron-bound wooden door and I rolled a 17, what do you mean I didn't break it down?") We hope to do that by making sure we focus more on teaching DMs how to determine DCs and other numbers, and letting them adjust descriptions and difficulties based on their needs.

—Rodney Thompson, D&D designer. Originally posted at http://dnd.wizards.com/go/article.aspx?x=dnd/4ll/20120604; the original has since been taken down, but an archived copy of the post is still available.
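
To put the designer's "5% better" point in concrete terms, here is a minimal sketch of the underlying d20 arithmetic. The DC of 15 is purely an illustrative number, not an official value:

```python
# Minimal sketch of the d20 math behind "each +1 bonus is 5% better".
# The target number of 15 below is purely illustrative.

def success_chance(bonus: int, target: int) -> float:
    """Chance that d20 + bonus meets or beats the target."""
    # Count the d20 faces (1-20) that succeed after adding the bonus.
    winning_faces = sum(1 for roll in range(1, 21) if roll + bonus >= target)
    return winning_faces / 20

if __name__ == "__main__":
    target = 15
    for bonus in range(0, 6):
        print(f"+{bonus} vs DC {target}: {success_chance(bonus, target):.0%}")
    # Each +1 turns exactly one more face of the d20 into a success, a flat 5%
    # of absolute probability, because the target number does not scale with level.
```

Note that the flat 5% per +1 only holds while success still depends on the die, which is exactly the range bounded accuracy tries to keep the game in.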

What Bounded Accuracy is Not

There are a lot of misconceptions about BA, so let's just squash them all, right here and now.

  • BA has nothing to do with checks of any sort. The limits that BA imposes on ability modifiers alter the standard DC range, but this is an unintended side effect.
  • BA has nothing to do with saves of any sort. Because saves are just a keyworded check variant in 5e, they are affected in exactly the same incidental way.
  • BA has nothing to do with damage. It incidentally has implications for how damage winds up being delivered, because the same modifiers apply, and because it alters the hit frequency for attacks.
  • BA is not about reducing the power of character level. Level is still king. It does increase how long a lower-level character or lower-CR monster will last against a character of a given level.
  • BA is not about increasing the difficulty of lower CR enemies. Rather, it allows lower CR enemies to still pose some degree of genuine threat, however small, to a PC of any level, and likewise a PC against a high CR monster.
  • BA is not intended to alter the overall difficulty or risk of the game. Ultimately, how difficult the game is depends entirely on what the DM decides to throw at the players. BA just makes that job a lot easier by giving the DM a wider range of options for achieving a given threat level.

What Does Bounded Accuracy Do?

To understand the real value of Bounded Accuracy, you need to understand how attack hit probability worked in previous editions and the problems those older systems had, because BA is a solution to those problems. What follows is a stroll through RPG design theory and D&D history.

If you don't care about older editions of D&D, BA will have little meaning to you. It's just the way 5e works.

If you don't care about game design, BA has no meaning for you. (If you contribute to this wiki, you are almost certainly engaged in game design already, whether you realize it or not.)

If you are not, have never been, and do not plan to ever be a DM, it is unlikely that BA will be useful to you at all.

BA is important for anyone who is creating content for 5e or has interest in game design philosophies.

Before Positive AC: To-Hit Charts & THAC0 (Original-2.5e)

Back in the 1970s, D&D was essentially the man-to-man rules of Chainmail (a miniatures wargame) with Outdoor Survival (a board game) filling in the gaps. It was a homebrew system invented by a bunch of young adults for fun. Combat was based on d6 rolls and required chart references. Compared to today's engines, it was brutally slow and arbitrarily convoluted. The odds were also almost always heavily stacked against the players; every attack was a gamble.

Based on the patterns and terminology from those charts, people started talking about success in terms of how hard it was to hit an armor class (the charts' term for an arbitrary defensive rating) relative to a value of 0. From that, we got the phrase "to hit armor class 0", which was abbreviated as "THAC0".

When AD&D adopted THAC0 directly as a formula rather than relying on whole tables, the learning curve was very steep. The main point of confusion: a lower armor class meant you got hit less often, but rolling high increased your chance of hitting a target, and THAC0 bonuses were negative. Everything in the game ran on positive progression except THAC0, which ran backwards. Needless to say, it was counterintuitive. Later editions did a better job of explaining it, and by 2.5 they were releasing content with math balanced for playability rather than arbitrary simulation, but years of arbitrary content were already in circulation.

Problems aside, THAC0 was actually a pretty good system if you could wrap your head around it. It tended to use smaller numbers, so people with a good grasp of basic number manipulation often preferred it over the later positive-AC systems developed to increase accessibility. Armor class was class-dependent, based on the types of armor you were allowed to wear, and had nothing to do with your level; it was a direct representation of how hard it was to hurt a creature. However, the scale had a downside: as PCs gained levels, their THAC0 improved, letting them hit all targets more easily and eventually rendering attack rolls a formality, because basically nothing could avoid being hit. This started an arms race in supplemental content, where the armor classes of truly awe-inspiring monsters plummeted into the negatives. This early arms race was the root of the assumptions that wound up being built into the next two editions of the game.

D&D was bought by Wizards of the Coast, which put together a new development team to make a more intuitive, accessible edition of the classic game. They wanted to make the game more profitable, and that meant appealing to the anti-THAC0 audience.

Positive AC: Rise & Crash (3.5e-4e)

For 3e they overhauled combat. In particular, they made it an intuitive system by having armor class, or AC, act as a stand-alone value: the higher it got, the better it was. They did this by simplifying the attack process to rolling a die, applying modifiers, and checking whether the result met or beat the target's AC. This mechanic became the "core mechanic" for overcoming any and every obstacle in the game, with target numbers for tasks other than attacks being called "difficulty classes", or DCs, instead of ACs.

That sounds good! And it was good! To this day, people play and stand behind 3.5e, and its spinoff Pathfinder, as being the golden ideal of D&D.

...It is not without its flaws.

In 3.5e and 4e D&D, they accidentally chose numbers for their content which generated what came to be known as the "Treadmill" effect. How you feel about the treadmill depends on how you answer the following question:

Should a random nobody mook have a chance of stabbing the legendary demigod hero of the universe, even if the damage would be negligible?

If you said no, stop reading right now and go back to playing 3.5e, because 5e says, "yes he should".

See, back in 3.5e and 4e, AC was tied directly to a creature's level or challenge. That meant that, as you gained levels, your AC generally went up. This on its own is not problematic. The problem is that ACs went up so high, and so quickly, that the attack bonuses of lower level/challenge creatures became meaningless. So, as you gained levels, you would "graduate" from killing lesser monsters to killing more powerful monsters. This restricted the DM to a narrow band of monsters that could threaten the players, because anything below that band needed to roll a critical to even land a hit, and anything above that band could one-shot any party member and walk away untouched. Monsters and PCs had a sort of implicit "must-be-this-tall-to-ride" sign attached to them in the form of AC.

The general effect of this was twofold. First, it forced developers to make a range of increasingly powerful versions of each monster in order to keep it useful at different tiers of play. This meant a great deal of work had to go into developing, testing, and publishing heaps of redundant variant content, just so some guy could run a whole campaign about one monster type. Second, it made all those increasing numbers meaningless, because players never encountered anything outside of their tier. Effectively, all it did was change the names of the monsters they were fighting. In essence, it gave the impression that the world "leveled up with you", because there was absolutely no reason to use content that would auto-win or auto-fail against the players.

Add on top of this a decade of supplemental material that allowed players to get AC scores in the hundreds, and the math underlying a d20-based system melts down into useless nonsense.
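
To see why that band forms, here is an illustrative sketch. The numbers below are invented to mimic a treadmill-style progression; they are not actual 3.5e or 4e statistics:

```python
# Illustrative only: invented, treadmill-style numbers showing how a fixed low-level
# attacker fades into irrelevance as PC armor class climbs with level.

def hit_chance(attack_bonus: int, armor_class: int) -> float:
    """Chance that d20 + attack_bonus meets or beats armor_class.
    A natural 20 always hits and a natural 1 always misses, so clamp to [0.05, 0.95]."""
    raw = (21 + attack_bonus - armor_class) / 20
    return min(max(raw, 0.05), 0.95)

MOOK_ATTACK_BONUS = 3  # a hypothetical low-level attacker, fixed for its whole career

# Hypothetical, steadily inflating PC armor classes under a treadmill system.
for level, pc_ac in [(1, 16), (5, 21), (10, 27), (15, 33), (20, 40)]:
    chance = hit_chance(MOOK_ATTACK_BONUS, pc_ac)
    print(f"PC level {level:2d}, AC {pc_ac}: mook hits {chance:.0%} of the time")
# Once only a natural 20 connects (and the damage is trivial anyway),
# the mook has effectively dropped out of the game.
```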

4e attempted to resolve this issue by building the treadmill into the game as a strict system. Everything in the game revolved around and supported this treadmill. This highly structured and formulaic approach was widely criticized in the hobby community as artificial and arbitrary. In other words, it feels like a combat simulation game rather than a game which can simulate combat. (It should be noted that these criticisms run rather contrary to the roots of D&D. It evolved out of wargaming, which is pure combat simulation. In this sense, 4e actually got closer to the roots of D&D than any edition since the original publications under TSR!) The main problem with 4e, though, is that instead of eliminating the problems associated with the treadmill, it codified them. As a consequence, 4e was not the success WotC hoped for. It has its market, but it also has a reputation.

WotC began a public playtesting and feedback program called D&D Next to develop a new edition of D&D, designed to cater to the entire audience (partly as an apology for the closed-door development of 4e).

Bounded AC (5e)

When they began developing 5th edition, the treadmill was the top item on the list of things which had to go.

First, they abandoned the magic item economy. That economy was an effect of developers assuming players would have magic items providing a minimum bonus at given levels and preemptively building those bonuses into monster ACs to keep pace. It made magic items worthless, because players could only use them for a short time before being forced to upgrade, and it also made them mandatory, because you couldn't function without them. It forced DMs to plant a regular progression of magic gear as rewards during play, regardless of how shoehorned-in it became. The magic item economy was the main driving force behind the treadmill. Instead, 5e monsters would be built on the assumption that players do not have any magic items.

Second, they decided that the total flat bonus a player could receive on a check should not exceed the value of one whole die. (Anything more than that, and you get an eternal, arbitrary arms race of increasing values: the "treadmill" of past editions.) In other words, +20 is the theoretical desired limit of all combined bonuses to an attack roll. There is some debate, but it appears that, by core rules only, the highest check result possible is 47, a bonus of +27; it requires a lot of fiddly build options the developers probably hadn't anticipated, plus a favorable circumstantial situation, and it is not applicable to attack rolls. In other words, they did a good job of staying within that limit. Generally, nobody will ever be able to roll higher than 31 for an attack, check, or save without magic items.

Third, they decided how that maximum bonus would be apportioned among standardized sources. In general, these are the only sources of bonuses to an attack, check, or save: the ability score modifier (maximum of +5, attainable even at first level), the proficiency bonus (minimum of +2, maximum of +6, growing slowly with level), and the magic gear bonus (maximum of +3, but it's unlikely you'll ever even see a +1). This gives, under optimal conditions and without feature intervention, a maximum roll result of 34 with magic.
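
As a quick check of the arithmetic above, the sketch below adds up those caps; the constants are the figures cited in this article, not quotations from the rulebooks:

```python
# Adding up the bonus budget described above. The caps are this article's figures.

D20_MAX = 20
ABILITY_MOD_CAP = 5    # from an ability score of 20
PROFICIENCY_CAP = 6    # the bonus at levels 17-20
MAGIC_GEAR_CAP = 3     # e.g. a +3 weapon, rarely seen in practice

max_bonus_without_magic = ABILITY_MOD_CAP + PROFICIENCY_CAP          # +11
max_bonus_with_magic = max_bonus_without_magic + MAGIC_GEAR_CAP      # +14

print("Highest routine roll without magic:", D20_MAX + max_bonus_without_magic)  # 31
print("Highest routine roll with magic:   ", D20_MAX + max_bonus_with_magic)     # 34
# Either way, the combined flat bonus stays well under the "one whole die" ceiling of +20.
```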

Fourth, they made sure that PC ACs could not exceed 21, and that monster ACs would not exceed 31. (See how that 31 lines up perfectly with the maximum possible roll result without magic? Notice also that the maximum PC AC is 10 less than the maximum monster AC, a full half-die lower.)

Notice that most of this doesn't actually put limits on players; it puts limits on the developers when designing content the players can use. The standardization of player attack bonuses allows the developers to anticipate the bonus range any character can put out at a given level, regardless of class. This lets them design monsters whose ACs alter the probability of a hit based on PC level: rather than probability being rapidly pushed to 0% or 100%, the monster becomes viable for use against a much wider range of PCs. By having limits to player AC that are not tied to level, they can change the hit rate for monsters by adjusting only their attack bonuses. Because the two things are no longer tied together, it is now possible to have monsters that always hit and always get hit, always hit but rarely get hit, rarely hit but always get hit, or rarely hit and rarely get hit, as well as anything between those four extremes.

Finally, the whole point of all of this was to make lesser enemies still useful in larger numbers at higher levels, and powerful enemies still survivable at lower levels. (Survivable is not the same as defeatable. TPKs still happen.) That means you no longer need special tier-balanced versions of each monster or special minion monsters; you can just use a higher CR monster to present extra challenge, or throw in a whole bunch of lower CR monsters to make up a total CR equal to one big monster.
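
Here is a sketch of that widened viability range. The monster AC and the per-level attack bonuses are illustrative assumptions (ability modifier plus proficiency, no magic), not published values:

```python
# Why one fixed stat block stays usable across the whole level range under bounded accuracy.

def hit_chance(attack_bonus: int, armor_class: int) -> float:
    """Chance that d20 + attack_bonus meets or beats armor_class (nat 20 hits, nat 1 misses)."""
    raw = (21 + attack_bonus - armor_class) / 20
    return min(max(raw, 0.05), 0.95)

MONSTER_AC = 15  # a fixed, mid-range armor class

# Assumed attack bonuses (ability modifier + proficiency only) at representative levels.
pc_attack_bonus = {1: 5, 5: 7, 11: 9, 17: 11}

for level, bonus in pc_attack_bonus.items():
    print(f"Level {level:2d} PC (+{bonus}) vs AC {MONSTER_AC}: {hit_chance(bonus, MONSTER_AC):.0%} to hit")
# The hit rate drifts from about 55% toward 85% instead of snapping to 5% or 100%,
# so the same monster stays a meaningful (if easier) obstacle at every tier.
```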

They called this design philosophy bounded accuracy (evidently none of the designers were from the marketing department) because it put boundaries on the numbers they were allowed to use when setting hit probability for monsters, by codifying total check bonus and AC limits for player content.

What Does Bounded Accuracy Mean For Me?

If you're going to be contributing content for the 5th edition of D&D, you need to be aware of the boundaries that bounded accuracy imposes on developers, as their intent is part of the precedent of 5th edition. Creating content that allows players to exceed the AC boundaries, or to get attack bonuses that greatly exceed any monster's AC, will break the game. Creating monsters with unreachable AC scores will have a similar effect. Because the core mechanic is retained in 5e and is still based on the same math as attack rolls, DCs for checks and saves generated by content also wind up restricted to within the BA limits, for the sheer practicality of actually being passable. Because BA alters the hit frequency of attacks, if you are designing new attacks for the game, you need to understand that combat effectiveness at higher levels comes from making and landing more attacks, not from scaling damage. While some things, like spells, can deal increased damage at higher levels, this is more a function of the limited action economy surrounding other action types.
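
As a practical aid, here is a rough sanity-check sketch for homebrew monster stats. The thresholds are this article's reading of the bounded accuracy limits, not an official validation rule, and the function is just a hypothetical example:

```python
# Rough sanity check for homebrew monster stats against the limits described in this article.

def check_homebrew_monster(armor_class: int, attack_bonus: int) -> list:
    """Return warnings for stats that fall outside the bounded accuracy ranges cited above."""
    MAX_MONSTER_AC = 31   # matches the best possible non-magical roll
    MAX_PC_AC = 21        # the PC armor class ceiling cited above
    warnings = []
    if armor_class > MAX_MONSTER_AC:
        warnings.append(
            f"AC {armor_class} exceeds this article's monster AC ceiling of {MAX_MONSTER_AC}; "
            "most characters will only hit it on a natural 20."
        )
    if attack_bonus + 2 >= MAX_PC_AC:
        warnings.append(
            f"Attack bonus +{attack_bonus} hits even a PC at the AC ceiling of {MAX_PC_AC} "
            "on every roll except a natural 1."
        )
    return warnings

# Example: an over-tuned homebrew dragon gets flagged on both counts.
for warning in check_homebrew_monster(armor_class=34, attack_bonus=25):
    print(warning)
```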


