The quality of consistency is directly related to Bertin’s (1967) principle of the *single image*. Abiding by the quality of consistency supports the most important quality in data graphs: decoding accuracy.

All encoding tools and design choices must be applied consistently within the graph and across comparable visuals, so that any variation in design reflects variation in the data. Let me emphasize that:

*Variation in design must reflect variation in the data*

For example, a simple web search for “bar chart” turns up numerous examples of charts with multi-coloured bars such as this one:

This design approach violates the quality of encoding consistency because the variation in design (i.e., the variation in colours) does not encode any variation in the data. The variation in the categories is already encoded by the *x*-axis labels, i.e. Alpha, Beta, Gamma, Delta, Epsilon. The colours are completely superfluous and merely confuse visual perception.
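The consistent alternative is easy to produce. The sketch below, assuming matplotlib is available and using made-up values for the Alpha–Epsilon categories, draws the bars in a single colour so that the only design variation left (bar height) reflects the only data variation (the values):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; we only save to file
import matplotlib.pyplot as plt

categories = ["Alpha", "Beta", "Gamma", "Delta", "Epsilon"]
values = [23, 17, 35, 29, 12]  # illustrative values, not from the original chart

fig, ax = plt.subplots()
# One colour for all bars: category identity is already encoded by the
# x-axis labels, so varying colour would add design variation with no
# corresponding variation in the data.
bars = ax.bar(categories, values, color="steelblue")
ax.set_ylabel("Value")
fig.savefig("consistent_bars.png")
```

If the categories *did* belong to some higher-level grouping worth encoding, colour would then be a legitimate channel, because the variation in design would once again reflect variation in the data.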

Here is a famous example with multiple blatant violations of consistency:

Believe it or not, this graph encodes only 5 values (proportions) over 5 categories (years). The violations in consistency are the following: (*i*) the graph encodes both the proportion *p* and its complement 1 − *p*, thus suggesting that there are 10 values and not just 5; (*ii*) the graph encodes four colours for no apparent reason, thus suggesting that there are four categories of some sort; (*iii*) the graph encodes a three-dimensional volume visual implantation, thus suggesting that there are three variables in the graph whereas there are only two. The only excuse for the dismal quality of this graph is that it was done in the tumultuous 1970s.
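Violation (*i*) can be made concrete with a few lines of code. The sketch below uses hypothetical proportions (the original graphic’s exact numbers are not reproduced here) to show that the complements carry zero additional information: each one is fully determined by its *p*, so only five independent values are encoded, not ten:

```python
# Hypothetical proportions for five years (illustrative values only,
# not the numbers from the original 1970s graphic).
p = {1966: 0.62, 1968: 0.66, 1970: 0.70, 1972: 0.72, 1974: 0.76}

# Plotting 1 - p alongside p doubles the marks on the page without
# adding information: every complement is fully determined by p.
complements = {year: 1 - share for year, share in p.items()}

for year in p:
    assert abs(p[year] + complements[year] - 1.0) < 1e-9

# Only len(p) == 5 independent numbers are actually encoded.
print(len(p))
```

The same redundancy argument applies to any stacked display of a proportion and its complement: the second segment is layout, not data.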

Edward Tufte (1983, p. 118) describes the above as possibly the worst graphic ever made. In his words:

“*A series of weird three-dimensional displays appearing in the magazine of American Education in the 1970s delighted the connoisseurs of the graphical preposterous. Here five colors report, almost by happenstance, only 5 pieces of data (since the division within each adds to 100%). This may well be the worst graphic ever to find its way into print*”

As another example of a low-consistency data graph, consider my analysis of the semiconductor market share. That graph ranks right up there among the most inconsistent graphs ever made.

Consistency also demands the application of standards of practice as benchmark templates, and any deviation from a standard should be clearly identified in the graph. For example, increments of time should always run from left to right, and causal relations should always show the cause on the *x*-axis and the effect on the *y*-axis.
