 # Histogram


## Definition

A graphical display of the distribution of a set of continuous data. The x-axis of the graph represents the measurement scale of the dataset. The y-axis represents the frequencies (counts) of observations falling in the different intervals. Because the measurement scale is continuous, the bars for adjacent intervals touch, unlike the separated bars of a bar chart. A histogram gives a good idea of the shape, center, and spread of a dataset.

## Examples

The histogram shows the distribution of 34 data points. The numbers above the bars represent the number of data points falling within a specific interval; thus they sum to 34, the total number of data points.
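The counting step behind such a histogram can be sketched in Python. Note that the source does not list the 34 individual values, so randomly generated data over the example's range are used purely for illustration:

```python
import random

random.seed(0)
data = [random.uniform(18.9, 21.2) for _ in range(34)]  # illustrative data only

k = 8                                # number of intervals
lo, hi = min(data), max(data)
width = (hi - lo) / k
counts = [0] * k
for x in data:
    i = min(int((x - lo) / width), k - 1)  # clamp the maximum into the last bin
    counts[i] += 1

# The bin counts always sum to the total number of data points.
assert sum(counts) == 34
```

Whatever the data, the counts partition the observations, which is why the bar labels in the example sum to 34.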

The shape of the histogram is determined by the number of intervals and their widths. Several methods and rules of thumb have been put forth to decide the widths and number of intervals:

For number of intervals:
1. Use 7 to 10 intervals.
2. Take the square root of the number of data points and round to the nearest whole number.
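The square-root rule above can be sketched as follows (a minimal sketch; the function name is our own):

```python
import math

def num_intervals(n_points):
    """Square-root rule: round sqrt(n) to the nearest whole number."""
    return round(math.sqrt(n_points))

# For the 34 data points in the example: sqrt(34) is about 5.83, which rounds to 6.
print(num_intervals(34))
```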

For interval width:
1. Compute the range = (maximum – minimum) of the dataset and divide it by the number of intervals.
2. Compute the standard deviation of the dataset and divide by 3.
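Both width rules can be sketched as short functions (the function names are our own):

```python
import statistics

def width_from_range(data, k):
    """Rule 1: the range (maximum - minimum) divided by the number of intervals k."""
    return (max(data) - min(data)) / k

def width_from_stdev(data):
    """Rule 2: the sample standard deviation divided by 3."""
    return statistics.stdev(data) / 3
```

The two rules generally give different widths; either is only a starting point, to be adjusted so the resulting histogram reads well.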

For the example we chose 8 intervals and width = range/8 = (21.2 − 18.9)/8 = 0.2875, rounded up to 0.3.
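The bin edges implied by this choice can be computed directly (a sketch using only the minimum 18.9, maximum 21.2, and width 0.3 from the example):

```python
lo, hi, k = 18.9, 21.2, 8
width = round((hi - lo) / k, 4)    # 0.2875, rounded up to 0.3 in the text

# Rounding the width up to 0.3 means 8 bins span 18.9 to 21.3,
# so the last bin still covers the maximum value 21.2.
edges = [round(lo + i * 0.3, 1) for i in range(k + 1)]
print(width)
print(edges)
```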