About Volatility

Volatility is one of the most important attributes of any tradeable instrument. It indicates how volatile, or ‘risky’ if you will, the price is. But how do you measure it? As a matter of fact there are several ways of doing it, and sometimes people (including myself) confuse them. In this article I will present a few simple calculations, as well as a way of verifying the result.

From investopedia: “Volatility is a statistical measure of the dispersion of returns for a given security or market index. In most cases, the higher the volatility, the riskier the security. Volatility is often measured as either the standard deviation or variance between returns from that same security or market index.”

So, to reiterate: “Volatility is OFTEN measured as either the standard deviation of, or variance between returns” (my emphasis). There are other ways but the standard deviation of returns is the most common one. And standard deviation is the square root of the variance. So from now on in this article: volatility = stddev(log returns).
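Spelled out, with r_t = ln(P_t / P_(t-1)) being the log return of period t and n the number of returns, that means:

volatility = sqrt( (1/n) * sum( (r_t - mean(r))^2 ) )

(Whether you divide by n or by n - 1 is a sample-versus-population detail; numpy's np.std defaults to dividing by n, which is also what the calculations below end up doing.)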

Calculating this is easy using numerical libraries like numpy, but I will also show a manual way of doing the same.

The first solution uses Python and pandas, so a basic understanding of that library is required.

import pandas as pd
import numpy as np

Assume the data is in a pd.DataFrame, formatted like this:

Date                 Symbol  Date      Time      Open    High    Low     Close   Volume
2021-05-12 15:30:00  AAPL    20210512  15:30:00  123.30  124.64  123.30  123.93   51864
2021-05-12 15:40:00  AAPL    20210512  15:40:00  123.94  124.47  123.63  124.30   32709
2021-05-12 15:50:00  AAPL    20210512  15:50:00  124.30  124.48  123.86  123.98   25105
2021-05-12 16:00:00  AAPL    20210512  16:00:00  123.98  124.24  123.63  124.13   25553
2021-05-12 16:10:00  AAPL    20210512  16:10:00  124.12  124.13  123.58  123.72   22831
2022-05-11 21:10:00  AAPL    20220511  21:10:00  146.60  146.92  146.06  146.50   30300
2022-05-11 21:20:00  AAPL    20220511  21:20:00  146.50  146.77  146.13  146.56   27135
2022-05-11 21:30:00  AAPL    20220511  21:30:00  146.57  146.94  146.14  146.29   32458
2022-05-11 21:40:00  AAPL    20220511  21:40:00  146.28  146.63  145.92  145.95   45101
2022-05-11 21:50:00  AAPL    20220511  21:50:00  145.94  146.65  145.81  146.59   71507
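If you want to run the snippets below without my data file, a handful of rows like these are enough to mock up (just a sketch; the exact columns are approximate, and only 'Close' is actually used in the calculations):

import pandas as pd
import numpy as np

# A small mock of the bar data above; only the 'Close' column matters below.
df = pd.DataFrame({
    'Symbol': ['AAPL'] * 5,
    'Open':   [123.30, 123.94, 124.30, 123.98, 124.12],
    'High':   [124.64, 124.47, 124.48, 124.24, 124.13],
    'Low':    [123.30, 123.63, 123.86, 123.63, 123.58],
    'Close':  [123.93, 124.30, 123.98, 124.13, 123.72],
    'Volume': [51864, 32709, 25105, 25553, 22831],
}, index=pd.to_datetime(['2021-05-12 15:30', '2021-05-12 15:40',
                         '2021-05-12 15:50', '2021-05-12 16:00',
                         '2021-05-12 16:10']))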

As you can see, the data is in 10-minute bars. This adds an extra level of complexity, but as I will explain later it doesn’t change the calculations, just the scaling.

However, a number of questions arise. In this case, we have 450 rows of 10-minute bars, representing 75 hours of price movements. That is just over 9 trading days. Is this enough to calculate the volatility? What can we do with the answer? What if we are interested in daily volatility? Can we use this data then?

Well, yes and no. The rule of thumb is that volatility is proportional to the square root of time. You can calculate the daily volatility by multiplying the 10-minute volatility by the square root of 48 (8 hours of 6 bars each). You could also calculate the monthly volatility by multiplying by the square root of 1440 (30 days of 8 hours of 6 bars). It also works the other way around, by dividing by the same constant. This is explained well in this article: HOW TO CONVERT VOLATILITY FROM ANNUAL TO DAILY, WEEKLY OR MONTHLY?
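As a quick sketch of that scaling (vol_10min is just an assumed example value, not taken from the data above):

# Volatility scales with the square root of time.
vol_10min   = 0.0012                    # assumed example: stddev of 10-minute log returns
vol_daily   = vol_10min * 48 ** 0.5     # 48 = 6 bars per hour * 8 hours per day
vol_monthly = vol_10min * 1440 ** 0.5   # 1440 = 30 days * 48 bars per day
vol_back    = vol_daily / 48 ** 0.5     # dividing converts back the other way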

However, this kind of conversion comes with its own problems. If you want daily volatility, I recommend using daily bars.

Annual volatility may be the most common figure for comparisons, but what about weekly and monthly volatility? Well, I suggest you “compare apples to apples and oranges to oranges”. Keep them apart.

The manual method

We will use the ‘Close’ column, and the first step is to calculate the 1-period log return, and then the mean of that.

df['log_rtn'] = np.log(df['Close']).diff()
mean_log_rtn = df['log_rtn'].mean()

Now create another column with the “squared deviation from the mean” of each return:

df['squared_dev'] = np.square(df['log_rtn'] - mean_log_rtn)

Then you calculate the variance:

# len(df) - 1 equals the number of returns, since the first 'log_rtn' value is NaN
variance = df['squared_dev'].sum() / (len(df)-1)

And the standard deviation

stddev = np.sqrt(variance)

And then convert it to whichever period you want. Normally you talk about annual volatility, but as I said earlier, this is just the scale you use for comparison.

# annualize the 10-minute standard deviation
annual_std = stddev * (252 * 6 * 8) ** 0.5
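Put together, the whole manual method fits in a small helper function (a sketch of the steps above; the function name is my own):

def manual_volatility(df):
    # Standard deviation of the 1-period log returns of df['Close'].
    log_rtn = np.log(df['Close']).diff()            # first value is NaN
    squared_dev = np.square(log_rtn - log_rtn.mean())
    variance = squared_dev.sum() / (len(df) - 1)    # sum() skips the NaN
    return np.sqrt(variance)

stddev = manual_volatility(df)
annual_std = stddev * (252 * 6 * 8) ** 0.5          # annualized, as above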

The numpy method

There is, however, a much easier way of calculating exactly the same thing, namely:

stddev = np.std(df['log_rtn'])

Now we have the 1-period volatility. If we want to annualize it, we do this:

# annualize the 10-minute standard deviation
annual_std = stddev * (252 * 6 * 8) ** 0.5

The latter needs some explanation: we are using 10-minute bars, there are 6 of those per hour, and 8 trading hours in a day. We assume 252 trading days per year. Hence the 252 * 6 * 8.

And ** 0.5 (meaning ^0.5) is just a fancy way of writing SQRT().
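Finally, the promised way of verifying the result: the manual numbers and the numpy one-liner should agree (np.std defaults to ddof=0, so it also divides by the number of returns, and the NaN from diff() is skipped):

manual_std = np.sqrt(df['squared_dev'].sum() / (len(df) - 1))
numpy_std = np.std(df['log_rtn'])
print(manual_std, numpy_std)              # should print the same value twice
assert np.isclose(manual_std, numpy_std)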
