Savings account comparison

Let’s say you have a spare $5,000.  You want this money for travel… or a down payment on a car, or something else in the one- to two-year time frame.  What’s the best current bet for getting a little bit of growth over a year?

Mutual funds and stocks could lose value.  That’s probably not a good bet over the short term.  Bonds could work in some situations, so that may warrant investigation.  For now, I’m just going to focus on savings accounts and the tax implications.

All interest rates were captured on 3 Nov 2012.  Let’s take a look at a few options.  Let’s also have a look at tax implications for 2012.

Assumptions:

  • One year investment period
  • Marginal tax rate is 32% (22% federal, 10% provincial)
  • There is contribution room for a TFSA
  • $5000 to invest
  • There is NO inflation adjustment necessary

Some Options:

Institution                      Rate (%)   Interest ($)   Taxes ($)   Net ($)
AMA/Bridgewater Bank               1.85        92.50          29.60      62.90
Canadian Direct Financial          1.90        95.00          30.40      64.60
Canadian Direct Financial TFSA     3.00       150.00           0.00     150.00
ING Direct Savings                 1.35        67.50          21.60      45.90
RBC eSavings                       1.20        60.00          19.20      40.80
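The Net column is just interest minus tax at the assumed 32% marginal rate (zero tax inside a TFSA).  As a quick sketch:

```python
# Net one-year return on a deposit.  Interest is taxed at the assumed
# 32% marginal rate, except inside a TFSA where it is tax-free.
PRINCIPAL = 5000.00
MARGINAL_RATE = 0.32

def net_return(rate, tfsa=False):
    interest = PRINCIPAL * rate
    taxes = 0.0 if tfsa else interest * MARGINAL_RATE
    return interest - taxes

print(round(net_return(0.0185), 2))           # AMA/Bridgewater Bank
print(round(net_return(0.03, tfsa=True), 2))  # Canadian Direct Financial TFSA
```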

Discussion:

I find it odd that some institutions are offering a *higher* interest rate for TFSA savings accounts. I won’t fight it – it’s a fantastic option for those who are saving for the short term. Even as an emergency fund the TFSA offers a significant advantage over taxed savings accounts. Naturally, there are limitations placed on TFSAs and it’s important to work within those limitations to ensure that there are no service fees applied to an otherwise exceptional deal.

Why the disparity in interest rates?  I can only assume that it’s because larger banks can get away with offering lower rates and still attract customers.  Perhaps, for some, avoiding the inconvenience of dealing with multiple institutions is worth the cost of not keeping up with the rate of inflation.

Reasonable worst-case mortgage rate

Imagine that I were looking to get a mortgage.  What interest rates could I experience over the course of a 20-year mortgage?  Nobody can give me a good answer, so I thought I might write some Python and see what I could come up with.

For the purpose of this exercise I will only be looking at the variable rate.  Fixed versus variable mortgages is a discussion for another time.

First, let’s get some upper and lower bounds for the past 25 years.  This period is interesting as it had some of the highest interest rates in Canada’s history.  In the late ’80s people were losing their houses because of the staggering double-digit mortgage rates.  Can you imagine locking in at those rates?  In April 1990 the 5-year variable mortgage rate hit 14.75%.  A year later it had dropped significantly to 9.75% [1].

Mortgage rates:

Max: 14.75%
Min: 2.25%

Assumptions:

Let’s assume that Canadian mortgage rates can change eight times per year [2].

Let’s also assume that on each of these occasions the interest rate can either go up by 0.25%, stay the same, or drop by 0.25%.  Technically it has changed by greater amounts in the past, but we’ll ignore that for this exercise.  Let’s also assume that each of these outcomes is equally likely.  THIS IS NOT A SAFE ASSUMPTION.

Finally, we’ll assume that we are bound to the upper and lower limits seen in the past 25 years.

Oh look… code:

$ cat generate-number.py
# 1 - must be recursive
# 2 - recursion depth is controlled by a variable at the head of the program
# 3 - at each branch there are three EQUALLY LIKELY options: go up, go down, stay the same
# 4 - the amount of the up/down is 0.25%
# 5 - it can't go below 2.25%
# 6 - it can't go above 14.75%

# recursion depth - 2 years, 8 possible changes per year:
MAXDEPTH = 8 * 2

# step size in percentage points (0.25 is exactly representable in binary
# floating point, so the rates are safe to use as dict keys)
DELTA = 0.25

# let's start with an interest rate of 3%
STARTING = 3.0

LIMIT_UPPER = 14.75
LIMIT_LOWER = 2.25

RESULTS = {}

def branch(value, depth):
    # clamp to the historical upper and lower bounds
    if value > LIMIT_UPPER:
        value = LIMIT_UPPER
    elif value < LIMIT_LOWER:
        value = LIMIT_LOWER

    if depth == 0:
        RESULTS[value] += 1
        return
    branch(value + DELTA, depth - 1)
    branch(value, depth - 1)
    branch(value - DELTA, depth - 1)

# initialize a bucket for every possible rate
x = LIMIT_LOWER
while x <= LIMIT_UPPER:
    RESULTS[x] = 0
    x += DELTA

branch(STARTING, MAXDEPTH)

x = LIMIT_LOWER
while x <= LIMIT_UPPER:
    print(repr(x) + ", " + repr(RESULTS[x]))
    x += DELTA

The output for two years looks like this:

$ head 24mo.csv
2.25, 5986176
2.5, 6008616
2.75, 5965368
3.0, 5733267
3.25, 5227722
3.5, 4453608
3.75, 3507168
4.0, 2533664
4.25, 1669076
4.5, 997016

Now, let me be really clear here… this is a piece of code that does exponential expansion through recursion.  If you run a dataset that consists of 8 changes per year, and only 1 year, you’re looking at 3^(8*1) = 6,561 leaf nodes.  Two years of data is 43,046,721 results.  If you want 5 years of data, it’s 3^(8*5) = 12,157,665,459,056,928,801.  That’s a lot of data.  Luckily, we won’t be storing anywhere near that much, but it’s still going to take a while to run.
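The blow-up is avoidable, for what it’s worth: since each step only depends on the current rate, the same frequency table can be built by propagating counts between rate buckets instead of walking every one of the 3^n paths.  A sketch (this is a rewrite, not the original script; rates are stored as integer quarter-points):

```python
# Dynamic-programming equivalent of the recursive walk: carry a count
# per rate bucket and apply the up/same/down step to every bucket at
# once.  Runs in O(steps * buckets) instead of O(3^steps).
LOWER, UPPER = 9, 59    # 2.25% and 14.75%, in units of 0.25%
START, STEPS = 12, 16   # start at 3.00%, 8 possible changes/year for 2 years

counts = {r: 0 for r in range(LOWER, UPPER + 1)}
counts[START] = 1
for _ in range(STEPS):
    nxt = {r: 0 for r in range(LOWER, UPPER + 1)}
    for r, c in counts.items():
        if c:
            nxt[min(r + 1, UPPER)] += c   # rate rises 0.25% (clamped)
            nxt[r] += c                   # rate unchanged
            nxt[max(r - 1, LOWER)] += c   # rate drops 0.25% (clamped)
    counts = nxt

for r in range(LOWER, UPPER + 1):
    print(r / 4, counts[r])
```

This reproduces the recursive version’s output exactly, and makes the 5-year case (3^40 paths) tractable.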

I attempted to use R [3] to make the stats a little easier to deal with, but I was unable to figure out how to do the analysis I wanted in a reasonable amount of time.  I already had the frequency distributions, and although R was able to produce those from raw numbers, I couldn’t find a method to take the summarized data and work back to mean/median/min/max and a couple of percentile values.  Strangely, Excel helped me do this quickly.  Here are my results:
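In hindsight, working back from a frequency table to summary statistics is only a few lines of Python anyway; here’s a sketch that computes the weighted mean and percentiles directly from (value, count) pairs, without expanding them into raw observations:

```python
# Summary statistics straight from a frequency distribution.
def freq_stats(freqs):
    """freqs: list of (value, count) pairs, sorted by value."""
    total = sum(c for _, c in freqs)
    mean = sum(v * c for v, c in freqs) / total

    def percentile(p):
        # smallest value at which the cumulative count reaches p% of total
        target = p / 100.0 * total
        running = 0
        for v, c in freqs:
            running += c
            if running >= target:
                return v
        return freqs[-1][0]

    return {
        "min": freqs[0][0],
        "max": freqs[-1][0],
        "mean": mean,
        "median": percentile(50),
        "p95": percentile(95),
        "p99": percentile(99),
    }
```

Feeding it the rows of 24mo.csv, e.g. `freq_stats([(2.25, 5986176), (2.5, 6008616), …])`, gives the 24-month row of the table below.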

Results

Months   Min    Max   Mean       Median   95th %ile   99th %ile
6        2.25   4     3.003086   3        3.75        4
12       2.25   5     3.032274   3        4           4.25
18       2.25   6     3.074085   3        4.25        4.5
24       2.25   7     3.119716   3        4.25        4.75
30       2.25   8     3.166085   3        4.5         5

The minimum value isn’t surprising.  Nor is the Max.  The average is creeping up simply with time and the opportunity to diverge from the starting point of 3.00.  The median isn’t too surprising.  The percentile values – now those are something I may be able to use.

The 95th percentile value essentially means that in 95% of the simulated cases, the interest rate will be no greater than the corresponding value.  Specifically, in 95% of the cases, the interest rate over 30 months will not exceed 4.5%.

Although this may be good for some VERY ROUGH planning, keep in mind that interest rates are governed by market forces.  I did not apply any likelihood to increasing / decreasing / unchanging interest rates.  It may be the case that interest rates are twice as likely to rise as they are to fall.  If that’s the case, this exercise is seriously flawed.

Now, as far as planning goes, if one were to take the 99th percentile value, one could expect interest rates to climb by 2% over the next 2.5 years.  This kind of growth has precedents.

Conclusions

This may or may not be a good way to approximate what interest rates will do over a period.  I guess in this case some people *are* betting the house on interest rates.  Do so with caution.  Talk to a financial planner.  Do not rely on the data I have provided.  This was a thought exercise and I can’t be held responsible if these numbers are nowhere near accurate.

I’d love to hear how others are planning for the inevitable increase in interest rates.  How are you approximating future interest rates?

[1] http://needamortgage.files.wordpress.com/2010/09/flm-historical-rate-sheets-august-2010.pdf
[2] http://www.bankofcanada.ca/monetary-policy-introduction/key-interest-rate/schedule/
[3] http://www.r-project.org/