Will Buggey

Microsoft Excel: 5 bad practices too many accountants use

Forget the clichés – most accountants I encounter use poor spreadsheet techniques. Improve your Microsoft Excel skillset by avoiding these 5 common pitfalls.

1. Use of VLOOKUP

I have yet to hear a convincing argument for using VLOOKUP instead of the more robust INDEX/MATCH combination. There are two classic shortfalls with VLOOKUP: first, it can't work ‘backwards’ (i.e. with the result column to the left of the lookup column). Second, most users routinely hard-code the column index number, so if a column is added or deleted between the lookup column and the result column, the formula returns unintended results – often without throwing an error, so the mistake doesn't stand out. I lose confidence in someone's technical awareness the moment I see a VLOOKUP in use – conversely, when I see a solid INDEX/MATCH combination, the individual responsible goes right up in my estimation!
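
To illustrate (the cell references here are hypothetical): suppose account codes sit in column A, descriptions in column C, and the code you want to look up in E2. The VLOOKUP version is:

=VLOOKUP(E2,$A:$C,3,FALSE)

The INDEX/MATCH equivalent, which survives column insertions and works in either direction, is:

=INDEX($C:$C,MATCH(E2,$A:$A,0))

MATCH finds the row of the code in column A (the 0 forces an exact match) and INDEX returns whatever sits on that row in column C – no hard-coded column number anywhere.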

2. Manually updating PivotTable data range

This is a recipe for disaster! If your data range is hard-coded (something like Sheet1!$A$1:$I$1000), you WILL one day forget to redefine it when you extract the latest data, and if the latest data has more rows than the old data, your PivotTable will be showing numbers from an incomplete dataset. If you're lucky and your new data contains fewer rows than the old data, you'll still encounter the irritation of an entry for “(blank)” in your PivotTable! The solution is a dynamic named range that resizes automatically when the data changes. Something like OFFSET($A$1,0,0,COUNTA($A:$A),COUNTA($1:$1)) should do the trick.
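
As a sketch of how this looks in practice (assuming your data starts at A1 and neither column A nor row 1 contains blanks): define a name via Formulas > Name Manager – called, say, Data – with the refers-to formula

=OFFSET(Sheet1!$A$1,0,0,COUNTA(Sheet1!$A:$A),COUNTA(Sheet1!$1:$1))

then point the PivotTable source at Data instead of a fixed range. Every refresh will then pick up exactly as many rows and columns as the latest extract contains.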

3. Routinely dragging down formulas

This is similar to #2 above. You refresh your source data and it has more or fewer rows than before. You then manually deal with formulas you've positioned alongside the data, either dragging them down (the ‘more rows than before’ scenario) or deleting now-superfluous rows (fewer rows than before). My preferred approach here is two-pronged: I take formulas down to a depth that comfortably (but not extravagantly) covers my expected maximum row volume, and I set up a checksum with conditional formatting that makes things loudly, visibly ugly if the dataset size breaches the allowed-for level. Others will mess around with macros/VBA for this sort of thing, but I advocate keeping things simple when it comes to spreadsheets.
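
A minimal sketch of the checksum idea, assuming formulas are taken down to row 5,000 and column A of the source data is always populated: place something like

=IF(COUNTA($A:$A)>5000,"DATA EXCEEDS FORMULA RANGE","OK")

in a prominent check cell, then add a conditional formatting rule that turns the cell bright red whenever it shows anything other than “OK”.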

4. Too many tabs

My heart sinks when I open a spreadsheet and it is full of tabs that I need to cycle through to make sense of things. Perhaps there's a tab for every entity within a group of companies, with each containing a trial balance or something like that. Generally, it's cleaner to go downwards within a single dataset tab where possible (with, to continue the group of companies example, a company identifier column) and to feed this through to a shared presentation layer tab (perhaps with dropdown selection for the company). This way you have fewer formula variations to manage and there is less ambiguity for anyone needing to understand how the spreadsheet hangs together.
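
As a hedged example (the layout is invented): say a single Data tab holds Company in column A, Account in column B and Amount in column C. The presentation tab can then offer a data-validation dropdown of company names in B1 and pull each figure with

=SUMIFS(Data!$C:$C,Data!$A:$A,$B$1,Data!$B:$B,$A5)

Switching company becomes a single dropdown click, and there is one formula pattern to maintain rather than one per tab.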

5. Dispersed and/or sporadic cell ranges

I tend to favour whole-column ranges (e.g. $A:$A) where possible – SUMIFS and COUNTIFS are great formulas for this (as an aside, you might as well forget that the single-criteria SUMIF and COUNTIF originals still exist). If you find yourself summing a few cells here and a few cells there, you're probably creating something whose integrity will quickly be found wanting. Cutting, copying, pasting, deleting or inserting cells, columns or rows can easily and inadvertently break such references. In an extension of #4 above, the risk is heightened when you're referring to cells on another tab (especially where that tab serves a presentation purpose as opposed to just storing data).
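
For example (references again hypothetical), rather than summing scattered cells with =C2+C7+C15+C23, a whole-column conditional sum such as

=SUMIFS(Data!$C:$C,Data!$A:$A,"Sales",Data!$B:$B,$A2)

keeps working when rows are inserted, deleted or re-sorted, because it keys off the data itself rather than physical cell positions.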