“There are three types of lies – lies, damned lies, and statistics.” – Benjamin Disraeli
Local governments want more of your money, but they want you to hand it over without a fight, so they pull out statistics to convince taxpayers to open their wallets. Here are just a few things to look for to see if you’re being manipulated by the numbers.
Omissions: Leaving out something significant that, if known, would lead the reader to a different conclusion from the data presented:
- When a city, in an attempt to allay fears of a tax increase, says its tax rate has stayed the same or is lower than it was before but neglects to reveal that your property valuation has increased substantially and, therefore, the amount you pay is going up.
- When the city tells you residents are willing to accept a property tax increase for road maintenance without telling you that 64% of Minnetrista residents indicated they opposed any property tax increase for roads in the community survey (see below).
Using percentages from a small sample size: When percentages are drawn from a very small sample, they will almost always mislead:
Minnetrista’s community survey asked respondents whether they would favor or oppose an increase in property taxes for city street repair/maintenance, and 64% said they’d oppose an increase. A very small number (128 people out of roughly 7,000 city residents) indicated they’d favor one. That subset of 128 respondents was then asked how much more they’d be willing to pay. When they indicated various amounts ($5-$30/mo), it was repeated over and over again that, according to the community survey, the majority of people are willing to accept an increase in their city property taxes for roads. That “majority” was actually just 118 people (92% of the 128), while 64% of those surveyed said they were opposed. Starting to get the picture of how this works?
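A quick back-of-the-envelope check, sketched in Python using only the figures quoted above, shows how small that “majority” really is once you express it against the whole city rather than the hand-picked subset:

```python
# Figures quoted from the Minnetrista community survey example above.
residents = 7000       # approximate number of city residents
favor_subset = 128     # respondents who favored a tax increase

# The headline "92% are willing to pay more" applies only to that subset.
willing = round(favor_subset * 0.92)

print(f"People behind the headline: {willing}")          # 118 people
print(f"Share of all residents: {willing / residents:.1%}")  # under 2%
```

The same 118 people can be spun as “92% support” (of the favorable subset) or as well under 2% of residents; which denominator gets quoted is the whole trick.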
Faulty polling: How questions are phrased can influence responses dramatically. A deceptive polling strategy is to precede a question with a narrative designed to prejudice the response or to omit (see above) important data qualifiers. The examples below use a combination of both omission and faulty polling strategies:
- Minnetrista’s community survey precedes a question (#49) about whether or not the city should build a gun range by stating that “there is an unfinished gun range” and that, “if finished,” it would be used by residents. Describing something as “unfinished” implies it has been started (it has not) but never completed, and it nudges respondents toward a positive answer, since people are generally averse to leaving things “unfinished.” The truth of the matter is that there is empty space with nothing in it that could be built out as a gun range. The question also omitted the fact that there would be significant, ongoing annual operational and maintenance costs that would increase residents’ taxes over and above the build-out costs. Had that been revealed and the phrasing been less biased, the responses would likely have been much different. Even so, there was little support for using tax dollars to fund the build-out, and one would assume no support for tax dollars to fund the maintenance (had respondents been aware of it).
- This one is my favorite: the survey precedes a question (#74) about the approval rating of the Mayor and Council with a question revealing that the majority of respondents know “very little” to “nothing at all” about the work of the Mayor and Council, but then goes on to ask if they approve or disapprove of the job the Mayor and Council are doing. Remember that the next time you hear about the council’s 80% approval rating. Apparently ignorance is bliss.
Community surveys are merely vehicles designed to justify tax increases and reelect the incumbents who support them.