People are wildly wrong when we ask them about many aspects of life in Britain, as highlighted in a new survey by Ipsos MORI for the Royal Statistical Society and King's College London.
Just to pick out three big errors about who lives here: on average, we think 24% of the population (one in every four people!) are Muslim - when the real figure is around 5%; we think 31% are immigrants - when the official figure is 13%; and we think 36% are aged 65+ - when in fact only 16% are.
And it's a similar picture of startlingly wrong perceptions on key social issues, like crime: 58% of us refuse to believe that crime is falling, despite a fairly consistent decline over the last two decades that has left it at around half its 1995 level. It's the same with violent crime - that too is falling, but 51% believe it's rising.
We also have an extraordinary view of the extent of teenage pregnancy: the average guess at the proportion of girls under 16 who get pregnant each year is 15%, whereas the best estimate of the actual figure is well under 1%. And many people are even further out: one in fifteen of the general public think 40% or more of young teenage girls get pregnant each year - that would be at least 12 girls in an average all-girl class of 30.
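To make that classroom arithmetic concrete, here is a minimal sketch using only the figures quoted above (the "well under 1%" estimate is rounded up to 1% to give an upper bound; the class of 30 is the article's own example):

```python
# Rough arithmetic behind the classroom comparison, using only figures
# quoted in the text; the class size of 30 is the article's own example.
CLASS_SIZE = 30

guesses = [
    ("average guess", 0.15),
    ("'one in fifteen of us' guess", 0.40),
    ("best actual estimate (well under)", 0.01),
]

for label, rate in guesses:
    print(f"{label}: {rate:.0%} of a class of {CLASS_SIZE} "
          f"is {rate * CLASS_SIZE:.1f} girls")
```

Even the average guess implies four or five pregnancies in every such class, every year; the reality is closer to one girl across three classes.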
We also take a pretty jaundiced view of how committed other people are to the democratic process. Asked what proportion of people voted in the last general election, we guess 43% on average, when in fact 65% did. This is a problem: if people feel it is the norm not to vote, we're more likely to see falling turnout.
We're also wildly wrong about what the government spends our money on, and about which cuts would save the most. For example, as we often find, people grossly overestimate the amount spent on foreign aid: a quarter of us think it is one of the 2-3 things government spends the most money on, when it is actually only around 1% of expenditure. More people pick foreign aid as the top item of expenditure than pick state pensions - but we spend nearly 10 times as much on pensions as on aid.
Not surprisingly, then, people are just as wrong about the relative impact of different benefit cuts. From our list, the one that people think would save the most is capping benefits so that no household receives more than £26,000 per year. In fact this saves a (relatively speaking) very modest £185-290m per year, depending on whose figures you use. Another item on our list - raising the pension age to 66 for both men and women - saves 17 times as much (£5bn), yet people are twice as likely to think the household benefit cap saves the most.
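The multiple is easy to check from the figures quoted above; a minimal sketch:

```python
# Comparing the two savings figures quoted above, in £m per year.
benefit_cap_saving = (185, 290)  # household benefit cap, range of estimates
pension_age_saving = 5_000       # raising the pension age to 66 (~£5bn)

for cap in benefit_cap_saving:
    multiple = pension_age_saving / cap
    print(f"against a cap saving £{cap}m, the pension-age change "
          f"saves {multiple:.0f}x as much")
```

The quoted "17 times" corresponds to the top of the cap's savings range; against the lower £185m figure the multiple is nearer 27.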
The biggest single error in our survey is on the scale of benefit fraud: people think that out of every £100 spent on benefits, £24 is claimed fraudulently (one in every four pounds!), when the best government estimate is that the actual figure is only around 70p.
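Pulling together the perception/reality pairs quoted in this piece shows just how far out that is (a minimal sketch; the grouping is mine, not the survey's, and benefit fraud's £24 per £100 is expressed as 24%):

```python
# Perception vs reality for the headline figures in this piece,
# all taken directly from the text above.
items = {
    "Muslim population": (24, 5.0),
    "immigrants":        (31, 13.0),
    "aged 65+":          (36, 16.0),
    "voted at last GE":  (43, 65.0),
    "benefit fraud":     (24, 0.7),
}

for name, (guess, actual) in items.items():
    print(f"{name:>17}: guess {guess}% vs actual {actual}% "
          f"- out by a factor of {guess / actual:.1f}")
```

On this crude ratio, benefit fraud is the outlier at roughly 34 times the official estimate - and turnout is the one case where our guess falls short of reality rather than exceeding it.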
But this points to one of the key findings from the survey: when we ask people what they had in mind as benefit fraud when they guessed at its scale, they select items that would never be counted as actual fraud - in people's minds it includes claimants not having paid tax in the past, and people having children so they can claim more benefits.
So we shouldn't dismiss these estimates as worthless just because they are so wildly wrong - they are often measuring something different. It's true that they reflect problems with statistical literacy: people really struggle with very large and very small numbers; they find it hard to distinguish between rates and levels; and they need time and repeated exposure to notice change.
It's also true that our misperceptions reflect the media's treatment of issues and the political discourse, where, naturally, the focus is often on vivid anecdotes rather than on the hard figures or the scale of an issue. This is no accident, and to a large extent we get what we ask for: we admit ourselves that we base our views more on personal experience and anecdote than on hard facts.
But our misperceptions also reflect our concerns - and this is why any number of "myth-busting" exercises are likely to have limited impact. Our exaggerated estimates are as much an effect as a cause of our concerns. Academics call this "emotional innumeracy": we're making a point about what's worrying us, whether we know it or not.
But we also shouldn't accept our over-estimates as unchangeable: reducing misunderstanding is still important. We need to keep working on statistical literacy and on people's confidence to challenge a figure or a story, through education that starts in schools. And we need the UK Statistics Authority to continue challenging the misuse of statistics, and organisations like FullFact to keep highlighting the issues.
But we also need to accept that people are more like Einstein than their answers to our survey might suggest: as he reputedly said, if the facts don't fit the theory, change the facts. Many of us do.