Yeah, you've reinvented the 'mean absolute deviation' (or 'average absolute deviation'), which might actually be better than variance or standard deviation (the square root of variance) in some circumstances. It's been debated for 100 years: http://www.leeds.ac.uk/educol/documents/00003759.htm
Part of the reason for using variance might be, like you said, to give more weight to outliers: squaring the deviations makes the large ones count disproportionately.
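To see that outlier effect concretely, here's a small sketch (the ratings and the outlier value are made up for illustration) comparing mean absolute deviation with standard deviation before and after one extreme value is added:

```python
import statistics

def mean_abs_dev(xs):
    """Average absolute distance from the mean."""
    m = statistics.fmean(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

ratings = [6, 7, 8, 9, 10]        # hypothetical ratings
with_outlier = ratings + [20]     # same data plus one extreme value

# Squaring means the outlier's contribution grows quadratically, so the
# standard deviation jumps by a larger factor than the mean absolute
# deviation does when the outlier is added.
for xs in (ratings, with_outlier):
    print(mean_abs_dev(xs), statistics.pstdev(xs))
```

Running it, the standard deviation grows by a larger multiple than the MAD once the outlier appears, which is the "more weight to outliers" behavior in action.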
Part of the reason variance and standard deviation might be more popular is that a lot of real-world data is roughly normally distributed, and a normal distribution is completely described by its mean and standard deviation. There are also all these formulas and calculations, invented before computers, that are easier to do with variance and standard deviation: squared terms are easy to manipulate algebraically and to differentiate, and the variance of a sum of independent variables is just the sum of their variances. Manipulating equations with absolute values is trickier (the kink at zero makes the calculus awkward).
There are also some mental shortcuts you can take if you know the standard deviation of a set of data. If ratings are roughly normally distributed, about 95% of them fall within 2 standard deviations of the mean. So if you want to buy a car rated 8 on average and want to be at least 95% sure that the particular car you buy is a 7 or better, check that the standard deviation of the ratings is at most 0.5 (strictly speaking that gives you about 97.5% confidence, since only the lower tail disqualifies a car). Probably not a great example, since you only buy one car. Imagine instead you are buying oranges that average 8 out of 10 quality-wise, and you want about 95% of them to be at least a 7, so that you don't have to throw out too many. See https://en.wikipedia.org/wiki/68%E2%80%9395%E2%80%9399.7_rul...
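You can sanity-check that rule with a quick simulation. This sketch assumes the orange scores really are normally distributed with a mean of 8 and a standard deviation of 0.5 (both numbers invented for the example):

```python
import random
import statistics

# Hypothetical orange-quality scores: normal with mean 8 and sd 0.5,
# so mean - 2*sd = 7, the cutoff from the example above.
random.seed(0)
scores = [random.gauss(8, 0.5) for _ in range(100_000)]

sd = statistics.pstdev(scores)
within_2sd = sum(abs(s - 8) <= 2 * sd for s in scores) / len(scores)
at_least_7 = sum(s >= 7 for s in scores) / len(scores)

print(within_2sd)  # close to 0.95: the two-sided 2-sigma band
print(at_least_7)  # closer to 0.975: only the lower tail falls below 7
```

The second number is higher than the first because "at least a 7" only cuts off one tail of the distribution, which is why the shortcut is slightly conservative.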
I don't know, here are some other suggested reasons for using variance & standard deviation instead of absolute mean differences: https://stats.stackexchange.com/questions/118/why-square-the... https://www.quora.com/Why-do-we-square-instead-of-using-the-...