There are 2,080 work hours in a year (52 weeks × 40 hours per week).
So technically you should divide by 2,080, but dividing by 2,000 is much faster in your head (drop three zeroes and halve).
Hope that's helpful.
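A quick sketch of the two conversions above (function names are mine, just for illustration):

```python
def hourly_from_salary(annual_salary):
    """Exact conversion: 52 weeks * 40 hours = 2,080 work hours per year."""
    return annual_salary / 2080

def hourly_approx(annual_salary):
    """Mental-math shortcut: drop three zeroes, then halve (divide by 2,000)."""
    return annual_salary / 2000

# A $100,000 salary is about $48/hr exactly, $50/hr by the shortcut.
print(round(hourly_from_salary(100_000), 2))  # 48.08
print(hourly_approx(100_000))                 # 50.0
```

The shortcut overestimates the hourly rate slightly (by about 4%), which is usually fine for a back-of-the-envelope comparison.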
The rule of thumb I arrived at years ago: start by dividing the annual rate by a factor of two to account for the additional tax burden, then divide by another factor of two to account for the additional insurance burden.
Then you start factoring in your other costs.
Years ago, I was contracting with Apple Retail Software Engineering in Cupertino, and making $95/hr. I calculated that they would have to give me a $250,000 annual salary to be able to take home the same amount of money on a monthly basis, and virtually no one at Apple makes that kind of money unless they're an SVP.
So, I was actually quite happy when the contract ended and I got to go home to Austin.
Dividing the salary by the work hours in a year gets you a very close approximation in seconds.
In the next iteration I'm considering letting users choose either option: sliding scale or direct input.