> Adding persistence logic to a data object adds all kinds of bloat
That's not what you do if you write your own queries. That's what a novice does after learning object-oriented programming.
For a more accurate example, look at things like Hibernate / NHibernate / Entity Framework and lazy loading. They generate subclasses of your entity classes at runtime and will transparently run queries as business logic navigates relationships in an object graph. It's "not wrong," but it can lead to all kinds of problems and painful refactors. (But it's totally "worth it" in prototypes, one-offs, and throwaway code.)
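To make the mechanism concrete, here's a toy sketch in plain Java. This is *not* how Hibernate is actually implemented; the `Supplier`-based `LazyRef` and the `Customer` class are hypothetical stand-ins for the runtime-generated proxy that would issue a SQL query the first time business logic touches a relationship:

```java
import java.util.List;
import java.util.function.Supplier;

// Hypothetical stand-in for the proxy an ORM substitutes for a
// related collection: nothing is fetched until the first access.
class LazyRef<T> {
    private final Supplier<T> loader; // a real ORM would run a query here
    private T value;
    private boolean loaded = false;

    LazyRef(Supplier<T> loader) { this.loader = loader; }

    T get() {
        if (!loaded) {            // first navigation triggers the "query"
            value = loader.get();
            loaded = true;
        }
        return value;
    }

    boolean isLoaded() { return loaded; }
}

class Customer {
    // In a real ORM this field would be populated with a proxy at load
    // time; touching it mid-business-logic transparently hits the database.
    final LazyRef<List<String>> orders =
        new LazyRef<>(() -> List.of("order-1", "order-2")); // fake query result
}
```

The point of the sketch: the query happens wherever the *navigation* happens, which is exactly why innocent-looking loops over an object graph can turn into N+1 query storms.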
If you don't do lazy loading, Entity Framework still requires that you pre-declare which relationships you will traverse. It's intended that your business logic uses Entity Framework APIs to say what part of the object graph it will use. (Thus tightly coupling your application to your ORM, which means you're trading tight coupling to a database for tight coupling to a framework / library.)
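Sketched in Java rather than C#, here's a hypothetical repository that mimics that `.Include(...)`-style pre-declaration (all names invented): the caller must name the relationships up front, which is exactly the coupling point described above.

```java
import java.util.List;
import java.util.Set;

// Toy illustration, not EF or JPA: the caller declares which
// relationships to hydrate, as with EF's .Include(...) calls.
class OrderGraph {
    final String id;
    final List<String> lines; // null unless "lines" was requested up front
    OrderGraph(String id, List<String> lines) {
        this.id = id;
        this.lines = lines;
    }
}

class OrderRepository {
    OrderGraph find(String id, Set<String> include) {
        // A real implementation would translate `include` into SQL joins.
        List<String> lines = include.contains("lines")
            ? List.of("widget x2", "gadget x1") // would come from a JOIN
            : null; // not loaded; navigating it later simply fails
        return new OrderGraph(id, lines);
    }
}
```

Note that the *caller* (your business logic) now has to know how data access works, which is the coupling being traded for.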
But keep in mind that the ORMs I mentioned tie the objects to the database connection. It isn't quite as intense as "adding persistence logic to a data object," but they do track it: if you modify an object, it can only be saved through the session / context it was loaded from; and if you're using lazy loading, it only works while that session is still open.
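The shape of that failure mode, as a toy: a hand-rolled "session" whose lazy loads only work while it's open. (In Hibernate the real-world symptom is a `LazyInitializationException` when a view or later layer touches an unloaded relationship after the session closed; the class below is invented for illustration.)

```java
import java.util.List;

// Toy "session": lazy loads are bound to its lifetime, as with an
// ORM session / DbContext. Not a real Hibernate API.
class ToySession {
    private boolean open = true;

    void close() { open = false; }

    // Stand-in for navigating a lazy collection bound to this session.
    List<String> loadOrders() {
        if (!open) {
            // Roughly what LazyInitializationException signals in Hibernate.
            throw new IllegalStateException("session is closed");
        }
        return List.of("order-1");
    }
}
```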
If you build a layer around your data access code that fully shields your business logic from your ORM, you've "done it right." But at that point the ORM's value becomes negligible: from the perspective of your business logic, it doesn't matter whether your data access layer contains hand-optimized SQL or an ORM. And you've lost one of the real selling points of an ORM, which is that you can easily do your data access from within your business logic. Which is why I say that the biggest mistake is assuming you should, or shouldn't, use an ORM.