I’m not being completely serious, but I genuinely do see the database as a means of persisting object state. I used to take exactly the opposite position, but I grew up a while ago ;-).
My preferred M.O. is to essentially ignore the database design early in the development process, then pay gradually more attention as performance begins to suffer. Unless you have a really simple application, data retrieval will eventually drag performance down to an unacceptable level.
I have had my teams use simple text files and object serialization instead of RDBMSs, but ultimately you need to consider (real) persistence.
It’s a good idea to add performance smoke tests early and keep an eye on them. Once you can extrapolate some numbers in terms of transactions required and elapsed times, do some analysis and see whether they fall within a tolerable range.
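The extrapolation step can be sketched roughly like this — a minimal smoke-test skeleton, assuming a stand-in `saveRecord` method (replace it with a call into your real DAO or mapper) and a made-up daily volume:

```java
public class PersistenceSmokeTest {

    // Stand-in for a real persistence call; swap in your actual data-access code.
    static void saveRecord(int i) {
        // simulate persistence work
    }

    // Times a sample of operations and extrapolates to the minutes needed
    // to handle dailyVolume operations at the same rate.
    static double projectedMinutes(int sample, long dailyVolume) {
        long start = System.nanoTime();
        for (int i = 0; i < sample; i++) {
            saveRecord(i);
        }
        double elapsedSec = (System.nanoTime() - start) / 1_000_000_000.0;
        return elapsedSec * (dailyVolume / (double) sample) / 60.0;
    }

    public static void main(String[] args) {
        // Hypothetical numbers: a 10,000-op sample, 10 million ops/day target.
        double minutes = projectedMinutes(10_000, 10_000_000L);
        System.out.printf("Projected time for daily load: %.1f minutes%n", minutes);
    }
}
```

Run something like this on every build; the moment the projected number drifts out of your tolerable range, it’s time to start paying attention to the database design.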
This is where a mapping tool like iBATIS is great. You control the actual database access methods and map them to your classes in arbitrarily complex ways, which leaves your options more open than an ORM like Hibernate might. iBATIS also tends to handle legacy databases well, which helps as you (hopefully) migrate to something more manageable.
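To make that concrete, here is a sketch of an iBATIS SQL map — the class, table, and column names (`Account`, `LEGACY_ACCT`, etc.) are illustrative, not from any real schema. The point is that the SQL stays hand-written against the legacy table, while the `resultMap` handles the translation into your object model:

```xml
<!-- Hypothetical sqlMap; names are placeholders for your own classes and tables -->
<sqlMap namespace="Account">

  <!-- Map ugly legacy column names onto a clean domain object -->
  <resultMap id="accountResult" class="com.example.Account">
    <result property="id"   column="ACCT_ID"/>
    <result property="name" column="ACCT_NAME"/>
  </resultMap>

  <!-- You own the SQL, so it can be as simple or as gnarly as the schema demands -->
  <select id="getAccount" resultMap="accountResult" parameterClass="int">
    SELECT ACCT_ID, ACCT_NAME
    FROM LEGACY_ACCT
    WHERE ACCT_ID = #value#
  </select>

</sqlMap>
```

Because the mapping lives in one file, you can later point `getAccount` at a cleaned-up table without touching the calling code — which is exactly the flexibility you want during a migration.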
In my current situation, we’re looking at daily production of tens of millions of rows, with hundreds of millions to billions retained and maintained over time, and there is legacy data to consider. Ouch.
It would be great if you could count on the ActiveRecord pattern exclusively, but you really can’t.