Hi, I saw
db/development_structure.sql on https://github.com/thoughtbot/guides/tree/master/best-practices
I am wondering: when a project grows big, how do you test and develop features? (Sometimes developing a new feature requires real data, and you need to open a browser to test it.)
My solutions are:
Write a rake task to generate fake data that stays close to the current state of the app, but this only works while the project is still in beta or still small.
When the project is medium-sized, we dump real data to a staging server, or connect to a replica database server to develop features.
I haven't found a graceful way to handle big databases over the years; it's painful for us. Currently I have a big database with 4M records, and it's almost torture for my team. We ended up dumping our data to a separate RDS instance and connecting to it to develop the website, because our laptops can't handle running such a big database. And my next client's project will probably have 20M records.
I'm wondering, are there "best practices" or a better way to handle these things?
Hi, @xdite, sorry for the delay in getting you a response. This, unfortunately, is not a simple issue, but it's one we faced ourselves on Airbrake.
Yes, on Airbrake we would often rely on staging having a dump from production and do final testing there.
We also had a rake task that could load some fake data into the system for local development.
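A minimal sketch of what such a fake-data rake task can look like, using in-memory arrays as stand-ins for ActiveRecord models (all names here are hypothetical, not Airbrake's actual code):

```ruby
require "rake"

# Stand-ins for ActiveRecord models; in a real Rails app you'd call
# User.create!(...) etc. instead of pushing into arrays.
FAKE_USERS  = []
FAKE_ERRORS = []

extend Rake::DSL

namespace :dev do
  desc "Seed a small, representative set of fake records for local development"
  task :seed_fake_data do
    25.times do |i|
      FAKE_USERS << { email: "user#{i}@example.com", plan: i.even? ? "free" : "pro" }
    end
    # Keep the big table small: a couple thousand rows is enough to exercise
    # pagination and charts without overwhelming a laptop.
    2_000.times do |i|
      FAKE_ERRORS << { message: "Boom ##{i}", occurred_at: Time.now - i * 60 }
    end
  end
end

# In a real app you'd run `rake dev:seed_fake_data`; invoking directly here
# just demonstrates the task.
Rake::Task["dev:seed_fake_data"].invoke
```

The key idea is that the generated data mirrors the shape of production (a few users and projects, many error rows) rather than being uniformly tiny.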
In addition, we developed a rake task that would export only a subset of the data from staging or production as an SQL dump, which could then be imported locally. If you're familiar with Airbrake, almost all of the data size was taken up by one table: the table that held every individual occurrence of an error. That table had hundreds of millions of rows.
Our script was able to produce a dump that included all the users, projects, error summaries, etc., but only exported a couple thousand error occurrences.
This allowed us to strike a balance between testing on completely fake data and testing with a full database.
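For MySQL, the partial-dump idea boils down to two `mysqldump` invocations: one that skips the huge table, and one that samples a few thousand rows from it via `--where`. The database name, table name, and file names below are hypothetical; this is a sketch, not our original script:

```ruby
# Builds the two shell commands for a partial MySQL dump.
# `db`, `big_table`, and the output file names are hypothetical placeholders.
def partial_dump_commands(db:, big_table:, sample_rows: 2_000)
  # 1) Dump everything except the huge table.
  full = %(mysqldump #{db} --ignore-table=#{db}.#{big_table} > small_tables.sql)
  # 2) Dump only a few thousand rows of the huge table; `--where` takes a bare
  #    SQL condition, and appending LIMIT is a common trick to cap the rows.
  sample = %(mysqldump #{db} #{big_table} --where="1 LIMIT #{sample_rows}" > #{big_table}_sample.sql)
  [full, sample]
end
```

Importing both files locally then gives you a complete schema with a full set of small tables and only a sample of the giant one.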
We were using MySQL at the time, but in case you're using Postgres, I did a quick search and found what seems to be a reasonable way to do this partial dump on Postgres.
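On Postgres, a similar split is possible with `pg_dump --exclude-table-data` (which keeps the big table's schema but skips its rows) plus psql's `\copy` with a LIMITed query for the sample. A sketch under the same hypothetical names:

```ruby
# Builds commands for a partial Postgres dump; all names are hypothetical.
def pg_partial_dump_commands(db:, big_table:, sample_rows: 2_000)
  # Full schema plus all data EXCEPT the huge table's rows.
  schema_and_rest = %(pg_dump --exclude-table-data=#{big_table} #{db} > partial.sql)
  # Export only a recent sample of the huge table via psql's \copy.
  copy_sql = "\\copy (SELECT * FROM #{big_table} ORDER BY id DESC LIMIT #{sample_rows}) " \
             "TO '#{big_table}_sample.csv' WITH CSV HEADER"
  sample = %(psql #{db} -c "#{copy_sql}")
  [schema_and_rest, sample]
end
```

Restoring `partial.sql` and then `\copy`-ing the CSV back in should give a local database with the same shape as production at a fraction of the size.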