I want to set up a dev and test environment for a website that has a huge MySQL database. Since the DB is so big, cloning it one-to-one is quite inconvenient: it simply takes too long and uses too much storage.
My idea was to dump/copy only certain records matching some SQL, for example only the last 1000 users and the last 1000 products in the DB, assuming that much data is enough for testing and development.
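Something like mysqldump's `--where` option is what I had in mind for this; a rough sketch (the table, column, and cut-off value are placeholders for my real schema):

```sh
# Dump only recent rows from one table; --where is appended to the
# SELECT that mysqldump runs, so each table gets its own filter.
# --single-transaction takes a consistent InnoDB snapshot without
# locking the live tables.
mysqldump --single-transaction \
  --where="id > 123000" \
  mydb users > users_sample.sql
```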
My questions are:
- How do I best go about this, keeping in mind that the DB is live at the time of the dump (referential integrity), and how do I keep it from going offline?
- The data is complex and has many joins, which makes the insert into ... select ... statements a bit complex, but still possible (see the sketch after this list).
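By that I mean statements along these lines (`livedb`/`devdb` and the table names are made up, and this assumes the sample schema sits on the same MySQL server as the live one, to be dumped to my laptop afterwards):

```sql
-- Copy only the orders that belong to the already-sampled users,
-- so the foreign keys in the sample stay valid.
INSERT INTO devdb.orders
SELECT o.*
FROM livedb.orders AS o
JOIN devdb.users AS u ON u.id = o.user_id;
```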
My idea was to use mysqldump to set up only the structure (no data), then use a long list of insert into ... select ... statements to fill the tables one by one.
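For the structure-only part I was thinking of something like this (`mydb`/`devdb` are placeholder schema names):

```sh
# Dump table definitions and stored routines, but no rows, then
# load them into the empty dev schema.
mysqldump --no-data --routines mydb > schema.sql
mysql devdb < schema.sql
```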
The only issues I am uncertain about are referential integrity and processing the tables in the right order. Or maybe there is a better way?
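One workaround I'm considering for the ordering problem is to disable foreign key checks for the load session, assuming the SELECTs themselves only pull consistent subsets (untested sketch):

```sql
-- FK checks off for this session only, so tables can be filled in
-- any order; re-enabled once all inserts have run.
SET FOREIGN_KEY_CHECKS = 0;
-- ... all the INSERT INTO ... SELECT statements ...
SET FOREIGN_KEY_CHECKS = 1;
```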
The DEV environment is my personal laptop, so not a server with huge disks.