sql - Best practices for daily MySQL (partial and filtered) replication?


I have a reasonably large database with more than 40 tables. I only need to replicate a few of those tables (+/- 5), and each of those tables should be filtered.

I'm looking for best practices for replicating this data (daily is often enough), where I can select a few tables and include a WHERE clause for each table.

I'm thinking of starting with a mysqldump for each table (with a WHERE clause) and writing each table to a separate .sql file. On the destination DB I can then truncate the tables (all data is overwritten daily) and run mysql to import each table separately.

example:

    # dump one table, filtered by a WHERE clause
    mysqldump -u repl_user my_database my_table -w 'id between 1000 and 1005' > my_table.sql
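Putting that idea together, the daily job could look like the following sketch. It generates the dump and truncate-and-reload commands so you can review them before piping them to sh or wiring them into cron; the table names, WHERE clauses, paths, and the second `orders` table are all illustrative placeholders, not part of the original setup:

```shell
#!/usr/bin/env bash
# Generate the daily dump + truncate-and-reload commands for each filtered
# table. Review the printed commands, then pipe them to sh or put them in a
# cron script. Table names and WHERE clauses are illustrative placeholders.
set -euo pipefail

SRC_DB=my_database
DST_DB=my_database
DUMP_DIR=/var/backups/partial-repl

# Map each table to the body of its WHERE clause.
declare -A FILTERS=(
  [my_table]='id between 1000 and 1005'
  [orders]='created_at >= curdate() - interval 1 day'
)

CMDS=()
emit() { CMDS+=("$1"); echo "$1"; }

emit "mkdir -p $DUMP_DIR"
for table in "${!FILTERS[@]}"; do
  # --no-create-info dumps data only, so TRUNCATE + reload keeps the schema.
  emit "mysqldump -u repl_user --no-create-info -w '${FILTERS[$table]}' $SRC_DB $table > $DUMP_DIR/$table.sql"
  emit "mysql -u repl_user $DST_DB -e 'TRUNCATE TABLE $table'"
  emit "mysql -u repl_user $DST_DB < $DUMP_DIR/$table.sql"
done
```

Keeping one .sql file per table means a failed import can be retried table by table instead of re-running the whole job.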

I'm aware that I could replicate the full database and use the BLACKHOLE storage engine for the unwanted tables, but since 35 of the tables are not needed, that seems like overkill. Besides, some tables need a filtered version, and that can't be solved with BLACKHOLE.

Any better solutions?

MySQL natively supports replication filters, but only at the database or table level. That doesn't meet your requirement of filtering a subset of rows from these tables.
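For reference, those table-level filters are configured on the replica, e.g. in my.cnf; the database and table names below are placeholders:

```
# Replica-side filters: only changes to these tables are applied
# from the source's binary log.
[mysqld]
replicate-do-table = my_database.my_table
replicate-do-table = my_database.another_table
# Note: there is no row-level (WHERE) filtering option at this layer.
```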

The Flexviews tool reads the binary log and replays the relevant changes to keep a materialized view up to date. You can define the materialized view in such a way that it implements your table filtering.

