The "DB::Exception: Cannot backup table" error occurs when ClickHouse is unable to include a specific table in a backup operation. The CANNOT_BACKUP_TABLE error code is raised when the table's engine does not support backup, the user lacks the necessary permissions, or the table is in a state that prevents it from being backed up.
Impact
The backup operation fails for the problematic table. Depending on how the backup command is structured, this may cause the entire backup to fail or just skip the affected table. Either way, you end up with an incomplete backup that does not cover all the data you intended to protect.
Common Causes
- Unsupported table engine -- some table engines, such as Kafka, RabbitMQ, JDBC, or ODBC, represent external data streams rather than stored data and cannot be backed up.
- Temporary tables -- temporary tables exist only for the session duration and are not eligible for backup.
- Insufficient permissions -- the user executing the backup does not have SELECT access on the table.
- Table is in a read-only or detached state -- tables that are read-only due to disk issues or that have been detached cannot participate in backups.
- Live views and window views -- these special table types maintain state that cannot be captured in a standard backup.
- Concurrent DDL operations -- the table is being altered, renamed, or dropped while the backup is running.
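When scripting backups, the eligibility rules above can be encoded as a simple engine check. A minimal Python sketch follows; the engine set is illustrative, not an official ClickHouse list, so adjust it for your version:

```python
# Illustrative set of engines that cannot be backed up -- verify against
# your ClickHouse version's documentation before relying on it.
UNSUPPORTED_ENGINES = {"Kafka", "RabbitMQ", "JDBC", "ODBC", "LiveView", "WindowView"}

def backup_eligible(engine: str) -> bool:
    """Return True if a table with this engine can be included in a backup."""
    return engine not in UNSUPPORTED_ENGINES

# Example rows, shaped like the result of:
#   SELECT name, engine FROM system.tables WHERE database = 'my_db'
tables = [("events_stream", "Kafka"), ("events_stored", "MergeTree")]

# Keep only the tables whose engine supports backup.
eligible = [name for name, engine in tables if backup_eligible(engine)]
print(eligible)  # only the MergeTree-backed table survives the filter
```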
Troubleshooting and Resolution Steps
1. Identify the problematic table. The error message typically names the table:

   ```
   DB::Exception: Cannot backup table my_db.kafka_events: Table engine Kafka does not support backups
   ```

2. Check the table engine:

   ```sql
   SELECT database, name, engine
   FROM system.tables
   WHERE database = 'my_db' AND name = 'kafka_events';
   ```

3. Exclude unsupported tables from the backup. Back up specific tables instead of the entire database:

   ```sql
   BACKUP TABLE my_db.table_a, my_db.table_b, my_db.table_c
   TO Disk('backups', 'my_backup');
   ```

4. For Kafka/RabbitMQ tables, back up the target MergeTree table that stores the consumed data instead of the stream table itself:

   ```sql
   -- Instead of backing up the Kafka table, back up where data lands
   BACKUP TABLE my_db.events_stored TO Disk('backups', 'my_backup');
   ```

5. Check and fix permissions:

   ```sql
   GRANT SELECT ON my_db.my_table TO backup_user;
   ```

6. If the table is detached, re-attach it first:

   ```sql
   ATTACH TABLE my_db.my_table;
   -- Then retry the backup
   ```

7. Avoid running DDL operations during backup windows. Schedule schema changes outside of backup times.
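Backing up an explicit table list, as shown above, lends itself to automation. Here is a small Python helper that assembles the statement; the function name and parameters are hypothetical, not part of any ClickHouse client API:

```python
def build_backup_sql(database: str, tables: list[str], disk: str, name: str) -> str:
    """Build a BACKUP TABLE statement for an explicit list of tables.

    `disk` and `name` correspond to the Disk('...', '...') backup destination.
    """
    qualified = ", ".join(f"{database}.{t}" for t in tables)
    return f"BACKUP TABLE {qualified} TO Disk('{disk}', '{name}')"

# Generate the statement for a pre-filtered table list.
sql = build_backup_sql("my_db", ["table_a", "table_b"], "backups", "my_backup")
print(sql)
```

Pass the resulting string to whatever client you use to execute SQL against ClickHouse.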
Best Practices
- Maintain a list of tables that support backup and target those explicitly in your backup commands rather than backing up entire databases blindly.
- For streaming engine tables (Kafka, RabbitMQ), always back up the downstream MergeTree tables that persist the data.
- Grant a dedicated backup user SELECT access to all tables that need to be backed up.
- Add pre-backup checks in your automation scripts that verify table engines and skip unsupported ones with a warning.
- Schedule backups during maintenance windows when DDL operations are unlikely.
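The pre-backup check suggested above might look like the following Python sketch. It assumes the table names and engines have already been fetched from system.tables, and the engine set is illustrative:

```python
import logging

# Illustrative engines to skip -- adjust for your ClickHouse version.
SKIP_ENGINES = {"Kafka", "RabbitMQ", "JDBC", "ODBC"}

def plan_backup(tables: list[tuple[str, str]]) -> tuple[list[str], list[str]]:
    """Partition (name, engine) pairs into tables to back up and tables to skip.

    Logs a warning for each skipped table so the omission is visible in
    automation output instead of silently producing an incomplete backup.
    """
    keep, skipped = [], []
    for name, engine in tables:
        if engine in SKIP_ENGINES:
            logging.warning("skipping %s: engine %s does not support backup", name, engine)
            skipped.append(name)
        else:
            keep.append(name)
    return keep, skipped

keep, skipped = plan_backup([("events_stream", "Kafka"), ("events_stored", "MergeTree")])
print(keep, skipped)
```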
Frequently Asked Questions
Q: Which table engines cannot be backed up?
A: Engines that represent external data streams -- Kafka, RabbitMQ, JDBC, ODBC, and similar -- do not support backups. Temporary tables and certain special views (live views, window views) are also excluded.
Q: Can I back up a materialized view?
A: Yes. Materialized views that store data in an underlying table (which is the default behavior) can be backed up. The backup captures both the view definition and its stored data.
Q: Will the backup fail entirely if one table cannot be backed up?
A: By default, yes. If you use BACKUP DATABASE and one table is not eligible, the entire operation fails. To work around this, back up tables individually or exclude the problematic ones.
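The per-table workaround can be scripted so that one ineligible table does not abort the rest. In this sketch, `execute()` is a hypothetical callback standing in for your ClickHouse client; the names and layout are assumptions, not a documented API:

```python
def backup_each(execute, database: str, tables: list[str],
                disk: str, name_prefix: str) -> list[str]:
    """Back up tables one at a time so a single failure doesn't abort the rest.

    `execute` runs one SQL statement and raises on error (e.g. a thin wrapper
    around your ClickHouse client's query method). Returns the tables that
    failed to back up so the caller can report or retry them.
    """
    failed = []
    for table in tables:
        sql = (f"BACKUP TABLE {database}.{table} "
               f"TO Disk('{disk}', '{name_prefix}_{table}')")
        try:
            execute(sql)
        except Exception:
            failed.append(table)
    return failed
```

Note that this produces one backup per table; whether that granularity suits you depends on your restore procedures.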
Q: How do I handle tables that are temporarily read-only?
A: Investigate why the table is read-only -- common causes are disk space issues or a replicated table dropping into read-only mode after losing its ZooKeeper/Keeper connection. Resolve the underlying issue, then retry the backup.