pingcap / dumpling

Dumpling is a fast, easy-to-use tool written in Go for dumping data from databases (MySQL, TiDB, ...) to local or cloud storage (S3, GCP, ...) in various formats (SQL, CSV, ...).
Apache License 2.0

Where's the --sql description in the help information? #330

Closed. Fanduzi closed this issue 3 years ago.

Fanduzi commented 3 years ago
/usr/local/tidb-toolkit-v5.0.1-linux-amd64/bin/dumpling --version
Release version: v5.0.1
Git commit hash: 4cb115746bb658b6d1a12c0e49932bfd3a08afac
Git branch:      heads/refs/tags/v5.0.1
Build timestamp: 2021-04-23 06:01:59Z
Go version:      go version go1.13 linux/amd64


 /usr/local/tidb-toolkit-v5.0.1-linux-amd64/bin/dumpling --help
Dumpling is a CLI tool that helps you dump MySQL/TiDB data

Usage:
  dumpling [flags]

Flags:
      --allow-cleartext-passwords         Allow passwords to be sent in cleartext (warning: don't use without TLS)
      --ca string                         The path name to the certificate authority file for TLS connection
      --case-sensitive                    whether the filter should be case-sensitive
      --cert string                       The path name to the client certificate file for TLS connection
      --complete-insert                   Use complete INSERT statements that include column names
  -c, --compress string                   Compress output file type, support 'gzip', 'no-compression' now
      --consistency string                Consistency level during dumping: {auto|none|flush|lock|snapshot} (default "auto")
      --csv-delimiter string              The delimiter for values in csv files, default '"' (default "\"")
      --csv-null-value string             The null value used when export to csv (default "\\N")
      --csv-separator string              The separator for csv files, default ',' (default ",")
  -B, --database strings                  Databases to dump
      --dump-empty-database               whether to dump empty database (default true)
      --escape-backslash                  use backslash to escape special characters (default true)
  -F, --filesize string                   The approximate size of output file
      --filetype string                   The type of export file (sql/csv)
  -f, --filter strings                    filter to select which tables to dump (default [*.*,!/^(mysql|sys|INFORMATION_SCHEMA|PERFORMANCE_SCHEMA|METRICS_SCHEMA|INSPECTION_SCHEMA)$/.*])
      --gcs.credentials-file string       (experimental) Set the GCS credentials file path
      --gcs.endpoint string               (experimental) Set the GCS endpoint URL
      --gcs.predefined-acl string         (experimental) Specify the GCS predefined acl for objects
      --gcs.storage-class string          (experimental) Specify the GCS storage class for objects
      --help                              Print help message and quit
  -h, --host string                       The host to connect to (default "127.0.0.1")
      --key string                        The path name to the client private key file for TLS connection
  -L, --logfile path                      Log file path, leave empty to write to console
      --logfmt format                     Log format: {text|json} (default "text")
      --loglevel string                   Log level: {debug|info|warn|error|dpanic|panic|fatal} (default "info")
  -d, --no-data                           Do not dump table data
      --no-header                         whether not to dump CSV table header
  -m, --no-schemas                        Do not dump table schemas with the data
  -W, --no-views                          Do not dump views (default true)
  -o, --output string                     Output directory (default "./export-2021-08-04T10:05:44+08:00")
      --output-filename-template string   The output filename template (without file extension)
      --params stringToString             Extra session variables used while dumping, accepted format: --params "character_set_client=latin1,character_set_connection=latin1" (default [])
  -p, --password string                   User password
  -P, --port int                          TCP/IP port to connect to (default 4000)
  -r, --rows uint                         Split table into chunks of this many rows, default unlimited
      --s3.acl string                     (experimental) Set the S3 canned ACLs, e.g. authenticated-read
      --s3.endpoint string                (experimental) Set the S3 endpoint URL, please specify the http or https scheme explicitly
      --s3.provider string                (experimental) Set the S3 provider, e.g. aws, alibaba, ceph
      --s3.region string                  (experimental) Set the S3 region, e.g. us-east-1
      --s3.sse string                     Set S3 server-side encryption, e.g. aws:kms
      --s3.sse-kms-key-id string          KMS CMK key id to use with S3 server-side encryption.Leave empty to use S3 owned key.
      --s3.storage-class string           (experimental) Set the S3 storage class, e.g. STANDARD
      --snapshot string                   Snapshot position (uint64 from pd timestamp for TiDB). Valid only when consistency=snapshot
  -s, --statement-size uint               Attempted size of INSERT statement in bytes (default 1000000)
      --status-addr string                dumpling API server and pprof addr (default ":8281")
  -T, --tables-list strings               Comma delimited table list to dump; must be qualified table names
  -t, --threads int                       Number of goroutines to use, default 4 (default 4)
      --tidb-mem-quota-query uint         The maximum memory limit for a single SQL statement, in bytes.
  -u, --user string                       Username with privileges to run the dump (default "root")
  -V, --version                           Print Dumpling version
      --where string                      Dump only selected records

Is this parameter missing from the help output, or is this another careless mistake?

lichunzhu commented 3 years ago

This argument is marked as hidden in https://github.com/pingcap/dumpling/pull/245, but you can still use it in dumpling.
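
For example, a dump using the hidden flag might look like this (connection parameters, query, and output path are placeholder values; all other flags appear in the help output above):

dumpling -u root -h 127.0.0.1 -P 4000 \
  --filetype csv \
  --sql 'SELECT * FROM test.t WHERE id < 100' \
  -o /tmp/sql-export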

Fanduzi commented 3 years ago

> This argument is marked as hidden in #245, but you can still use it in dumpling.

Hi, thanks for your reply.

> Mark --sql as hidden since it's incompatible with lightning

Can you explain this? Does it mean that CSV files exported using --sql cannot be imported with Lightning?

lichunzhu commented 3 years ago

Because the exported data doesn't include a table-structure SQL file, it can't be used by Lightning directly.
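
For context, a sketch of the directory layout Lightning consumes (file names are illustrative, following the db.table naming convention):

ls /tmp/export
# test-schema-create.sql    <- CREATE DATABASE statement
# test.t-schema.sql         <- CREATE TABLE statement; a --sql dump has no usable version of this
# test.t.000000000.csv      <- the exported rows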

Fanduzi commented 3 years ago

> Because the exported data doesn't include a table-structure SQL file, it can't be used by Lightning directly.

Thank you, I understand now. Strictly speaking, when --filetype=csv is used, the files dumpling exports do contain db.table-schema.sql files. However, these table-structure files are problematic:

# cat db.table-schema.sql
/*!40101 SET NAMES binary*/;
;

Perhaps updating the documentation should be a required step in the product's iteration process. Personally, I'd also suggest adding a note about this to the official documentation.
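
One possible workaround, sketched here with placeholder table and column names: write the missing CREATE TABLE statement into the schema file by hand, so Lightning can pick up the CSV.

# Untested sketch; `t` and its columns stand in for the real table definition.
cat > /tmp/export/test.t-schema.sql <<'EOF'
/*!40101 SET NAMES binary*/;
CREATE TABLE `t` (
  `id` BIGINT NOT NULL PRIMARY KEY,
  `v` VARCHAR(255)
);
EOF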

lichunzhu commented 3 years ago

> Thank you, I understand now. Strictly speaking, when --filetype=csv is used, the files dumpling exports do contain db.table-schema.sql files. [...]

Thanks for your suggestion! We will update this document ASAP.