
15.2.9 LOAD DATA Statement

LOAD DATA
    [LOW_PRIORITY | CONCURRENT] [LOCAL]
    INFILE 'file_name'
    [REPLACE | IGNORE]
    INTO TABLE tbl_name
    [PARTITION (partition_name [, partition_name] ...)]
    [CHARACTER SET charset_name]
    [{FIELDS | COLUMNS}
        [TERMINATED BY 'string']
        [[OPTIONALLY] ENCLOSED BY 'char']
        [ESCAPED BY 'char']
    ]
    [LINES
        [STARTING BY 'string']
        [TERMINATED BY 'string']
    ]
    [IGNORE number {LINES | ROWS}]
    [(col_name_or_user_var
        [, col_name_or_user_var] ...)]
    [SET col_name={expr | DEFAULT}
        [, col_name={expr | DEFAULT}] ...]

The LOAD DATA statement reads rows from a text file into a table at a very high speed. The file can be read from the server host or the client host, depending on whether the LOCAL modifier is given. LOCAL also affects data interpretation and error handling.

LOAD DATA is the complement of SELECT ... INTO OUTFILE. (See Section 15.2.13.1, “SELECT ... INTO Statement”.) To write data from a table to a file, use SELECT ... INTO OUTFILE. To read the file back into a table, use LOAD DATA. The syntax of the FIELDS and LINES clauses is the same for both statements.

The mysqlimport utility provides another way to load data files; it operates by sending a LOAD DATA statement to the server. See Section 6.5.5, “mysqlimport — A Data Import Program”.

For information about the efficiency of INSERT versus LOAD DATA and speeding up LOAD DATA, see Section 10.2.5.1, “Optimizing INSERT Statements”.

Input File Character Set

The file name must be given as a literal string. On Windows, specify backslashes in path names as forward slashes or doubled backslashes. The server interprets the file name using the character set indicated by the character_set_filesystem system variable.
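
For example, on Windows, either of the following forms can be used to name a file in C:\tmp (the file and table names here are only illustrative):

LOAD DATA INFILE 'C:/tmp/data.txt' INTO TABLE t1;
LOAD DATA INFILE 'C:\\tmp\\data.txt' INTO TABLE t1;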

By default, the server interprets the file contents using the character set indicated by the character_set_database system variable. If the file contents use a character set different from this default, it is a good idea to specify that character set by using the CHARACTER SET clause. A character set of binary specifies “no conversion.”
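
For example, if the file was written using utf8mb4 but the database default character set differs, a clause such as the following tells the server how to interpret the file contents (the file and table names here are only illustrative):

LOAD DATA INFILE 'data.txt' INTO TABLE t1
  CHARACTER SET utf8mb4;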

SET NAMES and the setting of character_set_client do not affect interpretation of file contents.

LOAD DATA interprets all fields in the file as having the same character set, regardless of the data types of the columns into which field values are loaded. For proper interpretation of the file, you must ensure that it was written with the correct character set. For example, if you write a data file with mysqldump -T or by issuing a SELECT ... INTO OUTFILE statement in mysql, be sure to use a --default-character-set option to write output in the character set to be used when the file is loaded with LOAD DATA.

Note

It is not possible to load data files that use the ucs2, utf16, utf16le, or utf32 character set.

Input File Location

These rules determine the LOAD DATA input file location:

If LOCAL is not specified, the file must be located on the server host and is read directly by the server. If the file name is an absolute path name, the server uses it as given. If it is a relative path name with one or more leading components, the server looks for the file relative to its data directory. If a file name with no leading components is given, the server looks for the file in the database directory of the default database.

If LOCAL is specified, the file is read by the client program on the client host and its contents are sent to the server. The file can be given as a full path name to specify its exact location; if given as a relative path name, it is interpreted relative to the directory in which the client program was started.

The non-LOCAL rules mean that the server reads a file named as ./myfile.txt relative to its data directory, whereas it reads a file named as myfile.txt from the database directory of the default database. For example, if the following LOAD DATA statement is executed while db1 is the default database, the server reads the file data.txt from the database directory for db1, even though the statement explicitly loads the file into a table in the db2 database:

LOAD DATA INFILE 'data.txt' INTO TABLE db2.my_table;

Note

The server also uses the non-LOCAL rules to locate .sdi files for the IMPORT TABLE statement.

Security Requirements

For a non-LOCAL load operation, the server reads a text file located on the server host, so these security requirements must be satisfied: you must have the FILE privilege, and the operation is subject to the secure_file_priv system variable setting. If that variable is set to a nonempty directory name, the file to be loaded must be located in that directory; if it is empty, the file need only be readable by the server.

For a LOCAL load operation, the client program reads a text file located on the client host. Because the file contents are sent over the connection by the client to the server, using LOCAL is a bit slower than when the server accesses the file directly. On the other hand, you do not need the FILE privilege, and the file can be located in any directory the client program can access.
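
A minimal sketch of a LOCAL load looks like this (the file and table names are only illustrative); note that local loading must also be permitted on both sides of the connection (the local_infile system variable on the server and, for the mysql client, the --local-infile option):

LOAD DATA LOCAL INFILE '/home/user/data.txt' INTO TABLE t1
  FIELDS TERMINATED BY ',';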

Duplicate-Key and Error Handling

The REPLACE and IGNORE modifiers control handling of new (input) rows that duplicate existing table rows on unique key values (PRIMARY KEY or UNIQUE index values): with REPLACE, input rows replace existing rows that have the same value for a primary key or unique index; with IGNORE, input rows that duplicate an existing row on a unique key value are discarded.

The LOCAL modifier has the same effect as IGNORE. This occurs because the server has no way to stop transmission of the file in the middle of the operation.

If none of REPLACE, IGNORE, or LOCAL is specified, an error occurs when a duplicate key value is found, and the rest of the text file is ignored.
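
For example, either of the following statements (the file and table names are illustrative) makes the duplicate-key behavior explicit:

LOAD DATA INFILE 'data.txt' REPLACE INTO TABLE t1;
LOAD DATA INFILE 'data.txt' IGNORE INTO TABLE t1;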

In addition to affecting duplicate-key handling as just described, IGNORE and LOCAL also affect error handling:

Field and Line Handling

For both the LOAD DATA and SELECT ... INTO OUTFILE statements, the syntax of the FIELDS and LINES clauses is the same. Both clauses are optional, but FIELDS must precede LINES if both are specified.

If you specify a FIELDS clause, each of its subclauses (TERMINATED BY, [OPTIONALLY] ENCLOSED BY, and ESCAPED BY) is also optional, except that you must specify at least one of them. Arguments to these clauses are permitted to contain only ASCII characters.

If you specify no FIELDS or LINES clause, the defaults are the same as if you had written this:

FIELDS TERMINATED BY '\t' ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n' STARTING BY ''

Backslash is the MySQL escape character within strings in SQL statements. Thus, to specify a literal backslash, you must specify two backslashes for the value to be interpreted as a single backslash. The escape sequences '\t' and '\n' specify tab and newline characters, respectively.

In other words, the defaults cause LOAD DATA to act as follows when reading input: look for line boundaries at newlines, do not skip over any line prefix, break lines into fields at tabs, do not expect fields to be enclosed within any quoting characters, and interpret characters preceded by the escape character \ as escape sequences.

Conversely, the defaults cause SELECT ... INTO OUTFILE to act as follows when writing output: write tabs between fields, write newlines at the ends of lines, do not enclose fields within quoting characters, and use \ to escape instances of tab, newline, or \ that occur within field values.

Note

For a text file generated on a Windows system, proper file reading might require LINES TERMINATED BY '\r\n' because Windows programs typically use two characters as a line terminator. Some programs, such as WordPad, might use \r as a line terminator when writing files. To read such files, use LINES TERMINATED BY '\r'.

If all the input lines have a common prefix that you want to ignore, you can use LINES STARTING BY 'prefix_string' to skip the prefix and anything before it. If a line does not include the prefix, the entire line is skipped. Suppose that you issue the following statement:

LOAD DATA INFILE '/tmp/test.txt' INTO TABLE test
  FIELDS TERMINATED BY ','  LINES STARTING BY 'xxx';

If the data file looks like this:

xxx"abc",1
something xxx"def",2
"ghi",3

The resulting rows are ("abc",1) and ("def",2). The third row in the file is skipped because it does not contain the prefix.

The IGNORE number LINES clause can be used to ignore lines at the start of the file. For example, you can use IGNORE 1 LINES to skip an initial header line containing column names:

LOAD DATA INFILE '/tmp/test.txt' INTO TABLE test IGNORE 1 LINES;

When you use SELECT ... INTO OUTFILE in tandem with LOAD DATA to write data from a database into a file and then read the file back into the database later, the field- and line-handling options for both statements must match. Otherwise, LOAD DATA does not interpret the contents of the file properly. Suppose that you use SELECT ... INTO OUTFILE to write a file with fields delimited by commas:

SELECT * INTO OUTFILE 'data.txt'
  FIELDS TERMINATED BY ','
  FROM table2;

To read the comma-delimited file, the correct statement is:

LOAD DATA INFILE 'data.txt' INTO TABLE table2
  FIELDS TERMINATED BY ',';

If instead you tried to read the file with the statement shown following, it would not work because it instructs LOAD DATA to look for tabs between fields:

LOAD DATA INFILE 'data.txt' INTO TABLE table2
  FIELDS TERMINATED BY '\t';

The likely result is that each input line would be interpreted as a single field.

LOAD DATA can be used to read files obtained from external sources. For example, many programs can export data in comma-separated values (CSV) format, such that lines have fields separated by commas and enclosed within double quotation marks, with an initial line of column names. If the lines in such a file are terminated by carriage return/newline pairs, the statement shown here illustrates the field- and line-handling options you would use to load the file:

LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
  FIELDS TERMINATED BY ',' ENCLOSED BY '"'
  LINES TERMINATED BY '\r\n'
  IGNORE 1 LINES;

If the input values are not necessarily enclosed within quotation marks, use OPTIONALLY before the ENCLOSED BY option.
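
For example, the preceding statement with OPTIONALLY added looks like this:

LOAD DATA INFILE 'data.txt' INTO TABLE tbl_name
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\r\n'
  IGNORE 1 LINES;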

Any of the field- or line-handling options can specify an empty string (''). If not empty, the FIELDS [OPTIONALLY] ENCLOSED BY and FIELDS ESCAPED BY values must be a single character. The FIELDS TERMINATED BY, LINES STARTING BY, and LINES TERMINATED BY values can be more than one character. For example, to write lines that are terminated by carriage return/linefeed pairs, or to read a file containing such lines, specify a LINES TERMINATED BY '\r\n' clause.

To read a file containing jokes that are separated by lines consisting of %%, you can do this:

CREATE TABLE jokes
  (a INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
  joke TEXT NOT NULL);
LOAD DATA INFILE '/tmp/jokes.txt' INTO TABLE jokes
  FIELDS TERMINATED BY ''
  LINES TERMINATED BY '\n%%\n' (joke);

FIELDS [OPTIONALLY] ENCLOSED BY controls quoting of fields. For output (SELECT ... INTO OUTFILE), if you omit the word OPTIONALLY, all fields are enclosed by the ENCLOSED BY character. An example of such output (using a comma as the field delimiter) is shown here:

"1","a string","100.20"
"2","a string containing a , comma","102.20"
"3","a string containing a \" quote","102.20"
"4","a string containing a \", quote and comma","102.20"

If you specify OPTIONALLY, the ENCLOSED BY character is used only to enclose values from columns that have a string data type (such as CHAR, BINARY, TEXT, or ENUM):

1,"a string",100.20
2,"a string containing a , comma",102.20
3,"a string containing a \" quote",102.20
4,"a string containing a \", quote and comma",102.20

Occurrences of the ENCLOSED BY character within a field value are escaped by prefixing them with the ESCAPED BY character. Also, if you specify an empty ESCAPED BY value, it is possible to inadvertently generate output that cannot be read properly by LOAD DATA. For example, the preceding output just shown would appear as follows if the escape character is empty. Observe that the second field in the fourth line contains a comma following the quote, which (erroneously) appears to terminate the field:

1,"a string",100.20
2,"a string containing a , comma",102.20
3,"a string containing a " quote",102.20
4,"a string containing a ", quote and comma",102.20

For input, the ENCLOSED BY character, if present, is stripped from the ends of field values. (This is true regardless of whether OPTIONALLY is specified; OPTIONALLY has no effect on input interpretation.) Occurrences of the ENCLOSED BY character preceded by the ESCAPED BY character are interpreted as part of the current field value.

If the field begins with the ENCLOSED BY character, instances of that character are recognized as terminating a field value only if followed by the field or line TERMINATED BY sequence. To avoid ambiguity, occurrences of the ENCLOSED BY character within a field value can be doubled and are interpreted as a single instance of the character. For example, if ENCLOSED BY '"' is specified, quotation marks are handled as shown here:

"The ""BIG"" boss"  -> The "BIG" boss
The "BIG" boss      -> The "BIG" boss
The ""BIG"" boss    -> The ""BIG"" boss

FIELDS ESCAPED BY controls how to read or write special characters:

In certain cases, field- and line-handling options interact:

Handling of NULL values varies according to the FIELDS and LINES options in use:

An attempt to load NULL into a NOT NULL column produces either a warning or an error according to the rules described in Column Value Assignment.

Some cases are not supported by LOAD DATA. In particular, if one separator is the same as or a prefix of another, LOAD DATA cannot interpret the input properly. For example, the following FIELDS clause would cause problems:

FIELDS TERMINATED BY '"' ENCLOSED BY '"'

Column List Specification

The following example loads all columns of the persondata table:

LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata;

By default, when no column list is provided at the end of the LOAD DATA statement, input lines are expected to contain a field for each table column. If you want to load only some of a table's columns, specify a column list:

LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata
(col_name_or_user_var [, col_name_or_user_var] ...);

You must also specify a column list if the order of the fields in the input file differs from the order of the columns in the table. Otherwise, MySQL cannot tell how to match input fields with table columns.
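
For example, if the fields in persondata.txt appear in a different order than the table's columns, a statement along these lines (the column names here are hypothetical) maps each input field to the proper column:

LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata
  (last_name, first_name, birth_date);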

Input Preprocessing

Each instance of col_name_or_user_var in LOAD DATA syntax is either a column name or a user variable. With user variables, the SET clause enables you to perform preprocessing transformations on their values before assigning the result to columns.

User variables in the SET clause can be used in several ways. The following example uses the first input column directly for the value of t1.column1, and assigns the second input column to a user variable that is subjected to a division operation before being used for the value of t1.column2:

LOAD DATA INFILE 'file.txt'
  INTO TABLE t1
  (column1, @var1)
  SET column2 = @var1/100;

The SET clause can be used to supply values not derived from the input file. The following statement sets column3 to the current date and time:

LOAD DATA INFILE 'file.txt'
  INTO TABLE t1
  (column1, column2)
  SET column3 = CURRENT_TIMESTAMP;

You can also discard an input value by assigning it to a user variable and not assigning the variable to any table column:

LOAD DATA INFILE 'file.txt'
  INTO TABLE t1
  (column1, @dummy, column2, @dummy, column3);

Use of the column/variable list and SET clause is subject to the following restrictions:

Column Value Assignment

To process an input line, LOAD DATA splits it into fields and uses the values according to the column/variable list and the SET clause, if they are present. Then the resulting row is inserted into the table. If there are BEFORE INSERT or AFTER INSERT triggers for the table, they are activated before or after inserting the row, respectively.

Interpretation of field values and assignment to table columns depends on these factors: whether strict SQL mode is enabled, and whether the IGNORE or LOCAL modifier is specified.

Those factors combine to produce restrictive or nonrestrictive data interpretation by LOAD DATA: interpretation is restrictive when strict SQL mode is enabled and neither IGNORE nor LOCAL is specified (errors terminate the load operation), and nonrestrictive otherwise (errors are demoted to warnings and the operation continues).

Restrictive data interpretation uses these rules:

By contrast, nonrestrictive data interpretation uses these rules:

TIMESTAMP columns are set to the current date and time only if there is a NULL value for the column (that is, \N) and the column is not declared to permit NULL values, or if the TIMESTAMP column default value is the current timestamp and it is omitted from the field list when a field list is specified.

LOAD DATA regards all input as strings, so you cannot use numeric values for ENUM or SET columns the way you can with INSERT statements. All ENUM and SET values must be specified as strings.
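
If a file carries a numeric code instead of the string value for an ENUM column, one workaround is to map the code to the string in a SET clause. This is only a sketch; the table, column names, and mapping shown here are illustrative:

LOAD DATA INFILE 'file.txt' INTO TABLE t1
  (column1, @code)
  SET column2 = ELT(@code, 'small', 'medium', 'large');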

BIT values cannot be loaded directly using binary notation (for example, b'011010'). To work around this, use the SET clause to strip off the leading b' and trailing ' and perform a base-2 to base-10 conversion so that MySQL loads the values into the BIT column properly:

$> cat /tmp/bit_test.txt
b'10'
b'1111111'
$> mysql test
mysql> LOAD DATA INFILE '/tmp/bit_test.txt'
       INTO TABLE bit_test (@var1)
       SET b = CAST(CONV(MID(@var1, 3, LENGTH(@var1)-3), 2, 10) AS UNSIGNED);
Query OK, 2 rows affected (0.00 sec)
Records: 2  Deleted: 0  Skipped: 0  Warnings: 0

mysql> SELECT BIN(b+0) FROM bit_test;
+----------+
| BIN(b+0) |
+----------+
| 10       |
| 1111111  |
+----------+
2 rows in set (0.00 sec)

For BIT values in 0b binary notation (for example, 0b011010), use this SET clause instead to strip off the leading 0b:

SET b = CAST(CONV(MID(@var1, 3, LENGTH(@var1)-2), 2, 10) AS UNSIGNED)

Partitioned Table Support

LOAD DATA supports explicit partition selection using the PARTITION clause with a list of one or more comma-separated names of partitions, subpartitions, or both. When this clause is used, if any rows from the file cannot be inserted into any of the partitions or subpartitions named in the list, the statement fails with the error Found a row not matching the given partition set. For more information and examples, see Section 26.5, “Partition Selection”.
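
For example, a statement such as the following (the file, table, and partition names are illustrative) inserts rows only into the named partitions:

LOAD DATA INFILE 'data.txt' INTO TABLE employees
  PARTITION (p1, p2);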

Concurrency Considerations

With the LOW_PRIORITY modifier, execution of the LOAD DATA statement is delayed until no other clients are reading from the table. This affects only storage engines that use only table-level locking (such as MyISAM, MEMORY, and MERGE).

With the CONCURRENT modifier and a MyISAM table that satisfies the condition for concurrent inserts (that is, it contains no free blocks in the middle), other threads can retrieve data from the table while LOAD DATA is executing. This modifier affects the performance of LOAD DATA a bit, even if no other thread is using the table at the same time.
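
For example (the file and table names are illustrative), these modifiers are written immediately after LOAD DATA:

LOAD DATA LOW_PRIORITY INFILE 'data.txt' INTO TABLE t1;
LOAD DATA CONCURRENT INFILE 'data.txt' INTO TABLE t1;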

Replication Considerations

LOAD DATA is considered unsafe for statement-based replication. If you use LOAD DATA with binlog_format=STATEMENT, each replica on which the changes are to be applied creates a temporary file containing the data. This temporary file is not encrypted, even if binary log encryption is active on the source. If encryption is required, use row-based or mixed binary logging format instead, for which replicas do not create the temporary file. For more information on the interaction between LOAD DATA and replication, see Section 19.5.1.19, “Replication and LOAD DATA”.

Miscellaneous Topics

On Unix, if you need LOAD DATA to read from a pipe, you can use the following technique (the example loads a listing of the / directory into the table db1.t1):

mkfifo /mysql/data/db1/ls.dat
chmod 666 /mysql/data/db1/ls.dat
find / -ls > /mysql/data/db1/ls.dat &
mysql -e "LOAD DATA INFILE 'ls.dat' INTO TABLE t1" db1

Here you must run the command that generates the data to be loaded and the mysql commands either on separate terminals, or run the data generation process in the background (as shown in the preceding example). If you do not do this, the pipe blocks until data is read by the mysql process.