HiveBrain v1.2.0
Tags: snippet, sql, Minor

How can I optimize this MySQL table that will need to hold 5+ million rows?

Submitted by: @import:stackexchange-dba

Problem

The table has about 12 columns.

Columns and indexes:

id: int(11), unique, auto-increment, primary key

ATTACHMENTID: varchar(14), unique

MLSNUMBER: bigint(20)

POSITION: int(10)

FILETYPE: varchar(32)

I'm doing an extended insert with an ON DUPLICATE KEY UPDATE clause.
Each insert has anywhere from 300-10,000 records.
The more records in the table, the longer the query takes.
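For reference, an extended (multi-row) insert with an ON DUPLICATE KEY UPDATE clause looks like the sketch below. The column names come from the list above; `my_table` and the sample values are placeholders, and the assumption is that a duplicate `ATTACHMENTID` (the unique key) triggers the update path:

```sql
-- One extended INSERT carrying many rows; when a row collides on the
-- unique ATTACHMENTID key, the existing row is updated instead.
INSERT INTO my_table (ATTACHMENTID, MLSNUMBER, POSITION, FILETYPE)
VALUES
  ('ABC0000000001', 20120001234, 1, 'jpg'),
  ('ABC0000000002', 20120001234, 2, 'jpg')
ON DUPLICATE KEY UPDATE
  MLSNUMBER = VALUES(MLSNUMBER),
  POSITION  = VALUES(POSITION),
  FILETYPE  = VALUES(FILETYPE);
```

Packing hundreds of rows into one statement like this amortizes the per-statement overhead, which is why each batch carries 300-10,000 records.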

With 0 rows, the query took about 5-10 seconds.
After about 300,000 rows, the query was taking 50 seconds.

What would be the best way to optimize this table? Thanks in advance!

Solution

The reason the query takes progressively longer as your table grows is that every insert also has to update the indexes. Try disabling the table's indexes before inserting, then re-enable them afterwards:

ALTER TABLE my_table DISABLE KEYS;

-- Your insert statement

ALTER TABLE my_table ENABLE KEYS;


Disabling keys only affects the updating of non-unique indexes, so ON DUPLICATE KEY UPDATE will still work as intended:


ALTER TABLE ... DISABLE KEYS tells MySQL to stop updating nonunique indexes. ALTER TABLE ... ENABLE KEYS then should be used to re-create missing indexes. [src]

Code Snippets

ALTER TABLE my_table DISABLE KEYS;

-- Your insert statement

ALTER TABLE my_table ENABLE KEYS;

Context

StackExchange Database Administrators Q#21583, answer score: 4

Revisions (0)

No revisions yet.