MySQL CONCAT limitation

So you're trying to concat two large strings and it keeps resulting in a truncated or empty string?

The problem may be that CONCAT can only return a string up to 1024 bytes long. Worse, there is no MySQL setting that lets you increase CONCAT's limit. However, there is another way. Meet GROUP_CONCAT, which is made for concatenating a list of rows into one string. We can use this function as a way of doing concatenation without CONCAT's 1024-byte limit.

GROUP_CONCAT also defaults to a 1024-byte limit in MySQL, but where it differs from CONCAT is that there is a setting, group_concat_max_len, that allows us to change this limit. Whatever size you set this variable to, GROUP_CONCAT can return a resulting value up to that length. In practice the result is also capped by another setting, max_allowed_packet, which limits how much data the server will transfer in a single packet and whose maximum value is 1 GB. The theoretical maximum of group_concat_max_len itself depends on whether you are running a 64-bit or 32-bit build; on a 32-bit system it is 4 GB.
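Before changing anything, it helps to check what your server's current limits actually are. A quick sketch (these are the real MySQL system variables discussed above; the 1000000 is just an example value in bytes):

SHOW VARIABLES LIKE 'group_concat_max_len';
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise the GROUP_CONCAT limit for this session only
SET SESSION group_concat_max_len = 1000000;

Both values are reported in bytes, so compare them against the size of the strings you expect to produce.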

So say you are using CONCAT in the following way:

UPDATE some_table
SET some_field=CONCAT(some_field,'super long string to append to the end of the data in some_field')
WHERE some_criteria_field = 'match on this string';

And in this example some_field would change from this:

ID  some_criteria_field    some_field
1   match on this string   12345;

to this:

ID  some_criteria_field    some_field
1   match on this string   12345;super long string to append to the end of the data in some_field

But now let's say the data you are trying to concatenate onto the end of some_field's original contents is getting truncated, or some_field is just being set to null/empty, because the result is too long. That wasn't the case in the example above, of course, because the concatenated string is so short. Let's say the string you are trying to add to the end of '12345;' is 200 KB long. In this case we need to switch from CONCAT to GROUP_CONCAT and raise the GROUP_CONCAT result limit from 1024 bytes to the maximum allowed size, to get around the limitation.
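To see the idea in isolation before applying it to an UPDATE, here is a minimal GROUP_CONCAT sketch that glues two rows into a single string (the literal values are just placeholders standing in for your long strings):

SELECT GROUP_CONCAT(part SEPARATOR '') AS joined
FROM (
    SELECT '12345;' AS part
    UNION ALL
    SELECT 'imagine a 200 KB string here'
) AS pieces;

With an empty SEPARATOR the rows are joined end to end, which is exactly the behavior we want as a CONCAT replacement.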

The following is how the query will have to look to alleviate the problems with CONCAT above (we will use the data from above so you can see where your longer strings would go):

SET @@session.group_concat_max_len = @@global.max_allowed_packet;
UPDATE some_table SET some_table.some_field=(
    SELECT GROUP_CONCAT(queue.append_to_end SEPARATOR '') AS new_some_field
    FROM (
        SELECT some_field AS append_to_end FROM some_table
        WHERE some_criteria_field = 'match on this string'
        UNION ALL
        SELECT 'super long string to append to the end of the data in some_field'
    ) AS queue
) WHERE some_criteria_field = 'match on this string';

The key is the following line from above:

SET @@session.group_concat_max_len = @@global.max_allowed_packet;

This raises the GROUP_CONCAT limit for the duration of the session, which means that when you close the connection the limit returns to the default for other queries.
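You can confirm that only the session copy changed by comparing it against the global value:

SET @@session.group_concat_max_len = @@global.max_allowed_packet;
SELECT @@session.group_concat_max_len, @@global.group_concat_max_len;

The SELECT should show your raised session value next to the untouched global default, and a brand new connection will start from the global value again.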

Sometimes this temporary increase is not enough. If you are doing concatenations that result in strings greater than the system setting @@global.max_allowed_packet, which often defaults to a value as low as 1 MB, then you'll need special database privileges on the system (the SUPER privilege) to run a query like the following:

SET GLOBAL max_allowed_packet = 1073741824;

1073741824 bytes is just 1 GB, which happens to be the maximum value of max_allowed_packet; it could be whatever size you need, in bytes.
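After running the SET GLOBAL statement you can verify the new value like this (note that for max_allowed_packet the new global value only applies to connections opened after the change, so reconnect before relying on it):

SET GLOBAL max_allowed_packet = 1073741824;
SELECT @@global.max_allowed_packet;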

You can also make the change in your config file on the server. Open your my.ini (Windows) or my.cnf (Unix-like systems) file, find the following line, and change it to what we have below:

max_allowed_packet = 1024M

I'm afraid that without special permissions on your MySQL server you will not be able to concatenate strings with a resulting length longer than max_allowed_packet, which again usually defaults to 1 MB. In that case you will need to upgrade from shared hosting to hosting where you have control over settings like this.

Finally, keep in mind that memory limits in the backend programming language running these queries can also be a concern. For instance, if you are working with PHP, go to your php.ini file and find the following line (memory_limit governs how much memory a single script may allocate; 128M is a common default):

memory_limit = 128M

Then set it to something appropriate for the amount of data you are using.
