Max_allowed_packet error

Travis F.

09 Jan, 2012 10:13 PM

I have a user attempting to upload a ~37 MB zip file. The global maximum upload size is set to 40 MB (40480 KB, to be precise), so I feel there shouldn't be a problem, and yet when creating the file asset for the zip file I get:

`An error occurred during editing: Hibernate flushing: Could not execute JDBC batch update; SQL [update cxml_blob set data=? where id=?]; Packet for query is too large (37183979 > 16777216). You can change this value on the server by setting the max_allowed_packet' variable.; nested exception is java.sql.BatchUpdateException: Packet for query is too large (37183979 > 16777216). You can change this value on the server by setting the max_allowed_packet' variable.`

Where is this 16 MB limit set, and is it easily changeable? Are there any drawbacks or possible issues with changing it? Is the global maximum only applicable to the zip archive tool?
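
For reference, the 16777216-byte ceiling in the error is MySQL's server-side `max_allowed_packet` value, and it can be inspected directly. A minimal sketch, assuming you can run queries against the Cascade database with an ordinary MySQL client:

```sql
-- Report the server-wide packet limit in bytes.
-- 16777216 bytes = 16 MB, the value quoted in the error above.
SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet';
```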

  1. Posted by Joel on 09 Jan, 2012 10:25 PM

    Hi Travis,

    While the upload limit within Cascade Server may be set to 40 MB, your MySQL server's max_allowed_packet setting may be lower. Please take a look at this article for more information and the solution to this issue; there is also a short sketch of the relevant MySQL change after this thread.

    Thanks!

  2. Posted by Travis F. on 11 Jan, 2012 05:08 PM

    That'll do it.

    Thanks.

  3. Travis F. closed this discussion on 11 Jan, 2012 05:08 PM.
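
For completeness, here is a minimal sketch of the kind of change the error message points to, assuming a stock MySQL server and 64 MB as an example ceiling (anything comfortably above Cascade's 40 MB upload limit would do). The persistent fix is to add `max_allowed_packet=64M` under the `[mysqld]` section of `my.cnf` (`my.ini` on Windows) and restart MySQL; the value can also be raised at runtime:

```sql
-- Raise the server-wide packet limit for the running instance
-- (requires the SUPER privilege). 67108864 bytes = 64 MB.
SET GLOBAL max_allowed_packet = 67108864;

-- Only new connections pick up the change, so Cascade's connection pool
-- needs to reconnect (or Cascade be restarted) before retrying the upload.
SHOW GLOBAL VARIABLES LIKE 'max_allowed_packet';
```

Note that `SET GLOBAL` does not survive a MySQL restart, which is why the `my.cnf` entry is the lasting fix. As for drawbacks, the usual caveat is only that each connection's network buffer is allowed to grow up to this size when a large statement arrives; the memory is not preallocated.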

