Max_allowed_packet error
I have a user attempting to upload a ~37 MB ZIP file. The global maximum upload size is set to 40 MB (40480 KB, to be precise), so I feel there shouldn't be a problem, and yet when creating the file asset for the ZIP file I get:
`An error occurred during editing: Hibernate flushing: Could not execute JDBC batch update; SQL [update cxml_blob set data=? where id=?]; Packet for query is too large (37183979 > 16777216). You can change this value on the server by setting the max_allowed_packet' variable.; nested exception is java.sql.BatchUpdateException: Packet for query is too large (37183979 > 16777216). You can change this value on the server by setting the max_allowed_packet' variable.`
Where is this 16 MB limit set, and is it easily changeable? Are there any drawbacks or possible issues with changing it? Is the global maximum only applicable to the ZIP archive tool?
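For context, a quick way to confirm what the database is actually allowing (assuming you can run queries against the MySQL server behind Cascade) would be something like:

```sql
-- Show the server's current packet limit in bytes;
-- the error above implies it will report 16777216 (16 MB).
SHOW VARIABLES LIKE 'max_allowed_packet';
```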
1 Posted by Joel on 09 Jan, 2012 10:25 PM
Hi Travis,
While the setting within Cascade Server may be 40 MB, the `max_allowed_packet` limit on your MySQL server may be lower; the error above shows it is currently 16777216 bytes (16 MB). Please take a look at this article for more information and the solution to this issue.
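For reference, a minimal sketch of the usual adjustment, assuming administrative access to the MySQL server backing Cascade (64 MB here is just an illustrative value, comfortably above the 40 MB Cascade limit):

```sql
-- Raise the limit on the running server; only new connections pick this up.
SET GLOBAL max_allowed_packet = 67108864;  -- 64 MB

-- To make the change survive a restart, also set it under the [mysqld]
-- section of my.cnf (e.g. max_allowed_packet = 64M) and restart MySQL.
```

The main trade-off is that the server and clients will buffer individual statements up to this size, so it is usually kept to the smallest value that comfortably covers your largest uploads.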
Thanks!
2 Posted by Travis F. on 11 Jan, 2012 05:08 PM
That'll do it.
Thanks.
Travis F. closed this discussion on 11 Jan, 2012 05:08 PM.