Default schema exceeds mysql row limits #550
@laszlof thanks for the report. It seems possible that this began happening in MySQL with 14cb58d. As mentioned in this comment:

In this case, I don't think MySQL is non-standard. But it still applies that it is difficult to make a simplified JDBC implementation work across the board out of the box. This schema is really intended as a starting point, and the entire schema and implementation can be customized. If that doesn't work, take a look at implementing this using Spring Data. See this gist if you're interested in trying that approach: JpaOAuth2AuthorizationService.

Note that we will hopefully be able to provide a concrete example of some of this in our upcoming reference documentation and how-to guides. Please upvote #545 if this is of interest to you.

I'm going to leave this issue open for now, so we can review it relative to the change to increase attributes to 15,000. However, I recommend you pursue customizing the schema and/or implementation and see how far you can get, as it really is our intention to provide a starting point more than an out-of-the-box solution for all databases.
@sjohnr Thanks for the reply. We've already customized the schema on our end. I just wanted to raise awareness that the recent change in 14cb58d effectively breaks out-of-the-box usage for all MySQL databases. I agree it's difficult to provide an implementation that will work out of the box with all databases. But perhaps building the schema in a way that works with at least the two most common, MySQL and Postgres, would be worthwhile. For our own deployment, we're using an NDB cluster, so the issue is even further compounded: we're faced with a 30K-byte row limitation rather than the typical 65,535 bytes. I realize this is a bit of an edge case, so I don't expect it to be supported out of the box.
Is it possible to provide separate scripts for different databases, rather than trying to initialize all of them with a single script?
Describe the bug
The default schema included with this project exceeds the physical row-size limit enforced by MySQL (65,535 bytes) when using a utf8 character set. This issue is further compounded if you are using a clustered storage engine (NDB), which has an even lower row-size limit (30K bytes).
To Reproduce
Attempt to create the oauth2_authorization table using the default schema.
Expected behavior
Database tables are imported successfully
What actually happens
Table creation fails because the row size exceeds MySQL's limit.
Suggested fix
The simplest fix here is to convert those large VARCHAR columns over to TEXT. VARCHAR columns count their entire declared size against the row-size calculation, whereas TEXT/BLOB columns do not, as they are stored by reference.
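A minimal sketch of that suggestion, assuming the `oauth2_authorization` table and `attributes` column names from the default schema (the exact DDL and remaining column list in your version may differ):

```sql
-- Hypothetical illustration: convert the oversized VARCHAR column to TEXT
-- so it is stored off-row by reference and its declared size no longer
-- counts fully against the 65,535-byte row limit.
ALTER TABLE oauth2_authorization
    MODIFY attributes TEXT DEFAULT NULL;
```

The same change would apply to any other VARCHAR column large enough to push the combined row size over the limit.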
This issue is further compounded when you factor in that most databases default to some UTF8 encoding. UTF8 makes each of those VARCHAR columns consume n*4 + 1 bytes, so the single 15,000-character VARCHAR for attributes alone consumes more than 60K bytes.