Description
Describe the bug
The default schema included with this project exceeds the physical row size limit enforced by MySQL (65,535 bytes) when a utf8 character set is used. The issue is further compounded if you are using a clustered storage engine (NDB), which has an even lower row size limit (30,000 bytes).
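To illustrate the limit itself, independent of this schema (a minimal sketch, assuming utf8mb4 at 4 bytes per character):

```sql
-- Two VARCHAR(16000) columns count as roughly 2 x 16,000 x 4 = 128,000 bytes,
-- which exceeds the 65,535-byte server-level row size limit, so the table
-- is rejected with ERROR 1118 (Row size too large).
CREATE TABLE row_size_demo (
    a varchar(16000),
    b varchar(16000)
) ENGINE=InnoDB CHARACTER SET utf8mb4;
```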
To Reproduce
- Build the mysql-cluster Docker image: https://hub.docker.com/r/mysql/mysql-cluster/
- Create an empty database
- Attempt to import the oauth2_authorization schema (see the session sketch below)
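For the last two steps, a minimal session from the mysql client might look like the following (the database name and schema file name are assumptions; substitute whatever file carries the oauth2_authorization DDL):

```sql
-- Create a fresh database using a 4-byte utf8 character set, then load the schema.
CREATE DATABASE oauth2_test CHARACTER SET utf8mb4;
USE oauth2_test;
-- Hypothetical file name; the CREATE TABLE inside fails with ERROR 1118 as shown below.
SOURCE oauth2-authorization-schema.sql;
```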
Expected behavior
Database tables are imported successfully
What actually happens
```
mysql> CREATE TABLE oauth2_authorization (
-> id varchar(100) NOT NULL,
-> registered_client_id varchar(100) NOT NULL,
-> principal_name varchar(200) NOT NULL,
-> authorization_grant_type varchar(100) NOT NULL,
-> attributes varchar(15000) DEFAULT NULL,
-> state varchar(500) DEFAULT NULL,
-> authorization_code_value blob DEFAULT NULL,
-> authorization_code_issued_at timestamp DEFAULT NULL,
-> authorization_code_expires_at timestamp DEFAULT NULL,
-> authorization_code_metadata varchar(2000) DEFAULT NULL,
-> access_token_value blob DEFAULT NULL,
-> access_token_issued_at timestamp DEFAULT NULL,
-> access_token_expires_at timestamp DEFAULT NULL,
-> access_token_metadata varchar(2000) DEFAULT NULL,
-> access_token_type varchar(100) DEFAULT NULL,
-> access_token_scopes varchar(1000) DEFAULT NULL,
-> oidc_id_token_value blob DEFAULT NULL,
-> oidc_id_token_issued_at timestamp DEFAULT NULL,
-> oidc_id_token_expires_at timestamp DEFAULT NULL,
-> oidc_id_token_metadata varchar(2000) DEFAULT NULL,
-> refresh_token_value blob DEFAULT NULL,
-> refresh_token_issued_at timestamp DEFAULT NULL,
-> refresh_token_expires_at timestamp DEFAULT NULL,
-> refresh_token_metadata varchar(2000) DEFAULT NULL,
-> PRIMARY KEY (id)
-> ) Engine=InnoDB;
ERROR 1118 (42000): Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs
```
Suggested fix
The simplest fix here is to convert those large VARCHAR columns to TEXT. A VARCHAR column counts its full declared size toward the row size calculation, whereas TEXT/BLOB columns do not, as they are stored by reference.
The issue is compounded by the fact that most databases now default to some flavor of utf8 encoding. Under utf8mb4, each of those VARCHAR(n) columns is counted as 4 × n bytes plus a byte or two of length overhead, so the single VARCHAR(15000) attributes column alone is consuming more than 60,000 bytes (15,000 × 4 = 60,000).
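A minimal sketch of the adjusted DDL, assuming the column list from the failing statement above stays the same and only the oversized VARCHARs are converted:

```sql
CREATE TABLE oauth2_authorization (
    id varchar(100) NOT NULL,
    registered_client_id varchar(100) NOT NULL,
    principal_name varchar(200) NOT NULL,
    authorization_grant_type varchar(100) NOT NULL,
    -- formerly varchar(15000): stored by reference as TEXT, so it no longer
    -- counts ~60K bytes against the 65,535-byte row limit
    attributes text DEFAULT NULL,
    state varchar(500) DEFAULT NULL,
    authorization_code_value blob DEFAULT NULL,
    authorization_code_issued_at timestamp DEFAULT NULL,
    authorization_code_expires_at timestamp DEFAULT NULL,
    authorization_code_metadata text DEFAULT NULL,  -- formerly varchar(2000)
    access_token_value blob DEFAULT NULL,
    access_token_issued_at timestamp DEFAULT NULL,
    access_token_expires_at timestamp DEFAULT NULL,
    access_token_metadata text DEFAULT NULL,        -- formerly varchar(2000)
    access_token_type varchar(100) DEFAULT NULL,
    access_token_scopes text DEFAULT NULL,          -- formerly varchar(1000)
    oidc_id_token_value blob DEFAULT NULL,
    oidc_id_token_issued_at timestamp DEFAULT NULL,
    oidc_id_token_expires_at timestamp DEFAULT NULL,
    oidc_id_token_metadata text DEFAULT NULL,       -- formerly varchar(2000)
    refresh_token_value blob DEFAULT NULL,
    refresh_token_issued_at timestamp DEFAULT NULL,
    refresh_token_expires_at timestamp DEFAULT NULL,
    refresh_token_metadata text DEFAULT NULL,       -- formerly varchar(2000)
    PRIMARY KEY (id)
) ENGINE=InnoDB;
```

With the 4-byte multiplier now applying only to the remaining short VARCHARs, the fixed-size portion of the row drops to a few kilobytes, which should also fit within NDB's 30K row limit, since NDB keeps only a small inline portion of each TEXT/BLOB in the row itself.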