Spanner Adapter type conversion/matching problem


yunus@...
 

Hi everyone,

Recently I created a pull request for the Spanner database adapter. While running the DataNucleus tests I encountered an issue which made me realize that I have not fully understood the type conversions.
The DataNucleus tests contain a Person class with an int age attribute. This field is correctly created as an INT64 column in Spanner, but during a read I get an error saying: it was impossible to set the field "AGE" type "java.lang.Long".
It looks like the Spanner adapter treats this field as the Java type Long while reading. As a side note, Spanner has a single integer type, INT64, which is mapped to BIGINT in the database metadata.

So here is my question. I have added sqlTypesforJdbcTypes, which performs a mapping, but I also have registerColumnMapping, which maps a Java type to a JDBC type and an SQL type.
Could you please explain how the Java type, the SQL type and the JDBC type are related to each other? How does DataNucleus decide which type mapping to use?
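
To make this concrete, the kind of registration I mean looks roughly like the sketch below. This is simplified and the registerColumnMapping argument list is approximate (it depends on the datanucleus-rdbms version), so it is not my exact adapter code:

import java.sql.DatabaseMetaData;
import java.sql.JDBCType;

import org.datanucleus.store.rdbms.adapter.BaseDatastoreAdapter;
import org.datanucleus.store.rdbms.mapping.column.BigIntColumnMapping;
import org.datanucleus.store.rdbms.mapping.column.IntegerColumnMapping;

// Simplified sketch, not the actual PR code.
public class CloudSpannerAdapterSketch extends BaseDatastoreAdapter
{
    public CloudSpannerAdapterSketch(DatabaseMetaData metadata)
    {
        super(metadata);

        // Java Integer -> column mapping class + JDBC/SQL type pair.
        // The final boolean marks this as the DEFAULT combination for the Java type.
        registerColumnMapping(Integer.class.getName(), IntegerColumnMapping.class,
                JDBCType.INTEGER, "INT64", true);

        // Java Long -> JDBC BIGINT -> SQL "INT64" (Spanner's only integer type).
        registerColumnMapping(Long.class.getName(), BigIntColumnMapping.class,
                JDBCType.BIGINT, "INT64", true);
    }
}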

Best regards
yunus


Andy
 

Hi,

If you have a class with a field of type "int" then, to find its JavaTypeMapping, DataNucleus will look at the column mappings that the adapter has registered. In your adapter's case, it has Integer, so it will find a JDBC type of INTEGER as the DEFAULT for that Java type, and IntegerColumnMapping for the column. It will also make use of either what the JDBC driver provides for JDBC INTEGER or what your adapter provides for INTEGER. To persist data it will probably call IntegerColumnMapping.setInt, and to read data from the database it will probably call IntegerColumnMapping.getInt.
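
In essence a column mapping is a thin wrapper over the JDBC calls. A simplified sketch of what that read/write path boils down to is below; the real IntegerColumnMapping (in org.datanucleus.store.rdbms.mapping.column) also handles nulls and other cases, so treat this as illustrative only:

import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.datanucleus.exceptions.NucleusDataStoreException;

// Illustrative sketch of a column mapping's read/write path, not the real class.
class IntColumnMappingSketch
{
    public void setInt(PreparedStatement ps, int position, int value)
    {
        try {
            ps.setInt(position, value);
        } catch (SQLException e) {
            throw new NucleusDataStoreException("Cant set int param " + position, e);
        }
    }

    public int getInt(ResultSet rs, int position)
    {
        try {
            return rs.getInt(position);
        } catch (SQLException e) {
            throw new NucleusDataStoreException("Cant get int at position " + position, e);
        }
    }
}

So if the mapping chosen for your "int" field ends up being a BigInt-style mapping instead, the read path goes through getLong and hands back a java.lang.Long, which would be consistent with the error you are seeing.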

You should look in the log and find entries like this:
Field [mydomain.model.A.id] -> Column(s) [A.ID] using mapping of type "org.datanucleus.store.rdbms.mapping.java.LongMapping" (org.datanucleus.store.rdbms.mapping.column.BigIntColumnMapping)
This tells you what mappings are being used, and hence what behaviour you should expect for read/write operations.

The CloudSpannerTypeInfo entries are added when the JDBC driver doesn't provide them. Does it provide them itself? Run DataNucleus SchemaTool in "dbinfo" mode, which tells you what is provided by the JDBC driver and what is added by your adapter. You can look at the same info for other datastores for reference. Maybe contribute this output to GitHub also?
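
For the adapter-added entries, the usual pattern (based on how other adapters such as PostgreSQLAdapter do it; this is a sketch from memory, so check the exact signatures against your datanucleus-rdbms version) is to override initialiseTypes and register anything the driver is missing:

import java.sql.DatabaseMetaData;
import java.sql.Types;

import org.datanucleus.store.connection.ManagedConnection;
import org.datanucleus.store.rdbms.adapter.BaseDatastoreAdapter;
import org.datanucleus.store.rdbms.schema.SQLTypeInfo;
import org.datanucleus.store.schema.StoreSchemaHandler;

// Sketch only; signatures approximate.
public class CloudSpannerAdapterTypesSketch extends BaseDatastoreAdapter
{
    public CloudSpannerAdapterTypesSketch(DatabaseMetaData metadata)
    {
        super(metadata);
    }

    @Override
    public void initialiseTypes(StoreSchemaHandler handler, ManagedConnection mconn)
    {
        // First load whatever the JDBC driver reports via DatabaseMetaData.getTypeInfo().
        super.initialiseTypes(handler, mconn);

        // buildInt64TypeInfo() is a hypothetical helper standing in for however
        // your CloudSpannerTypeInfo entries are built.
        SQLTypeInfo int64Type = buildInt64TypeInfo();

        // Last flag = "only add if not already present", so driver-provided
        // info takes precedence over the adapter's own entries.
        addSQLTypeForJDBCType(handler, mconn, (short) Types.BIGINT, int64Type, true);
    }

    private SQLTypeInfo buildInt64TypeInfo()
    {
        // Hypothetical: construct the SQLTypeInfo/CloudSpannerTypeInfo for INT64 here.
        throw new UnsupportedOperationException("sketch only");
    }
}

That way "dbinfo" should show you exactly which of the two sources each type came from.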