Custom keys and models with objectIdClass cause an exception when querying with MongoDB


Manuel Castillo <contact@...>
 

Hi Andy,

Thanks for the response! I didn't change the queries, since we pretty much need our current key configuration; but you gave me an idea: instead of directly passing the Key object to getObjectById, we changed this to building a String with the syntax Key.class.getName() + ":" + key.toString(), so we delegate the key construction to an ObjectId inside the fetchObject method. This is working great! No other changes were required.
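The workaround described above can be sketched like this (a minimal self-contained sketch; the nested Key class here is a stand-in for the thread's Key class with the same toString() contract, and the commented-out call assumes a JDO PersistenceManager named `pm`):

```java
public class ObjectIdWorkaround {

    // Stand-in for the thread's Key class (illustrative only).
    static final class Key {
        final String targetClassName, id;
        Key(String targetClassName, String id) {
            this.targetClassName = targetClassName;
            this.id = id;
        }
        @Override public String toString() {
            return targetClassName + "(\"" + id + "\")";
        }
    }

    // Build the "<pk-class-name>:<key-string>" identity string, so that
    // DataNucleus constructs the ObjectId itself inside fetchObject.
    static String stringIdentity(Class<?> pkClass, Key key) {
        return pkClass.getName() + ":" + key.toString();
    }

    public static void main(String[] args) {
        Key key = new Key("Ticket", "abc123");
        System.out.println(stringIdentity(Key.class, key));
        // With a real JDO PersistenceManager `pm`:
        // Ticket t = (Ticket) pm.getObjectById(Ticket.class, stringIdentity(Key.class, key));
    }
}
```

In the real code the class name would be that of the actual top-level Key class, not a nested stand-in.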

Thanks!


2018-04-17 2:40 GMT-05:00 Andy <andy@...>:

I can run an abstract base class + concrete subclass with a basic own definition of PK, and then persist an object (of the concrete type), and then call getObjectById using that PK (with it doing a database call). It works (with MongoDB, and also with RDBMS, and likely any other datastores).

Consequently you should look at your code, and run the same thing but without your PK class (i.e. using the built-in PK classes) and see what happens.
Then just create a very simple own definition of a PK class (as the `objectIdClass`) ... as per the example in the docs ... and see if that works. If it does work then your PK class is the problem.

A "second class type" is defined in the docs and the JDO spec.

It is fairly safe to conclude that if something works with RDBMS then you should NOT be considering changing ANY code in `datanucleus-core` just to get some minority case on MongoDB to work (if that indeed ends up as the conclusion).



Andy
 
Edited

I can run an abstract base class + concrete subclass with a basic own definition of PK, and then persist an object (of the concrete type), and then call getObjectById using that PK (with it doing a database call). It works (with MongoDB, and also with RDBMS, and likely any other datastores).

Consequently you should look at your code, and run the same thing but without your PK class (i.e. using the built-in PK classes) and see what happens.
Then just create a very simple own definition of a PK class (as the `objectIdClass`) ... as per the example in the docs ... and see if that works. If it does work then your PK class is the problem.

A "second class type" is defined in the docs and the JDO spec.

FWIW `targetClassName` is a special field name for PK classes, defined in the DN 5.1 documentation (search the Mapping Guide). This will store the fully-qualified type name of the object that it represents. If you want to store something other than that, you should use a different field name.


It is fairly safe to conclude that if something works with RDBMS then you should NOT be considering changing ANY code in `datanucleus-core` just to get some minority case on MongoDB to work (if that indeed ends up as the conclusion).
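The minimal PK class Andy suggests trying is roughly of this shape (a sketch with illustrative names, not the actual docs code; the JDO contract for an objectIdClass is: Serializable, public no-arg and String constructors, public fields matching the PK fields, equals/hashCode, and a toString() whose output the String constructor accepts):

```java
import java.io.Serializable;
import java.util.Objects;

public class SimpleId implements Serializable {
    public String id;

    public SimpleId() {}

    // Must accept the output of toString().
    public SimpleId(String str) { this.id = str; }

    @Override public String toString() { return id; }

    @Override public int hashCode() { return id == null ? 0 : id.hashCode(); }

    @Override public boolean equals(Object o) {
        return o instanceof SimpleId && Objects.equals(id, ((SimpleId) o).id);
    }
}
```

If a bare-bones class like this works where the real Key class fails, the extra behaviour in Key (the lazy init/complete logic) is the likely suspect.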


Manuel Castillo <contact@...>
 

Hello,

I have some classes with a custom key which I believe satisfies the DataNucleus/JDO requirements; the key class is as follows:
import java.io.Serializable;
import java.util.Objects;
import org.apache.commons.lang3.StringUtils; // or wherever your StringUtils lives

public final class Key implements Serializable {

	static final long serialVersionUID = -448150158203091507L;
	public String targetClassName;
	public String id;
	
	public Key() {	}
	
	public Key(String str) {
		init(str);
	}
	
	private void init(String str) {
		if(StringUtils.isEmpty(str)) {
			targetClassName = "";
			id = "";
			return;
		}
		String[] parts = str.split("\\(");
		parts[1] = parts[1].replaceAll("\\)", " ");
		parts[1] = parts[1].replace("\"", " ");
		parts[1] = parts[1].trim();
		this.targetClassName = parts[0];
		this.id = parts[1];
	}
	
	public void complete() {
		init(id);
	}

	public Key(String classCollectionName, String id) {
		if (StringUtils.isEmpty(classCollectionName)) {
			throw new IllegalArgumentException("No collection/class name specified.");
		}
		if (id == null) {
			throw new IllegalArgumentException("ID cannot be null");
		}
		targetClassName = getTargetClassName(classCollectionName);
		this.id = id;
	}

	public int hashCode() {
		int prime = 31;
		int result = 1;
		if(!isComplete()) init(id);
		result = prime * result + id.hashCode();
		result = prime * result + targetClassName.hashCode();
		return result;
	}

	public boolean equals(Object object) {
		if(!isComplete()) init(id);
		if (object instanceof Key) {
			Key key = (Key) object;
			if (this == key)
				return true;
			return targetClassName.equals(key.targetClassName) && Objects.equals(id, key.id);
		} else {
			return false;
		}
	}

	public String toString() {
		if(!isComplete()) init(id);
		StringBuilder buffer = new StringBuilder();
		int index = targetClassName.lastIndexOf('.');
		buffer.append(index < 0 ? targetClassName : targetClassName.substring(++index));
		buffer.append("(");
		if (id != null) {
			buffer.append("\"").append(id).append("\"");
		} else {
			buffer.append("no-id-yet");
		}
		buffer.append(")");
		return buffer.toString();
	}

	public boolean isComplete() {
		return targetClassName != null && id != null;
	}

}
I'm aware this key class is kinda silly, since models using it only declare a primary key of type String, so using the String directly as the primary key would do. But this is part of a system that has been in production for a while, and older versions built keys in a pattern that is easily supported by this key class; so it helps with support and consistency for data across all versions.
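For reference, the legacy `ClassName("id")` format round-trips like this (a trimmed, self-contained copy of the parsing and formatting logic from the Key class above, without the StringUtils dependency):

```java
public class KeyFormatDemo {

    // Same logic as Key.init(): split at '(' then strip ')' and quotes.
    static String[] parse(String str) {
        String[] parts = str.split("\\(");
        String id = parts[1].replaceAll("\\)", " ").replace("\"", " ").trim();
        return new String[] { parts[0], id };
    }

    // Same logic as Key.toString(): simple class name + quoted id in parentheses.
    static String format(String targetClassName, String id) {
        int index = targetClassName.lastIndexOf('.');
        String simple = index < 0 ? targetClassName : targetClassName.substring(index + 1);
        return simple + "(\"" + id + "\")";
    }

    public static void main(String[] args) {
        String printed = format("mx.ourdomain.Ticket", "abc123");
        System.out.println(printed);                        // Ticket("abc123")
        String[] parsed = parse(printed);
        System.out.println(parsed[0] + " / " + parsed[1]);  // Ticket / abc123
    }
}
```

Note that toString() drops the package prefix, so the round trip yields the simple class name only; that matches the behaviour of the original code.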

So anyway: we have and need this key class, and the classes that use it of course have a signature like this:
@PersistenceCapable(detachable="true", objectIdClass=Key.class)
@Inheritance(strategy=InheritanceStrategy.COMPLETE_TABLE)
public abstract class Entity {
    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.UNSPECIFIED, column="_id")
    public String id;
...
}
Child classes have a very similar signature, except for the "objectIdClass" metadata, which isn't allowed on them, since only the superclass can define the key fields for a given inheritance tree. Following those rules, the id-class attribute is also omitted on the children.
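A child class as described would look roughly like this (a signature sketch only; the field name is illustrative):

```java
// Sketch of a concrete subclass: no objectIdClass and no @PrimaryKey of its own,
// since the root Entity class owns the identity definition.
@PersistenceCapable(detachable = "true")
@Inheritance(strategy = InheritanceStrategy.COMPLETE_TABLE)
public class Ticket extends Entity {
    @Persistent
    public String subject;
    // ...
}
```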

Our persistence.xml is like this:

<?xml version="1.0" encoding="UTF-8" ?>
<persistence xmlns="http://java.sun.com/xml/ns/persistence"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/persistence
        http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd" version="2.0">

    <!-- JOSAdmin "unit" -->
    <persistence-unit name="ourdatastore">
        <class>mx.ourdomain.Entity</class>
        <class>mx.ourdomain.Ticket</class>
        <exclude-unlisted-classes/>

    </persistence-unit>
</persistence>
And package-mongo.orm is:
<?xml version="1.0"?>
<!DOCTYPE orm SYSTEM "file:/javax/jdo/orm.dtd">
<orm>
    <package name="mx.ourdomain" >
        <class name="Entity" table="Entity">
            <field name="id" primary-key="true" >
                <column name="_id" length="100" />
            </field >
        </class>

        <class name="Ticket" table="Ticket">
            <primary-key >
                <column name="_id" target="_id" />
            </primary-key>
        </class>
     </package>
</orm>
And most of the time all of this works just fine: we can write to and read from our Mongo database... except when using PersistenceManager.getObjectById(Ticket.class, someKeyOfTypeKey);

Every time we run this we get a NullPointerException.

Tracking down the problem to rule out an obvious misconfiguration, we found that when DataNucleus tries to "getClassDetailsForClass" (ExecutionContextImpl line 3502 in datanucleus-core v5.1.8), it is holding a null objectClassName, since findObject doesn't provide one (it just passes null, which looks fine, since this code is the one responsible for determining where to fetch the object from).

So getClassDetailsForClass calls StoreManager.manageClassForIdentity, and since MongoDBStoreManager doesn't override this method, the AbstractStoreManager class answers the call. The first thing it does (line 660, datanucleus-core v5.1.8) is check whether the key class is supported as a second-class type (whatever that means).

It is (the check returns true, since the Key class isn't null and isn't a built-in Java type), so it just returns null and skips registering the class (via the given ClassLoaderResolver) in the MongoDBStoreManager's map of managed classes.

Returning null is not the problem here; the following instructions get the correct class name of the object we want to fetch. But a little later, when findObject validates the object (ExecutionContextImpl, findObject method, line 3557 in datanucleus-core v5.1.8) and tries to populate the hollow object with values from the database (StateManagerImpl, validate method, line 5503 in datanucleus-core v5.1.8), that is when the NullPointerException fires. The exception goes unwrapped all the way up, breaking the query.

More specifically, the exception occurs when the MongoDBPersistenceHandler tries to get the table description of the class we want to fetch: the storeMgr attribute (of type MongoDBStoreManager, the one that skipped registering the class) returns null when asked for the desired object's class, since it never registered it. This happens in the fetchObject method, line 597 in datanucleus-mongodb v5.1.0-release. It works fine when not going through PersistenceManager.getObjectById, because the classes do get added to the managed map in those cases.
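The failure mode boils down to a lookup in a never-populated registry; a toy model of it (plain Java, NOT DataNucleus source) looks like this:

```java
import java.util.HashMap;
import java.util.Map;

public class MissingRegistrationDemo {
    // Stand-in for the store manager's map of managed classes: class name -> table name.
    static final Map<String, String> managedClasses = new HashMap<>();

    // Stand-in for the fetch path: it assumes the class was registered earlier.
    static String fetchObject(String className) {
        String table = managedClasses.get(className); // null: class was never registered
        return table.toUpperCase();                   // NPE here, analogous to fetchObject
    }

    public static void main(String[] args) {
        try {
            fetchObject("mx.ourdomain.Ticket");
        } catch (NullPointerException e) {
            System.out.println("NPE: class metadata was never registered");
        }
    }
}
```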

So, are we missing some configuration to get this to work? Or should AbstractStoreManager's manageClassForIdentity method register the class of an object with a custom Key at some point?

Thank you!