
Re: Problems with DataNucleus JDO and MS Sql Server 2012

Andy
 

As the log shows, your database supports MixedCase datastore identifiers
Supported Identifier Cases : MixedCase "MixedCase" MixedCase-Sensitive "MixedCase-Sensitive"
and it is looking up a table name of "MSE". So have you created the table with that name (in capitals)? If you haven't, that explains clearly why you get the problem.
The persistence property datanucleus.identifier.case provides the options you need to control which name it looks up.
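
For example, a minimal sketch of overriding that property when creating the PMF (the value shown is the mixed-case option that also appears in another thread below; check the DataNucleus configuration docs for the full list of allowed values, and note the persistence-unit name is taken from the post below):

// requires java.util.Properties, javax.jdo.JDOHelper, javax.jdo.PersistenceManagerFactory
Properties overrides = new Properties();
overrides.setProperty("datanucleus.identifier.case", "MixedCase");
PersistenceManagerFactory pmf = JDOHelper.getPersistenceManagerFactory(overrides, "RdstTest");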


Re: Problems with DataNucleus JDO and MS Sql Server 2012

Max Treptow
 

This is the log file after enhancing the O/R class Mse.java:

 

The class to get MSE data from the database:

import java.util.Iterator;

import javax.jdo.Extent;
import javax.jdo.JDOHelper;
import javax.jdo.PersistenceManager;
import javax.jdo.PersistenceManagerFactory;

public class MseData {

    private PersistenceManagerFactory pmf;
    private PersistenceManager pm;

    public static void main(String[] args)
    {
        MseData mseData = new MseData();
        mseData.listMseData2();
    }

    private void listMseData2() {
        pmf = JDOHelper.getPersistenceManagerFactory("RdstTest");
        pm = pmf.getPersistenceManager();

        Extent<Mse> tExtent = pm.getExtent(Mse.class, false);
        Iterator<Mse> mseData = tExtent.iterator();

        while (mseData.hasNext()) {
            Mse mse = mseData.next();
            System.out.println(mse.getTitle());
        }
        pm.close();
        pmf.close();
    }
}

The command Extent<Mse> tExtent = pm.getExtent(Mse.class, false);
throws the following exception:

Exception in thread "main" javax.jdo.JDODataStoreException: Required table missing : "DBO.MSE" in Catalog "" Schema "DBO". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.schema.autoCreateTables"
    at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:553)
    at org.datanucleus.api.jdo.JDOPersistenceManager.getExtent(JDOPersistenceManager.java:1550)
    at eu.jcz.dnt1.app.MseData.listMseData2(MseData.java:30)
    at eu.jcz.dnt1.app.MseData.main(MseData.java:22)
NestedThrowablesStackTrace:
org.datanucleus.store.rdbms.exceptions.MissingTableException: Required table missing : "DBO.MSE" in Catalog "" Schema "DBO". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.schema.autoCreateTables"
    at org.datanucleus.store.rdbms.table.AbstractTable.exists(AbstractTable.java:607)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3400)
    at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2911)
    at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:118)
    at org.datanucleus.store.rdbms.RDBMSStoreManager.manageClasses(RDBMSStoreManager.java:1643)
    at org.datanucleus.store.AbstractStoreManager.getExtent(AbstractStoreManager.java:987)
    at org.datanucleus.ExecutionContextImpl.getExtent(ExecutionContextImpl.java:5209)
    at org.datanucleus.api.jdo.JDOPersistenceManager.getExtent(JDOPersistenceManager.java:1546)
    at eu.jcz.dnt1.app.MseData.listMseData2(MseData.java:30)
    at eu.jcz.dnt1.app.MseData.main(MseData.java:22)

The whole log file datanucleus.log is attached.


Re: Problems with DataNucleus JDO and MS Sql Server 2012

Andy
 
Edited

Exceptions have stack traces, so post the stack trace; it shows where the exception came from.
The DEBUG log also tells you what was going on at that point, so it may show some SQL statement, or it may be using DatabaseMetaData to check things. If it is SQL, invoke that same SQL yourself directly and see what result you get.

The other thing the log shows is the identifier CASE supported by the SQL Server JDBC driver, so consider whether case sensitivity of the table name is a factor ...
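
For example, a minimal JDBC sketch of that check (the connection details are placeholders taken from the persistence.xml in the post below; the exact lookup DataNucleus performs may differ in case and quoting):

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class CheckMseTable {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:sqlserver://<db_server>:6201;databaseName=rdst", "<user>", "<password>")) {
            DatabaseMetaData dmd = con.getMetaData();
            // how the driver stores unquoted identifiers
            System.out.println("storesUpperCaseIdentifiers = " + dmd.storesUpperCaseIdentifiers());
            System.out.println("storesMixedCaseIdentifiers = " + dmd.storesMixedCaseIdentifiers());
            // roughly the table lookup behind the "Required table missing" check
            try (ResultSet rs = dmd.getTables(null, "dbo", "MSE", new String[] {"TABLE"})) {
                while (rs.next()) {
                    System.out.println(rs.getString("TABLE_SCHEM") + "." + rs.getString("TABLE_NAME"));
                }
            }
        }
    }
}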


Problems with DataNucleus JDO and MS Sql Server 2012

Max Treptow
 

When I try to read data from a very simple database table on MS SQL Server 2012 using a DataNucleus JDO query, I get the following error message:

 

Exception in thread "main" javax.jdo.JDODataStoreException: Required table missing : "DBO.MSE" in Catalog "" Schema "DBO". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.schema.autoCreateTables"

 

This is the persistence.xml file I use to configure the database connection:

<persistence-unit name="RdstTest">
    <properties>
        <property name="javax.jdo.option.ConnectionURL" value="jdbc:sqlserver://<db_server>:6201;databaseName=rdst;SelectMethod=cursor"/>
        <property name="javax.jdo.option.ConnectionDriverName" value="com.microsoft.sqlserver.jdbc.SQLServerDriver"/>
        <property name="javax.jdo.option.ConnectionUserName" value="<user>"/>
        <property name="javax.jdo.option.ConnectionPassword" value="<password>"/>
        <property name="javax.jdo.mapping.Schema" value="dbo"/>
        <property name="javax.jdo.PersistenceManagerFactoryClass" value="org.datanucleus.api.jdo.JDOPersistenceManagerFactory"/>
    </properties>
</persistence-unit>

 

And this is the O/R class to map the database table:

 

import javax.jdo.annotations.IdentityType;
import javax.jdo.annotations.PersistenceCapable;
import javax.jdo.annotations.Persistent;

@PersistenceCapable(identityType = IdentityType.DATASTORE, table = "Mse")
public class Mse
{
    public Mse() {}

    @Persistent
    private String _title;

    public String getTitle() {
        return _title;
    }

    public void setTitle(String title) {
        _title = title;
    }
}

 

 

I tested this with several identity types and ID generator strategies, using two different software environments:

 

  1. DataNucleus 3, Java 7, Ant, MS Sql Server 2012, mssql-jdbc-6.4.0.jre7.jar

  2. DataNucleus 5, Java 8, Maven, Sql Server 2012, mssql-jdbc-6.1.0.jre8.jar

None of these tests succeeded.

 

Additional information:
1. Reading data from this table using Hibernate (Hibernate 4, Java 8, Maven 3, MS SQL Server 2012, mssql-jdbc-6.1.0.jre8.jar) works fine, so I assume the connection parameters and user privileges are correct.
2. Data access to an Oracle 10g database with the same DataNucleus versions (3 and 5, Java 7 and 8) also works without any problems.

 

Because the database table exists and I can read its content using Hibernate, I'm afraid my O/R class for the DataNucleus mapper is wrong.
Could you tell me what else I can try, or what I can do better, to get data access working on MS SQL Server?

 

Thank you in advance

 

 

 

 


Re: Possible optimization for "startsWith" in JDOQL (MySQL implementation)

Andy
 

You can easily enough develop your own handler for the "startsWith" method and provide a benchmark comparison of the current handler versus yours, using this extension point: http://www.datanucleus.org:15080/products/accessplatform_5_1/extensions/extensions.html#rdbms_sql_method


Possible optimization for "startsWith" in JDOQL (MySQL implementation)

Page bloom
 

I noticed that if a JDOQL query filter has something like:

firstNameLow.startsWith("fred")

it gets translated to a LOCATE in the where clause

where LOCATE('fred',a0.first_name_low) = 1;

In MySQL (tested on version 5.6) and likely other RDBMSes a LOCATE will not take advantage of any index on the column.

Indexes can only be used when the wildcard appears at the end of the 'like' string - and this is the exact case for a 'starts with' operation.

So, if the WHERE clause generated for 'startsWith' used a LIKE with a trailing %, then the query on my dataset measures 0.00 seconds, whereas the LOCATE version takes multiple seconds to complete.

where first_name_low like 'fred%'

I can achieve the above using the "matches" method:

firstNameLow.matches("fred.*")

but startsWith would seem more intuitive and convenient than having to resort to matches with a regex - plus it's not obvious that startsWith generates a LOCATE and not a LIKE until you look under the hood.
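
As an illustration of that workaround in a full query (a minimal sketch; the Person class and firstNameLow field are hypothetical, and pm is an open PersistenceManager):

// requires javax.jdo.PersistenceManager, javax.jdo.Query, java.util.List
static List<Person> findFreds(PersistenceManager pm) {
    // matches("fred.*") produces the LIKE 'fred%' form rather than LOCATE(...) = 1, per the timings above
    Query<Person> q = pm.newQuery(Person.class, "firstNameLow.matches('fred.*')");
    return q.executeList();
}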


Re: Calling PostgreSQL functions with StoredProcedureQuery

Andy
 
Edited

The simple answer is that a PostgreSQL 'function' is not the same as a JDBC stored procedure. Just try to write a JDBC CallableStatement and execute it using JDBC, and see what happens; that is all this JPA StoredProcedureQuery does.
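
For example, a minimal JDBC sketch of that check (the connection details are placeholders; mysum is the function from the post below, and the brace syntax is the standard JDBC escape the PostgreSQL driver maps to a function call):

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class CallMySum {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://myserver:5432/mydb", "myuser", "mypassword");
             CallableStatement cs = con.prepareCall("{ ? = call mysum(?, ?) }")) {
            cs.registerOutParameter(1, Types.INTEGER);
            cs.setInt(2, 1);
            cs.setInt(3, 2);
            cs.execute();
            System.out.println(cs.getInt(1));   // expect 3 if the driver can call the function
        }
    }
}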

If it works using JDBC, then you can easily get the code for PostgreSQLAdapter in datanucleus-rdbms, develop a fix and contribute it back.

If it doesn't work with JDBC, then execute a native query to invoke the function as you would normally with PostgreSQL.
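
A minimal sketch of that native-query fallback (em is an open EntityManager; mysum is the function from the post below):

// getSingleResult() returns the function's result as a single scalar value
Object sum = em.createNativeQuery("SELECT public.mysum(1, 2)").getSingleResult();
System.out.println(sum);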


Calling PostgreSQL functions with StoredProcedureQuery

@Mogaba
 

Hello,

I have problems with calling PostgreSQL functions with StoredProcedureQuery.

My persistence.xml file:

<?xml version="1.0" encoding="UTF-8" ?>
<persistence xmlns="http://xmlns.jcp.org/xml/ns/persistence"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence
        http://xmlns.jcp.org/xml/ns/persistence/persistence_2_2.xsd"
    version="2.2">
    <persistence-unit name="mydbPU">
        <exclude-unlisted-classes />
        <properties>
            <property name="javax.persistence.jdbc.url" value="jdbc:postgresql//myserver:5432/mydb" />
            <property name="javax.persistence.jdbc.driver" value="org.postgresql.Driver" />
            <property name="javax.persistence.jdbc.user" value="myuser" />
            <property name="javax.persistence.jdbc.password" value="mypassword" />
            <property name="datanucleus.mapping.Schema" value="public" />
            <property name="datanucleus.identifier.case" value="MixedCase" />
        </properties>
    </persistence-unit>
</persistence>

PostgreSQL function:

CREATE OR REPLACE FUNCTION public.mysum(v1 integer, v2 integer, OUT vout integer)
 RETURNS integer
 LANGUAGE plpgsql
AS $function$
BEGIN
    SELECT v1+v2 INTO vout;
END;
$function$

Code:

EntityManagerFactory emf = Persistence.createEntityManagerFactory("mydbPU");
EntityManager em = emf.createEntityManager();

StoredProcedureQuery spq = em.createStoredProcedureQuery("mysum");
spq.registerStoredProcedureParameter("v1", Integer.class, ParameterMode.IN);
spq.registerStoredProcedureParameter("v2", Integer.class, ParameterMode.IN);
spq.registerStoredProcedureParameter("vout", Integer.class, ParameterMode.OUT);
spq.setParameter("v1", 1);   // exception here
spq.setParameter("v2", 2);
spq.execute();
System.out.println(spq.getOutputParameterValue("vout"));

em.close();
emf.close();

I get an exception:

Exception in thread "main" This RDBMS does not support stored procedures!
org.datanucleus.exceptions.NucleusUserException: This RDBMS does not support stored procedures!
	at org.datanucleus.store.rdbms.query.StoredProcedureQuery.compileInternal(StoredProcedureQuery.java:97)
	at org.datanucleus.store.query.Query.setImplicitParameter(Query.java:975)
	at org.datanucleus.api.jpa.JPAQuery.setParameter(JPAQuery.java:548)
	at org.datanucleus.api.jpa.JPAStoredProcedureQuery.setParameter(JPAStoredProcedureQuery.java:85)
	at org.datanucleus.api.jpa.JPAStoredProcedureQuery.setParameter(JPAStoredProcedureQuery.java:41)
	at vma.demo.FunctionTest.main(FunctionTest.java:19)

I also tried creating an empty function without parameters:

CREATE OR REPLACE FUNCTION public.my_empty_func()
 RETURNS void
 LANGUAGE plpgsql
AS $function$
BEGIN
END
$function$

The exception is different in this case:

Exception in thread "main" Error encountered when extracting results for SQL query "my_empty_func"
org.datanucleus.exceptions.NucleusDataStoreException: Error encountered when extracting results for SQL query "my_empty_func"
	at org.datanucleus.store.rdbms.query.StoredProcedureQuery.performExecute(StoredProcedureQuery.java:594)
	at org.datanucleus.store.rdbms.query.StoredProcedureQuery.executeQuery(StoredProcedureQuery.java:143)
	at org.datanucleus.store.query.Query.executeWithArray(Query.java:1855)
	at org.datanucleus.store.query.Query.execute(Query.java:1837)
	at org.datanucleus.api.jpa.JPAStoredProcedureQuery.execute(JPAStoredProcedureQuery.java:209)
	at vma.demo.FunctionTest.main(FunctionTest.java:24)
Caused by: org.postgresql.util.PSQLException: ERROR: syntax error at or near "CALL"
  Position: 1
	at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2422)
	at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2167)
	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:306)
	at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
	at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
	at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155)
	at org.postgresql.jdbc.PgCallableStatement.executeWithFlags(PgCallableStatement.java:78)
	at org.postgresql.jdbc.PgPreparedStatement.execute(PgPreparedStatement.java:144)
	at org.datanucleus.store.rdbms.datasource.dbcp2.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:198)
	at org.datanucleus.store.rdbms.datasource.dbcp2.DelegatingPreparedStatement.execute(DelegatingPreparedStatement.java:198)
	at org.datanucleus.store.rdbms.query.StoredProcedureQuery.performExecute(StoredProcedureQuery.java:444)
	... 5 more

Not sure if this is relevant but I also get these warnings:

15:07:20,552 (main) WARN  [DataNucleus.MetaData] - MetaData Parser encountered an error in file "file:/C/Users/Administrator/eclipse-workspace/demo/target/classes/META-INF/persistence.xml" at line 6, column 16 : cvc-complex-type.3.1: Value '2.2' of attribute 'version' of element 'persistence' is not valid with respect to the corresponding attribute use. Attribute 'version' has a fixed value of '2.1'. - Please check your specification of DTD/XSD and the validity of the MetaData XML header that you have specified.
15:07:21,303 (main) WARN  [DataNucleus.Datastore.Schema] - You have specified the default schema as public but for this datastore this has been changed to "public". This is likely due to missing quote characters, or the datastore storing things in a different case

I use PostgreSQL 9.6.7 on Debian 9.4. Here are the artifacts I used for running the code:

  • javax.persistence-api: 2.2

  • datanucleus-core: 5.1.6

  • datanucleus-api-jpa: 5.1.4

  • datanucleus-rdbms: 5.1.6

  • postgresql: 42.2.1


Re: Custom keys and models with objectIdClass cause an exception when querying using Mongo

Manuel Castillo <contact@...>
 

Hi Andy,

Thanks for the response! I didn't change the queries since we pretty much need our current key configuration, but you gave me an idea: instead of directly passing the Key object to getObjectById, we changed this to building a String with the syntax Key.class.getName() + ":" + key.toString(), so we delegate the key construction to an ObjectId inside the fetchObject method. This is working great! No other changes were required.
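
In other words, the workaround looks roughly like this sketch (class names are taken from the earlier posts; the exact getObjectById overload used is an assumption here):

String oid = Key.class.getName() + ":" + key.toString();
Ticket ticket = pm.getObjectById(Ticket.class, oid);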

Thanks!


2018-04-17 2:40 GMT-05:00 Andy <andy@...>:

I can run an abstract base class + concrete subclass with a basic own definition of PK, and then persist an object (of the concrete type), and then call getObjectById using that PK (with it doing a database call). It works (with MongoDB, and also with RDBMS, and likely any other datastores).

Consequently you should look at your code, and run the same thing but without your PK class (i.e using the built-in PK classes) and see what happens.
Then just create a very simple own definition of PK class (as the `objectIdClass`) ... as per the example in the docs ... and see if that works. If it does work then your PK class is the problem.

A "second class type" is defined in the docs and the JDO spec.

It is fairly safe to conclude that if something works with RDBMS then you should NOT be considering changing ANY code in `datanucleus-core` just to get some minority case on MongoDB to work (if that indeed ends up as the conclusion).



Re: Custom keys and models with objectIdClass cause an exception when querying using Mongo

Andy
 
Edited

I can run an abstract base class + concrete subclass with a basic PK class of my own, persist an object (of the concrete type), and then call getObjectById using that PK (with it doing a database call). It works (with MongoDB, and also with RDBMS, and likely any other datastore).

Consequently you should look at your code, and run the same thing but without your PK class (i.e. using the built-in PK classes) and see what happens.
Then just create a very simple PK class of your own (as the `objectIdClass`) ... as per the example in the docs ... and see if that works. If it does work then your PK class is the problem.
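
For reference, a minimal sketch of such a PK class following the usual JDO objectIdClass rules (public, Serializable, no-arg and String constructors, toString/equals/hashCode consistent with each other, and a public field matching the PK field name and type; the field name "id" here is illustrative):

public class SimpleStringId implements java.io.Serializable {
    public String id;   // must match the name and type of the @PrimaryKey field

    public SimpleStringId() {}

    public SimpleStringId(String str) {
        this.id = str;
    }

    public String toString() {
        return id;   // the String constructor must be able to rebuild the id from this
    }

    public int hashCode() {
        return id != null ? id.hashCode() : 0;
    }

    public boolean equals(Object obj) {
        return obj instanceof SimpleStringId && java.util.Objects.equals(id, ((SimpleStringId) obj).id);
    }
}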

A "second class type" is defined in the docs and the JDO spec.

FWIW `targetClassName` is a special field name for PK classes, defined in the DN 5.1 documentation (search the Mapping Guide). This will store the fully-qualified type name of the object that it represents. If you want to store something other than that, you should use a different field name.


It is fairly safe to conclude that if something works with RDBMS then you should NOT be considering changing ANY code in `datanucleus-core` just to get some minority case on MongoDB to work (if that indeed ends up as the conclusion).


Custom keys and models with objectIdClass cause an exception when querying using Mongo

Manuel Castillo <contact@...>
 

Hello,

I've some classes with a custom key which I believe satisfy datanucleus-JDO requirements; the key class is as follows:
public final class Key implements Serializable {

	static final long serialVersionUID = -448150158203091507L;
	public String targetClassName;
	public String id;
	
	public Key() {	}
	
	public Key(String str) {
		init(str);
	}
	
	private void init(String str) {
		if(StringUtils.isEmpty(str)) {
			targetClassName = "";
			id = "";
			return;
		}
		String[] parts = str.split("\\(");
		parts[1] = parts[1].replaceAll("\\)", " ");
		parts[1] = parts[1].replace("\"", " ");
		parts[1] = parts[1].trim();
		this.targetClassName = parts[0];
		this.id = parts[1];
	}
	
	public void complete() {
		init(id);
	}

	public Key(String classCollectionName, String id) {
		if (StringUtils.isEmpty(classCollectionName)) {
			throw new IllegalArgumentException("No collection/class name specified.");
		}
		if (id == null) {
			throw new IllegalArgumentException("ID cannot be null");
		}
		targetClassName = getTargetClassName(classCollectionName);
		this.id = id;
	}

	public int hashCode() {
		int prime = 31;
		int result = 1;
		if(!isComplete()) init(id);
		result = prime * result + id.hashCode();
		result = prime * result + targetClassName.hashCode();
		return result;
	}

	public boolean equals(Object object) {
		if(!isComplete()) init(id);
		if (object instanceof Key) {
			Key key = (Key) object;
			if (this == key)
				return true;
			return targetClassName.equals(key.targetClassName) && Objects.equals(id, key.id);
		} else {
			return false;
		}
	}

	public String toString() {
		if(!isComplete()) init(id);
		StringBuilder buffer = new StringBuilder();
		int index = targetClassName.lastIndexOf('.');
		buffer.append(index < 0 ? targetClassName : targetClassName.substring(++index));
		buffer.append("(");
		if (id != null) {
			buffer.append("\"").append(id)
					.append("\"").toString());
		} else {
			buffer.append("no-id-yet");
		}
		buffer.append(")");
		return buffer.toString();
	}

	public boolean isComplete() {
		return targetClassName != null && id != null;
	}

}
I'm aware this key class is kinda silly, since models using it only declare a primary key of type String; so using the String as a primary key would just do it... But this is part of a system which has been in production for a while and older versions used to build keys in a pattern that is easily supported with this key class; so it helps with support and consistency for data of all versions.

So anyway, we have and need this key class, and classes which use it of course have a signature like this:
@PersistenceCapable(detachable="true", objectIdClass=Key.class)
@Inheritance(strategy=InheritanceStrategy.COMPLETE_TABLE)
public abstract class Entity {
    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.UNSPECIFIED, column="_id")
    public String id;
...
}
Child classes have a very similar signature, except for the "objectIdClass" metadata, which isn't allowed since only the superclass can define the key fields for a given inheritance tree. Following those rules, the id class attribute is also omitted.

Our persistence.xml is like this:

<?xml version="1.0" encoding="UTF-8" ?>
<persistence xmlns="http://java.sun.com/xml/ns/persistence"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/persistence
        http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd" version="2.0">

    <!-- JOSAdmin "unit" -->
    <persistence-unit name="ourdatastore">
        <class>mx.ourdomain.Entity</class>
        <class>mx.ourdomain.Ticket</class>
        <exclude-unlisted-classes/>

    </persistence-unit>
</persistence>
And package-mongo.orm is:
<?xml version="1.0"?>
<!DOCTYPE orm SYSTEM "file:/javax/jdo/orm.dtd">
<orm>
    <package name="mx.ourdomain" >
        <class name="Entity" table="Entity">
            <field name="id" primary-key="true" >
                <column name="_id" length="100" />
            </field >
        </class>

        <class name="Ticket" table="Ticket">
            <primary-key >
                <column name="_id" target="_id" />
            </primary-key>
        </class>
     </package>
</orm>
And most of the time all of this works just fine. We can write to and read from our Mongo database fine... except when using PersistenceManager.getObjectById(Ticket.class, someKeyOfTypeKey);

Every time we run this we get a NullPointerException.

Tracking down the problem to check whether a misconfiguration was the cause, we found that when DataNucleus tries to "getClassDetailsForClass" (ExecutionContextImpl line 3502 in datanucleus-core v5.1.8), it is holding a null objectClassName, since findObject doesn't provide one (it just passes null, which looks fine, since this code is the one responsible for determining where to fetch the object).

So, getClassDetailsForClass will call StoreManager.manageClassForIdentity, and since MongoDBStoreManager doesn't override this method, the AbstractStoreManager class will answer the call. The first thing it does (line 660, datanucleus-core v5.1.8) is to check if the key class is supported as a second class (whatever that means).

It does (it returns true, since the Key class isn't null and isn't a Java type). Then it just returns null, and skips adding the given ClassLoaderResolver to the MongoDBStoreManager's map of managed classes.

Returning null is not the problem here; the following instructions get the correct class name of the object we want to fetch. But a little later, when trying to validate the object (ExecutionContextImpl, findObject method, line 3557 in datanucleus-core v5.1.8) and when it tries to populate the hollow object with values from the database (StateManagerImpl, validate method, line 5503 in datanucleus-core v5.1.8), that is when it fires the NullPointerException. The exception goes unwrapped all the way up, breaking the query.

More specifically, the exception occurs when the MongoDBPersistenceHandler tries to get the table description of the class we want to fetch, but the storeMgr attribute (of type MongoDBStoreManager, the one which skipped adding the ClassLoaderResolver to its managed classes map) returns null when it is asked for the desired object's class (since it never registered it). This happens in method fetchObject, line 597 in datanucleus-mongodb v5.1.0-release. It works fine when not using the PersistenceManager.getObjectById method; in those cases the classes are added to the managed map.

So, are we missing some configuration to get this to work? Or should the AbstractStoreManager.manageClassForIdentity method register the class of the object with the custom Key at some point?

Thank you!


Re: Application identity ("custom keys") might be breaking queries for inheritance classes

Manuel Castillo <contact@...>
 

Thank you Andy, using the objectIdClass metadata on the superclasses solved the problem. I'm not sure whether inheritance is involved at all; performing some minor changes to the Key class (removing the hashCode and toString attributes and the final qualifiers) and to the superclasses (replacing the Key key attribute with a String id as in the Key class, with some minor modifications to the Key getters and setters) did the trick.

No modifications were necessary on any other classes which do not involve inheritance, on DAOs, or anywhere else.

I'm sorry for not adding any LOG reference; it seemed pretty straightforward to me that adding an unrecognized data type to a Mongo BasicDBObject will result in an exception. The code that follows the snippet I copied is:
DBObject foundObj = dbColl.findOne(query);

Which will result in a CodecConfigurationException when the key object class is not recognized by Mongo.

But after all, there is a way to get this to work, which is really good; I'm just not sure whether the code should go all the way there with an invalid key (for Mongo), or whether it should wrap the exception and state that an objectIdClass and/or attribute converter should be used.


Re: Application identity ("custom keys") might be breaking queries for inheritance classes

Andy
 
Edited

Hi,

Take a step backwards. How do you think DataNucleus knows how to persist a field of type `Key`? It doesn't, unless you tell it. You don't define that as an "objectIdClass" for that persistable class, and you don't define an `@AttributeConverter` for that `Key` type, so it knows not. Defining an `@AttributeConverter` to convert `Key` to String would be the normal way of handling that. Inheritance has nothing to do with that.
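
For example, a minimal sketch of such a converter, assuming the JDO 3.2 javax.jdo.AttributeConverter API (the class name and field mapping are illustrative):

public class KeyStringConverter implements javax.jdo.AttributeConverter<Key, String> {
    public String convertToDatastore(Key attributeValue) {
        return attributeValue == null ? null : attributeValue.toString();
    }

    public Key convertToAttribute(String datastoreValue) {
        return datastoreValue == null ? null : new Key(datastoreValue);
    }
}

// then on the persistent field, something along these lines:
//     @Persistent(column = "_id")
//     @Convert(KeyStringConverter.class)
//     protected Key key;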

I also don't see any reference to the LOG which tells you what MongoDB query it performs for your `getObjectById` call, or indeed what MongoDB call is made to do the INSERT of the object in the first place. They would likely reveal what it is trying to do with it.


Exception when retrieving items modeled by inheritance classes from MongoDB using Datanucleus

Manuel Castillo <contact@...>
 

My team and I are working on an upgrade of our company's system, which had become somewhat forgotten and was running old versions of everything it uses, so developing newer features was becoming a pain with old and unsupported technologies.

So far we have managed to produce an almost fully working version of the system, but we got stuck at a feature which involves DataNucleus JDO, MongoDB and inheritance.

We have some models which are tremendously similar (from the code's perspective). In the current in-production version, applying a change usually involves rewriting the same piece of code in all classes, so we thought that inheritance would make the job easier and better. So we have two interfaces at the top of the hierarchy (which, as far as we know, neither DataNucleus nor MongoDB cares about at all); they go like this:

public interface Entity extends Serializable {

    String getDate();
    double getQty();
    void setQty(double qty);
    void setDate(String date);
    void setKey(Key key);

}

And

public interface HourEntity extends Entity {

    String getHour();

}

We use application-defined keys, and we use this single class to build different kinds of keys. We only want the toString representation of this class to store and retrieve data in Mongo.

public final class Key implements Serializable {
    static final long serialVersionUID = -448150158203091507L;
    public final String targetClassName;
    public final String id;
    public final String toString;
    public final int hashCode;

    public Key() {
        targetClassName = null;
        id = null;
        toString = null;
        hashCode = -1;
    }

    public Key(String str) {
        String[] parts = str.split("\\(");
        parts[1] = parts[1].replaceAll("\\)", " ");
        parts[1] = parts[1].replace("\"", " ");
        parts[1] = parts[1].trim();
        this.targetClassName = parts[0];
        this.id = parts[1];
        toString = this.toString();
        hashCode = this.hashCode();
    }

    public Key(String classCollectionName, String id) {
        if (StringUtils.isEmpty(classCollectionName)) {
            throw new IllegalArgumentException("No collection/class name specified.");
        }
        if (id == null) {
            throw new IllegalArgumentException("ID cannot be null");
        }
        targetClassName = classCollectionName;
        this.id = id;
        toString = this.toString();
        hashCode = this.hashCode();
    }

    public String getTargetClassName() {
        return targetClassName;
    }

    public int hashCode() {
        if(hashCode != -1) return hashCode; 
        int prime = 31;
        int result = 1;
        result = prime * result + (id != null ? id.hashCode() : 0);
        result = prime * result + (targetClassName != null ? targetClassName.hashCode() : 0);
        return result;
    }

    public boolean equals(Object object) {
        if (object instanceof Key) {
            Key key = (Key) object;
            if (this == key)
                return true;
            return targetClassName.equals(key.targetClassName) && Objects.equals(id, key.id);
        } else {
            return false;
        }
    }

    public String toString() {
        if(toString != null) return toString;
        StringBuilder buffer = new StringBuilder();
        buffer.append(targetClassName);
         buffer.append("(");
        if (id != null) {
            buffer.append((new StringBuilder()).append("\"").append(id)
                    .append("\"").toString());
        } else {
            buffer.append("no-id-yet");
        }
        buffer.append(")");
        return buffer.toString();
    }

}

This application-defined identity is working fine on all other models which do not involve inheritance.

This is one of the actual models that we intend to store in our datastore:

@PersistenceCapable(detachable="true")
@Inheritance(strategy=InheritanceStrategy.COMPLETE_TABLE)
public class Ticket implements Entity {

    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.UNSPECIFIED, column="_id")
    protected Key key;

    protected String date;
    protected int qty;

    public Ticket() {
        this.qty = 0;
    }

    public Key getKey() {
        return key;
    }

    @Override
    public void setKey(Key key) {
        this.key = key;
    }

    public double getQty() {
        return qty;
    }

    public void setQty(double qty) {
        this.qty = (int) qty;
    }

    public String getDate() {
        return date;
    }

    public void setDate(String date) {
        this.date = date;
    }

    @Override
    public int hashCode() {
        final int prime = 31;
        int result = 1;
        result = prime * result + ((key == null) ? 0 : key.hashCode());
        return result;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj)
            return true;
        if (obj == null)
            return false;
        if (getClass() != obj.getClass())
            return false;
        Ticket other = (Ticket) obj;
        if (key == null) {
            if (other.key != null)
                return false;
        } else if (!key.equals(other.key))
            return false;
        return true;
    }

    @Override
    public String toString() {
        return "Ticket [key=" + key + ", date=" + date + ", qty="
                + qty + "]";
    }

}

And this is its subclass (all the models involved in this problem have just one superclass and only one child per superclass):

@PersistenceCapable(detachable="true")
@Inheritance(strategy=InheritanceStrategy.COMPLETE_TABLE)
public class HourTicket extends Ticket implements HourEntity {

    private String hour;

    public HourTicket() {
        super();
    }

    public Key getKey() {
        return key;
    }

    @Override
    public void setKey(Key key) {
        this.key = key;
    }

    public String getHour() {
        return hour;
    }

    public void setHour(String hour) {
        this.hour = hour;
    }

    @Override
    public int hashCode() {
        final int prime = 31;
        int result = 1;
        result = prime * result + ((key == null) ? 0 : key.hashCode());
        return result;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj)
            return true;
        if (obj == null)
            return false;
        if (getClass() != obj.getClass())
            return false;
        HourTicket other = (HourTicket) obj;
        if (key == null) {
            if (other.key != null)
                return false;
        } else if (!key.equals(other.key))
            return false;
        return true;
    }

    @Override
    public String toString() {
        return "HourTicket [key=" + key + ", date=" + date
                + ", hour=" + hour + ", qty=" + qty + "]";
    }

}

Finally, the persistence.xml is like this:

<?xml version="1.0" encoding="UTF-8" ?>
<persistence xmlns="http://java.sun.com/xml/ns/persistence"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/persistence
        http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd" version="2.0">

    <!-- JOSAdmin "unit" -->
    <persistence-unit name="ourdatastore">
        <class>mx.ourdomain.Ticket</class>
        <class>mx.ourdomain.HourTicket</class>
        <exclude-unlisted-classes/>

    </persistence-unit>
</persistence>

And package-mongo.orm

<?xml version="1.0"?>
<!DOCTYPE orm SYSTEM "file:/javax/jdo/orm.dtd">
<orm>
    <package name="mx.ourdomain" >
        <class name="Ticket" table="Ticket">
            <field name="key" primary-key="true" >
                <column name="_id" length="100" />
            </field >
        </class>

        <class name="HourTicket" table="HourTicket">
            <primary-key >
                <column name="_id" target="_id" />
            </primary-key>
        </class>
     </package>
</orm>

So, the problem comes when trying to perform any read or write operations using either the superclass or the subclass. This has happened with exactly the same results in several (all possible, as far as we know) scenarios, but the test scenario we are studying begins with this call:

Ticket ticket = persistenceManager.getObjectById(Ticket.class, key);

The key is generated with a standard procedure which is used by other models that do store and read successfully; and of course, it is of the previously shown Key class.

We've tried to redefine the Key class as a DatastoreId and to use a KeyTranslator, with no success; the furthest we got was to perform a successful read, but when mapping the values into the model object, the String returned by the translator was being cast to our Key class, which of course resulted in a ClassCastException.

We tried to use a String from the translator since the spectrum of data types that the Mongo driver can natively support is very narrow, and it includes String; the String representation of our key is the actual desired value to use within Mongo (which works successfully in any other classes which do not involve inheritance), and the DataNucleus docs state (http://www.datanucleus.org/products/datanucleus/jdo/mapping.html#application_identity --> Application Identity : Accessing objects by Identity):

If you are using your own PK class then the mykey value is the toString() form of the identity of your PK class. 

If this means that we have to explicitly call the toString method of our Key class, I think it is a bit unclear, since the method accepts Object and works fine with the Key object itself in all classes without inheritance.

So, I have the feeling that this might be a bug in datanucleus-mongodb (or just some misconfiguration in our project), because in datanucleus-mongodb v5.1.0-release, class MongoDBUtils, method getClassNameForIdentity(Object, AbstractClassMetaData, ExecutionContext, ClassLoaderResolver), there is the following code:

...
BasicDBObject query = new BasicDBObject();
if (rootCmd.getIdentityType() == IdentityType.DATASTORE)
{
    ...
}
else if (rootCmd.getIdentityType() == IdentityType.APPLICATION)
{
    if (IdentityUtils.isSingleFieldIdentity(id))
    {
        Object key = IdentityUtils.getTargetKeyForSingleFieldIdentity(id);   // <-- HERE
        int[] pkNums = rootCmd.getPKMemberPositions();
        AbstractMemberMetaData pkMmd = rootCmd.getMetaDataForManagedMemberAtAbsolutePosition(pkNums[0]);
        String pkPropName = table.getMemberColumnMappingForMember(pkMmd).getColumn(0).getName();
        query.put(pkPropName, key);   // <-- AND MAYBE ALSO HERE
    }
    ...
}

The Mongo Java driver supports very few data types, which of course excludes our Key class, so I think that DataNucleus should put the toString() form of the key object above into the query to prevent a CodecConfigurationException, which is what is breaking our code.

So, should I refactor most queries to explicitly use the toString of the key, are we missing some configuration, and/or should we contribute a method to perform this data type mapping (calling toString when the key class is not in this list: http://mongodb.github.io/mongo-java-driver/3.7/bson/documents/)?

Thanks!


Application identity ("custom keys") might be breaking queries for inheritance classes

Manuel Castillo <contact@...>
 

Hello, I'm having trouble when performing any kind of query on collections mapped to classes with inheritance.

I'm using DataNucleus JDO with MongoDB. The key class I'm using is like this:
public final class Key implements Serializable {
    static final long serialVersionUID = -448150158203091507L;
    public final String targetClassName;
    public final String id;
    public final String toString;
    public final int hashCode;

    public Key() {
        targetClassName = null;
        id = null;
        toString = null;
        hashCode = -1;
    }

    public Key(String str) {
        String[] parts = str.split("\\(");
        parts[1] = parts[1].replaceAll("\\)", " ");
        parts[1] = parts[1].replace("\"", " ");
        parts[1] = parts[1].trim();
        this.targetClassName = parts[0];
        this.id = parts[1];
        toString = this.toString();
        hashCode = this.hashCode();
    }

    public Key(String classCollectionName, String id) {
        if (StringUtils.isEmpty(classCollectionName)) {
            throw new IllegalArgumentException("No collection/class name specified.");
        }
        if (id == null) {
            throw new IllegalArgumentException("ID cannot be null");
        }
        targetClassName = classCollectionName;
        this.id = id;
        toString = this.toString();
        hashCode = this.hashCode();
    }

    public String getTargetClassName() {
        return targetClassName;
    }

    public int hashCode() {
        if(hashCode != -1) return hashCode; 
        int prime = 31;
        int result = 1;
        result = prime * result + (id != null ? id.hashCode() : 0);
        result = prime * result + (targetClassName != null ? targetClassName.hashCode() : 0);
        return result;
    }

    public boolean equals(Object object) {
        if (object instanceof Key) {
            Key key = (Key) object;
            if (this == key)
                return true;
            return targetClassName.equals(key.targetClassName) && Objects.equals(id, key.id);
        } else {
            return false;
        }
    }

    public String toString() {
        if(toString != null) return toString;
        StringBuilder buffer = new StringBuilder();
        buffer.append(targetClassName);
         buffer.append("(");
        if (id != null) {
            buffer.append((new StringBuilder()).append("\"").append(id)
                    .append("\"").toString());
        } else {
            buffer.append("no-id-yet");
        }
        buffer.append(")");
        return buffer.toString();
    }

}

This identity model is working fine on all other classes which don't involve inheritance. Examples of the models which are giving trouble follow; the superclass:
@PersistenceCapable(detachable="true")
@Inheritance(strategy=InheritanceStrategy.COMPLETE_TABLE)
public class Ticket implements Entity {

    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.UNSPECIFIED, column="_id")
    protected Key key;

    protected String date;
    protected int qty;

    public Ticket() {
        this.qty = 0;
    }

    public Key getKey() {
        return key;
    }

    @Override
    public void setKey(Key key) {
        this.key = key;
    }

    public double getQty() {
        return qty;
    }

    public void setQty(double qty) {
        this.qty = (int) qty;
    }

    public String getDate() {
        return date;
    }

    public void setDate(String date) {
        this.date = date;
    }

    @Override
    public int hashCode() {
        final int prime = 31;
        int result = 1;
        result = prime * result + ((key == null) ? 0 : key.hashCode());
        return result;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj)
            return true;
        if (obj == null)
            return false;
        if (getClass() != obj.getClass())
            return false;
        Ticket other = (Ticket) obj;
        if (key == null) {
            if (other.key != null)
                return false;
        } else if (!key.equals(other.key))
            return false;
        return true;
    }

    @Override
    public String toString() {
        return "Ticket [key=" + key + ", date=" + date + ", qty="
                + qty + "]";
    }

}
And the subclass:
@PersistenceCapable(detachable="true")
@Inheritance(strategy=InheritanceStrategy.COMPLETE_TABLE)
public class HourTicket extends Ticket implements HourEntity {

    private String hour;

    public HourTicket() {
        super();
    }

    public Key getKey() {
        return key;
    }

    @Override
    public void setKey(Key key) {
        this.key = key;
    }

    public String getHour() {
        return hour;
    }

    public void setHour(String hour) {
        this.hour = hour;
    }

    @Override
    public int hashCode() {
        final int prime = 31;
        int result = 1;
        result = prime * result + ((key == null) ? 0 : key.hashCode());
        return result;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj)
            return true;
        if (obj == null)
            return false;
        if (getClass() != obj.getClass())
            return false;
        HourTicket other = (HourTicket) obj;
        if (key == null) {
            if (other.key != null)
                return false;
        } else if (!key.equals(other.key))
            return false;
        return true;
    }

    @Override
    public String toString() {
        return "HourTicket [key=" + key + ", date=" + date
                + ", hour=" + hour + ", qty=" + qty + "]";
    }

}
Our persistence.xml is like this:
<?xml version="1.0" encoding="UTF-8" ?>
<persistence xmlns="http://java.sun.com/xml/ns/persistence"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/persistence
        http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd" version="2.0">

    <!-- JOSAdmin "unit" -->
    <persistence-unit name="ourdatastore">
        <class>mx.ourdomain.Ticket</class>
        <class>mx.ourdomain.HourTicket</class>
        <exclude-unlisted-classes/>

    </persistence-unit>
</persistence>
And package-mongo.orm is:
<?xml version="1.0"?>
<!DOCTYPE orm SYSTEM "file:/javax/jdo/orm.dtd">
<orm>
    <package name="mx.ourdomain" >
        <class name="Ticket" table="Ticket">
            <field name="key" primary-key="true" >
                <column name="_id" length="100" />
            </field >
        </class>

        <class name="HourTicket" table="HourTicket">
            <primary-key >
                <column name="_id" target="_id" />
            </primary-key>
        </class>
     </package>
</orm>
The problem comes when trying to perform any read or write operations using either the superclass or the subclass. This has happened with exactly the same results in several (all possible, as far as we know) scenarios, but the test scenario we are studying begins with this call:
Ticket ticket = persistenceManager.getObjectById(Ticket.class, key);
I followed the debugger to see if there was some obvious misconfiguration going on. I already tried to use KeyTranslators and DatastoreIds, but every time we got the same result.

I have the feeling that this could be a bug, since there is this piece of code in datanucleus-mongodb 5.1.0-release, in the method getClassNameForIdentity(Object, AbstractClassMetaData, ExecutionContext, ClassLoaderResolver):
BasicDBObject query = new BasicDBObject();
if (rootCmd.getIdentityType() == IdentityType.DATASTORE)
{
    ...
}
else if (rootCmd.getIdentityType() == IdentityType.APPLICATION)
{
    if (IdentityUtils.isSingleFieldIdentity(id))
    {
        Object key = IdentityUtils.getTargetKeyForSingleFieldIdentity(id);   // <-- HERE
        int[] pkNums = rootCmd.getPKMemberPositions();
        AbstractMemberMetaData pkMmd = rootCmd.getMetaDataForManagedMemberAtAbsolutePosition(pkNums[0]);
        String pkPropName = table.getMemberColumnMappingForMember(pkMmd).getColumn(0).getName();
        query.put(pkPropName, key);   // <-- AND MAYBE HERE
    }
    ...
}

The type of key is Object, but the spectrum of data types that Mongo's BasicDBObject supports is very narrow and of course it excludes the Key type we've defined; hence it needs a Codec and throws a CodecConfigurationException, which is what is breaking the code. Shouldn't the key use the toString of the object when its class is not supported by the Mongo driver? (http://mongodb.github.io/mongo-java-driver/3.1/bson/documents/) It is stated in the DataNucleus docs that when using application-managed keys the toString form of it will be used, so... am I wrong, or did I find a bug?

Thanks

I've already checked that


Re: PersistentClassROF#getObject method performs a shallow search so not finding all subclasses

Page bloom
 

I have added a pull request - I created two extra local booleans to help track the history, because the message is constructed well after a null className is returned. These two booleans could probably be condensed into one.


Re: PersistentClassROF#getObject method performs a shallow search so not finding all subclasses

Andy
 
Edited

Yes, make whatever error you see as explicit as possible for whatever situation you have. Since I've not hit these things, the message was likely unspecific.


Re: ClassCastException: EnhancementNucleusContextImpl --> PersistenceNucleusContext

Christopher Mosher <cmosher01@...>
 

Andy,
Thank you for investigating so quickly. And thanks for all your work on Datanucleus.
Regards,
Chris Mosher

On Sat, Apr 7, 2018, 02:41 Andy <andy@...> wrote:
Thx for the report.
See https://github.com/datanucleus/datanucleus-api-jdo/issues/68
It seems that when this CDI handling code was copied across from the JPA plugin it wasn't copied exactly, so it was being applied at enhancement as well as at runtime, when only runtime makes any sense.


Re: ClassCastException: EnhancementNucleusContextImpl --> PersistenceNucleusContext

Andy
 

Thx for the report.
See https://github.com/datanucleus/datanucleus-api-jdo/issues/68
It seems that when this CDI handling code was copied across from the JPA plugin it wasn't copied exactly, so it was being applied at enhancement as well as at runtime, when only runtime makes any sense.


ClassCastException: EnhancementNucleusContextImpl --> PersistenceNucleusContext

cmosher01@...
 

I'm trying to use the Ant task to enhance a class having a field with a custom converter attached, and I'm getting the following class-cast exception. Any ideas what might be causing it, or how to work around it?
I'd be glad to create a minimized test-case if needed.

[ant:enhance] 21:06:42.254 [main] DEBUG DataNucleus.Enhancer - Enhancing classes
[ant:enhance] 21:06:42.255 [main] DEBUG DataNucleus.MetaData - MetaData Management : Loading Metadata for metadata files "[/home/user/dev/github_cmosher01/Genealdb/build/resources/main/META-INF/package.jdo]" ...
[ant:enhance] 21:06:42.259 [main] DEBUG DataNucleus.MetaData - Parsing MetaData file "file:/home/user/dev/github_cmosher01/Genealdb/build/resources/main/META-INF/package.jdo" using handler "jdo" (validation="false")
[ant:enhance] 21:06:42.275 [main] ERROR DataNucleus.MetaData - An error occurred while parsing <"field"> nested within "org.datanucleus.metadata.ClassMetaData@3c9bfddc [nu.mine.mosher.genealdb.model.Day], modifier=persistence-capable, members.size=0" for URI "http://xmlns.jcp.org/xml/ns/jdo/jdo"
[ant:enhance] java.lang.ClassCastException: org.datanucleus.enhancer.EnhancementNucleusContextImpl cannot be cast to org.datanucleus.PersistenceNucleusContext
[ant:enhance]   at org.datanucleus.api.jdo.metadata.JDOMetaDataHandler.newFieldObject(JDOMetaDataHandler.java:300) ~[datanucleus-api-jdo-5.1.5.jar:?]
[ant:enhance]   at org.datanucleus.api.jdo.metadata.JDOMetaDataHandler.startElement(JDOMetaDataHandler.java:677) [datanucleus-api-jdo-5.1.5.jar:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.startElement(AbstractSAXParser.java:509) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.parsers.AbstractXMLDocumentParser.emptyElement(AbstractXMLDocumentParser.java:183) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.scanStartElement(XMLNSDocumentScannerImpl.java:351) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:2706) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:601) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:112) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:531) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:885) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:821) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:639) [?:?]
[ant:enhance]   at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl.parse(SAXParserImpl.java:323) [?:?]
[ant:enhance]   at javax.xml.parsers.SAXParser.parse(SAXParser.java:196) [?:?]
[ant:enhance]   at org.datanucleus.metadata.xml.MetaDataParser.parseMetaDataStream(MetaDataParser.java:290) [datanucleus-core-5.1.8.jar:?]
[ant:enhance]   at org.datanucleus.metadata.xml.MetaDataParser.parseMetaDataURL(MetaDataParser.java:190) [datanucleus-core-5.1.8.jar:?]
[ant:enhance]   at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.parseFile(JDOMetaDataManager.java:253) [datanucleus-api-jdo-5.1.5.jar:?]
[ant:enhance]   at org.datanucleus.metadata.MetaDataManagerImpl.loadFiles(MetaDataManagerImpl.java:1425) [datanucleus-core-5.1.8.jar:?]
[ant:enhance]   at org.datanucleus.metadata.MetaDataManagerImpl.loadMetadataFiles(MetaDataManagerImpl.java:527) [datanucleus-core-5.1.8.jar:?]
[ant:enhance]   at org.datanucleus.enhancer.DataNucleusEnhancer.getFileMetadataForInput(DataNucleusEnhancer.java:732) [datanucleus-core-5.1.8.jar:?]
[ant:enhance]   at org.datanucleus.enhancer.DataNucleusEnhancer.enhance(DataNucleusEnhancer.java:501) [datanucleus-core-5.1.8.jar:?]
[ant:enhance]   at org.datanucleus.enhancer.DataNucleusEnhancer.main(DataNucleusEnhancer.java:1157) [datanucleus-core-5.1.8.jar:?]

Thank you,
Chris Mosher
