3 Understanding the GORM API - Reference Documentation
Authors: Graeme Rocher
Version: 5.0.8.RELEASE
Introduction
The GORM Datastore API is split into a low-level API that implementors need to implement for each individual datastore, and a set of higher-level APIs that enhance domain classes with the features regular users see, such as dynamic finders, criteria queries and so on. The low-level API classes are found in the grails-datastore-core subproject, whilst the higher-level APIs used to enhance domain classes are found in grails-datastore-gorm. In this section we will discuss the low-level API.
3.1 Datastore Basics
The MappingContext
The org.grails.datastore.mapping.model.MappingContext interface is used to obtain metadata about the classes that are configured for persistence. There are org.grails.datastore.mapping.model.PersistentEntity and org.grails.datastore.mapping.model.PersistentProperty interfaces that represent a class and its properties respectively. These can be obtained and introspected via the MappingContext.

There are various concrete implementations of the MappingContext interface, such as:
- DocumentMappingContext - Used for document stores, subclassed by MongoMappingContext
- JpaMappingContext - Used for JPA
- KeyValueMappingContext - Used by key/value stores
Implementing a custom MappingContext may be useful because it allows users to configure how a class is mapped to the underlying datastore using GORM's mapping block, as well as allowing registration of custom type converters and so on. The implementation for Neo4j looks like this:

class Neo4jMappingContext extends AbstractMappingContext {

    MappingFactory<Collection, Attribute> mappingFactory
    MappingConfigurationStrategy syntaxStrategy

    Neo4jMappingContext() {
        mappingFactory = new GraphGormMappingFactory()
        syntaxStrategy = new GormMappingConfigurationStrategy(mappingFactory)
        //addTypeConverter(new StringToNumberConverterFactory().getConverter(BigDecimal))
        addTypeConverter(new StringToShortConverter())
        addTypeConverter(new StringToBigIntegerConverter())
        …
    }

    @Override
    protected PersistentEntity createPersistentEntity(Class javaClass) {
        GraphPersistentEntity persistentEntity = new GraphPersistentEntity(javaClass, this)
        mappingFactory.createMappedForm(persistentEntity) // populates mappingFactory.entityToPropertyMap as a side effect
        persistentEntity
    }

    MappingConfigurationStrategy getMappingSyntaxStrategy() {
        syntaxStrategy
    }

    MappingFactory getMappingFactory() {
        mappingFactory
    }
}
Notice how the Neo4j implementation supplies a custom GraphGormMappingFactory and GraphPersistentEntity to allow the domain class configuration to be changed for a given Neo4j Node.

The Datastore Interface
The org.grails.datastore.mapping.core.Datastore interface is the equivalent of a SQL DataSource, whereby it provides the necessary capability to create a connection. In most cases one can simply subclass the AbstractDatastore super class and implement the createSession method. The following implementation is from the SimpleMapDatastore, which implements GORM on top of a ConcurrentHashMap:

@Override
protected Session createSession(PropertyResolver connDetails) {
    return new SimpleMapSession(this, getMappingContext(), getApplicationEventPublisher());
}
The implementation for MongoDB looks similar:

@Override
protected Session createSession(PropertyResolver connDetails) {
    return new MongoSession(this, getMappingContext(), getApplicationEventPublisher(), false);
}
The Datastore also has a reference to the MappingContext discussed in the previous section.
The Session Interface
The org.grails.datastore.mapping.core.Session interface represents an active connection. It can be either stateful or stateless, depending on the implementation. For example, for an embedded database where there is no network connection a stateful session is not particularly useful, but for a datastore that creates network connections you may want to cache returned instances to reduce load.

The AbstractSession class provides some support for creating stateful sessions; if you prefer a stateless implementation then simply implement Session or subclass AbstractAttributeStoringSession.

In general, if you subclass AbstractSession the minimum you need to do is implement the createPersister method:

protected Persister createPersister(Class cls, MappingContext mappingContext) {
    PersistentEntity entity = mappingContext.getPersistentEntity(cls.getName());
    if (entity == null) {
        return null;
    }
    return new SimpleMapEntityPersister(mappingContext, entity, this,
            (SimpleMapDatastore) getDatastore(), publisher);
}
This is taken from the SimpleMapSession implementation, which creates a SimpleMapEntityPersister instance and returns it. Returning null indicates that the class cannot be persisted, and an exception will be thrown.
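The caching behaviour a stateful session provides can be sketched in plain Java. This is an illustrative toy, not the AbstractSession implementation; the ToyStatefulSession name and its load-by-id API are invented for the example. Retrieved instances are cached by id so repeat reads within one session avoid hitting the datastore:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Toy stateful session: caches instances retrieved from the datastore so that
// repeated retrieves of the same id within one session hit the cache, not the store.
class ToyStatefulSession {
    private final Map<Object, Object> firstLevelCache = new HashMap<>();
    private final Function<Object, Object> loader; // simulates a datastore read
    int loadCount = 0; // how many times the "datastore" was actually hit

    ToyStatefulSession(Function<Object, Object> loader) {
        this.loader = loader;
    }

    Object retrieve(Object id) {
        return firstLevelCache.computeIfAbsent(id, key -> {
            loadCount++;
            return loader.apply(key);
        });
    }
}

public class ToySessionDemo {
    public static void main(String[] args) {
        ToyStatefulSession session = new ToyStatefulSession(id -> "entity-" + id);
        session.retrieve(1L);
        session.retrieve(1L); // second read served from the session cache
        System.out.println(session.loadCount); // prints 1: the store was hit once
    }
}
```

A stateless session would simply skip the cache and call the loader every time, which is the trade-off the text describes for embedded databases.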
3.2 Implementing CRUD
The EntityPersister Interface
The EntityPersister interface is used to implement the basic Create, Read, Update and Delete (CRUD) operations. There are individual methods to implement, such as persistEntity, updateEntity, deleteEntity and so on.

In many cases there is a representation of an entity in its "native" form as supplied by the datastore driver. For example in Cassandra this could be a ColumnFamily, or in MongoDB a DBCollection. To support such cases there is an abstract NativeEntryEntityPersister<T, K> super class that provides the basis for an implementation that maps a native entry, such as a MongoDB DBObject or a Neo4j Node, to a persistent entity and back again.

The two generic types of this superclass indicate the native entry type (for example DBObject in MongoDB) and the native key type (for example ObjectId in MongoDB). The MongoDB implementation looks like this:

public class MongoEntityPersister extends NativeEntryEntityPersister<DBObject, Object>
Here Object is used for the key, since MongoDB also supports Long- and String-based identifiers. The key methods that need implementing are defined below:
- getEntityFamily() - Defines the name of the entity group or family. This could be a database table, a Cassandra column family or a MongoDB collection.
- T createNewEntry(String family) - Creates a native entry ready to be inserted.
- Object getEntryValue(T nativeEntry, String property) - Retrieves a value of the entry and returns its Java object form. For example, a "date" property stored as a String in the datastore would need to be returned as a java.util.Date at this point.
- setEntryValue(T nativeEntry, String key, Object value) - Sets a value of the native entry, converting any Java objects to the required native format.
- deleteEntry(String family, K key, Object entry) - Deletes an entry for the given family, native key and entry.
- T retrieveEntry(PersistentEntity persistentEntity, String family, Serializable key) - Retrieves a native entry for the given entity, family and key.
- K storeEntry(PersistentEntity persistentEntity, EntityAccess entityAccess, K storeId, T nativeEntry) - Stores a native entry for the given id.
- updateEntry(PersistentEntity persistentEntity, EntityAccess entityAccess, K key, T entry) - Updates an entry.
- K generateIdentifier(PersistentEntity persistentEntity, T entry) - Generates an identifier for the given native entry.
- PropertyValueIndexer getPropertyIndexer(PersistentProperty property) - If the datastore requires manual indexing you'll need to implement a PropertyIndexer, otherwise return null.
- AssociationIndexer getAssociationIndexer(T nativeEntry, Association association) - If the datastore requires manual indexing you'll need to implement an AssociationIndexer, otherwise return null.
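To make the contract concrete, here is a minimal map-backed sketch of the native-entry pattern. This is an illustrative toy, not the NativeEntryEntityPersister API itself; the ToyMapPersister name and its simplified signatures are invented. The native entry type T is a Map<String, Object>, the key type K is a Long, and a "date" value stored natively as a String is converted back to a java.util.Date on read, as described for getEntryValue above:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

// Toy persister where the "native entry" is a Map and the native key is a Long.
class ToyMapPersister {
    private final Map<Long, Map<String, Object>> family = new ConcurrentHashMap<>();
    private final AtomicLong identifiers = new AtomicLong();
    private static final SimpleDateFormat FORMAT = new SimpleDateFormat("yyyy-MM-dd");

    Map<String, Object> createNewEntry() { return new HashMap<>(); }

    // Converts Java objects to the "native" format: dates are stored as Strings here
    void setEntryValue(Map<String, Object> entry, String key, Object value) {
        entry.put(key, value instanceof Date ? FORMAT.format((Date) value) : value);
    }

    // Converts native values back to Java objects: "date" comes back as java.util.Date
    Object getEntryValue(Map<String, Object> entry, String property) {
        Object value = entry.get(property);
        if ("date".equals(property) && value instanceof String) {
            try { return FORMAT.parse((String) value); }
            catch (ParseException e) { throw new IllegalStateException(e); }
        }
        return value;
    }

    Long generateIdentifier() { return identifiers.incrementAndGet(); }

    Long storeEntry(Long storeId, Map<String, Object> entry) {
        family.put(storeId, entry);
        return storeId;
    }

    Map<String, Object> retrieveEntry(Long key) { return family.get(key); }

    void deleteEntry(Long key) { family.remove(key); }
}

public class ToyMapPersisterDemo {
    public static void main(String[] args) {
        ToyMapPersister persister = new ToyMapPersister();
        Map<String, Object> entry = persister.createNewEntry();
        persister.setEntryValue(entry, "title", "The Stand");
        persister.setEntryValue(entry, "date", new Date(0));
        Long id = persister.storeEntry(persister.generateIdentifier(), entry);
        System.out.println(persister.getEntryValue(persister.retrieveEntry(id), "title")); // prints The Stand
    }
}
```

The real superclass adds considerably more (event publishing, association handling, optimistic locking), but the type-conversion boundary between Java objects and the native format is the core idea.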
Create
The createNewEntry method is used to create a native record that will be inserted into the datastore. In MongoDB this is a DBObject, whilst in the implementation for ConcurrentHashMap it is another Map:

@Override
protected DBObject createNewEntry(String family) {
    return new BasicDBObject();
}
Read
The retrieveEntry method is used to retrieve a native record for a given key:

protected DBObject retrieveEntry(final PersistentEntity persistentEntity,
        String family, final Serializable key) {
    return mongoTemplate.execute(new DbCallback<DBObject>() {
        public DBObject doInDB(DB con) throws MongoException, DataAccessException {
            DBCollection dbCollection = con.getCollection(getCollectionName(persistentEntity));
            return dbCollection.findOne(key);
        }
    });
}

This is the MongoDB implementation, which uses a Spring Data MongoTemplate to find a DBObject for the given key. There is a separate storeEntry method that is used to actually store the native object. In MongoDB this looks like:

@Override
protected Object storeEntry(final PersistentEntity persistentEntity, final EntityAccess entityAccess,
        final Object storeId, final DBObject nativeEntry) {
    return mongoTemplate.execute(new DbCallback<Object>() {
        public Object doInDB(DB con) throws MongoException, DataAccessException {
            nativeEntry.put(MONGO_ID_FIELD, storeId);
            return storeId;
        }
    });
}
Notice that this implementation doesn't immediately write to the database: in MongoDB the MongoSession implementation overrides the flushPendingInserts method of AbstractSession and performs a batch insert of multiple MongoDB documents (i.e. DBObjects) at once:

collection.insert(dbObjects.toArray(new DBObject[dbObjects.size()]), writeConcernToUse);
Other implementations may instead perform the write in the storeEntry method itself. For example the implementation for ConcurrentHashMap looks like this (note Groovy code):

protected storeEntry(PersistentEntity persistentEntity, EntityAccess entityAccess, storeId, Map nativeEntry) {
    if (!persistentEntity.root) {
        nativeEntry.discriminator = persistentEntity.discriminator
    }
    datastore[family].put(storeId, nativeEntry)
    return storeId
}
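The write-behind batching approach described above can be sketched as follows. This is an illustrative toy (the ToyBatchingSession name and its API are invented, and the "database" is just a map): pending inserts accumulate in memory and become visible only when the session flushes them in one batch, mirroring how MongoSession defers inserts to flushPendingInserts:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy write-behind session: inserts are queued and written in a single batch on flush.
class ToyBatchingSession {
    private final Map<Long, String> store = new HashMap<>(); // the "database"
    private final List<Map.Entry<Long, String>> pendingInserts = new ArrayList<>();

    void persist(Long id, String document) {
        pendingInserts.add(Map.entry(id, document)); // queued, not yet visible in the store
    }

    void flush() {
        for (Map.Entry<Long, String> e : pendingInserts) {
            store.put(e.getKey(), e.getValue());
        }
        pendingInserts.clear();
    }

    int storedCount() { return store.size(); }
}

public class ToyBatchingDemo {
    public static void main(String[] args) {
        ToyBatchingSession session = new ToyBatchingSession();
        session.persist(1L, "{title: 'a'}");
        session.persist(2L, "{title: 'b'}");
        System.out.println(session.storedCount()); // prints 0: nothing written yet
        session.flush();
        System.out.println(session.storedCount()); // prints 2: batch written on flush
    }
}
```

Batching amortises network round-trips, which is why the MongoDB implementation prefers it over writing inside storeEntry.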
Update
The updateEntry method is used to update an entry:

public void updateEntry(final PersistentEntity persistentEntity, final EntityAccess ea,
        final Object key, final DBObject entry) {
    mongoTemplate.execute(new DbCallback<Object>() {
        public Object doInDB(DB con) throws MongoException, DataAccessException {
            String collectionName = getCollectionName(persistentEntity, entry);
            DBCollection dbCollection = con.getCollection(collectionName);
            if (isVersioned(ea)) {
                // TODO this should be done with a CAS approach if possible
                DBObject previous = dbCollection.findOne(key);
                checkVersion(ea, previous, persistentEntity, key);
            }
            DBObject dbo = createDBObjectWithKey(key); // query matching the document's id
            MongoSession mongoSession = (MongoSession) session;
            dbCollection.update(dbo, entry, false, false, mongoSession.getWriteConcern());
            return null;
        }
    });
}
Typically the underlying datastore's native update mechanism is used; in this case the DBCollection's update method.

Delete
The deleteEntry method is used to delete an entry. For example, in the ConcurrentHashMap implementation it is simply removed from the map:

protected void deleteEntry(String family, key, entry) {
    datastore[family].remove(key)
}
Whilst in MongoDB the DBCollection object's remove method is called:

@Override
protected void deleteEntry(String family, final Object key, final Object entry) {
    mongoTemplate.execute(new DbCallback<Object>() {
        public Object doInDB(DB con) throws MongoException, DataAccessException {
            DBCollection dbCollection = getCollection(con);
            MongoSession mongoSession = (MongoSession) session;
            dbCollection.remove(createDBObjectWithKey(key), mongoSession.getWriteConcern());
            return null;
        }

        protected DBCollection getCollection(DB con) {
            return con.getCollection(getCollectionName(getPersistentEntity()));
        }
    });
}
There is also a deleteEntries method which allows for deleting multiple entries in a single operation. The implementation for MongoDB looks like this:

protected void deleteEntries(String family, final List<Object> keys) {
    mongoTemplate.execute(new DbCallback<Object>() {
        public Object doInDB(DB con) throws MongoException, DataAccessException {
            String collectionName = getCollectionName(getPersistentEntity());
            DBCollection dbCollection = con.getCollection(collectionName);
            MongoSession mongoSession = (MongoSession) getSession();
            MongoQuery query = mongoSession.createQuery(getPersistentEntity().getJavaClass());
            query.in(getPersistentEntity().getIdentity().getName(), keys);
            dbCollection.remove(query.getMongoQuery());
            return null;
        }
    });
}
Here a batch delete is executed by querying for all of the given keys with a MongoQuery instance. Note that by implementing an EntityPersister you have enabled basic CRUD operations, but not querying, which is the topic of the following sections. First, however, secondary indices need to be covered, since they are required for querying.
3.3 Secondary Indexing
Many datastores do not support secondary indexing or require you to build your own. In cases like this you will need to implement a PropertyIndexer.

If the underlying datastore supports secondary indexes then it is ok to just return a null PropertyIndexer and let the datastore handle the indexing.
For example, the ConcurrentHashMap implementation creates secondary indices by populating another Map containing the indices:

void index(value, primaryKey) {
    def index = getIndexName(value)
    def indexed = indices[index]
    if (indexed == null) {
        indexed = []
        indices[index] = indexed
    }
    if (!indexed.contains(primaryKey)) {
        indexed << primaryKey
    }
}
The implementation for Redis, by contrast, stores the index in a Redis set:

public void index(final Object value, final Long primaryKey) {
    if (value == null) {
        return;
    }
    final String primaryIndex = createRedisKey(value);
    redisTemplate.sadd(primaryIndex, primaryKey);
}
There is a corresponding query method that needs to be implemented on PropertyIndexer. The ConcurrentHashMap implementation looks like this:

List query(value, int offset, int max) {
    def index = getIndexName(value)
    def indexed = indices[index]
    if (!indexed) {
        return Collections.emptyList()
    }
    return indexed[offset..max]
}
Finally, when an object is deleted its index entries need to be removed too, which is the job of the deindex method:

void deindex(value, primaryKey) {
    def index = getIndexName(value)
    def indexed = indices[index]
    if (indexed) {
        indexed.remove(primaryKey)
    }
}
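The index/query/deindex life-cycle above can be sketched in plain Java. This is an illustrative toy mirroring the ConcurrentHashMap approach, not the PropertyIndexer API; the ToyPropertyIndexer name and signatures are invented. Each indexed value maps to the list of primary keys whose entities carry that value:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Toy secondary index: maps an indexed property value to the primary keys
// of the entities that carry that value.
class ToyPropertyIndexer {
    private final Map<Object, List<Long>> indices = new ConcurrentHashMap<>();

    void index(Object value, Long primaryKey) {
        List<Long> indexed = indices.computeIfAbsent(value, v -> new ArrayList<>());
        if (!indexed.contains(primaryKey)) {
            indexed.add(primaryKey);
        }
    }

    // Returns up to max primary keys for the value, starting at offset
    List<Long> query(Object value, int offset, int max) {
        List<Long> indexed = indices.getOrDefault(value, Collections.emptyList());
        if (offset >= indexed.size()) {
            return Collections.emptyList();
        }
        return indexed.subList(offset, Math.min(indexed.size(), offset + max));
    }

    void deindex(Object value, Long primaryKey) {
        List<Long> indexed = indices.get(value);
        if (indexed != null) {
            indexed.remove(primaryKey);
        }
    }
}

public class ToyIndexerDemo {
    public static void main(String[] args) {
        ToyPropertyIndexer indexer = new ToyPropertyIndexer();
        indexer.index("Stephen King", 1L);
        indexer.index("Stephen King", 2L);
        System.out.println(indexer.query("Stephen King", 0, 10)); // prints [1, 2]
        indexer.deindex("Stephen King", 1L);
        System.out.println(indexer.query("Stephen King", 0, 10)); // prints [2]
    }
}
```

Note that here query treats max as a page size rather than an end index, which is the pagination contract the Query model uses.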
3.4 Implementing Querying
Introduction
The org.grails.datastore.mapping.query.Query abstract class defines the query model, and it is the job of the GORM implementor to translate this query model into an underlying database query. This differs depending on the implementation and may involve:

- Generating a String-based query, such as SQL or JPA-QL
- Creating a query object, such as MongoDB's use of a Document to define queries
- Generating a query for use with manually created secondary indices, as is the case with Redis
The Query object defines the following:

- One or many Criterion that define the criteria to query by.
- Zero or many Projection instances that define what the data you want back will look like.
- Pagination parameters such as max and offset
- Sorting parameters
There is a Criterion subclass for each specific type of query; examples include Equals, Between, Like etc. Depending on the capabilities of the underlying datastore you may implement only a few of these.

There are also many types of Projection, such as SumProjection, MaxProjection and CountProjection. Again, you may implement only a few of these. If the underlying datastore doesn't support calculating, for example, a sum or max of a particular property, there is a ManualProjections class that you can use to perform these operations in memory on the client.
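When projections must be computed client-side, the approach can be sketched as follows. This is an illustrative toy in plain Java, not the ManualProjections class itself; the ToyManualProjections name is invented. The already-fetched query results are simply aggregated in memory:

```java
import java.util.List;
import java.util.Optional;

// Toy client-side projections: aggregate query results in memory when the
// datastore offers no native sum/max/count support.
class ToyManualProjections {
    static int sum(List<Integer> values) {
        return values.stream().mapToInt(Integer::intValue).sum();
    }

    static Optional<Integer> max(List<Integer> values) {
        return values.stream().max(Integer::compareTo);
    }

    static long count(List<Integer> values) {
        return values.size();
    }
}

public class ToyProjectionsDemo {
    public static void main(String[] args) {
        // e.g. the "pages" property of a set of query results
        List<Integer> pages = List.of(310, 150, 870);
        System.out.println(ToyManualProjections.sum(pages));       // prints 1330
        System.out.println(ToyManualProjections.max(pages).get()); // prints 870
        System.out.println(ToyManualProjections.count(pages));     // prints 3
    }
}
```

The obvious cost is that every candidate row must be transferred to the client first, which is why native projections are preferred whenever the datastore supports them.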
Writing a Query implementation is probably the most complex part of implementing a GORM provider, but it starts by subclassing the Query class and implementing the executeQuery method:

public class MongoQuery extends Query implements QueryArgumentsAware {
    ...
}
Using the Query Model
To implement querying you need to understand the Query model. As discussed, a Query contains a list of Criterion; however, the root Criterion could be a conjunction (an AND query) or a disjunction (an OR query). The Query may also contain a combination of regular criteria (=, !=, LIKE etc.) and junctions (AND, OR or NOT). Implementing a Query therefore requires writing a recursive method. The implementation for ConcurrentHashMap looks like this:

Collection executeSubQueryInternal(criteria, criteriaList) {
    SimpleMapResultList resultList = new SimpleMapResultList(this)
    for (Query.Criterion criterion in criteriaList) {
        if (criterion instanceof Query.Junction) {
            resultList.results << executeSubQueryInternal(criterion, criterion.criteria)
        }
        else {
            PersistentProperty property = getValidProperty(criterion)
            def handler = handlers[criterion.getClass()]
            def results = handler?.call(criterion, property) ?: []
            resultList.results << results
        }
    }
}
When a Junction is encountered (representing an AND, OR or NOT) the method recurses to handle it; otherwise a handler for the Criterion class is obtained and executed. The handlers map is a map of Criterion class to query handler. The implementation for Equals looks like this:

def handlers = [
    …
    (Query.Equals): { Query.Equals equals, PersistentProperty property ->
        def indexer = entityPersister.getPropertyIndexer(property)
        final value = subqueryIfNecessary(equals)
        return indexer.query(value)
    }
    …
]
This handler queries a manually created secondary index, which is necessary for an implementation (such as ConcurrentHashMap) where the datastore doesn't support secondary indices itself. It may be that instead of manually querying the secondary indices in this way you simply build a String-based or native query. For example in MongoDB this looks like:

queryHandlers.put(Equals.class, new QueryHandler<Equals>() {
    public void handle(PersistentEntity entity, Equals criterion, Document query) {
        String propertyName = getPropertyName(entity, criterion);
        Object value = criterion.getValue();
        PersistentProperty property = entity.getPropertyByName(criterion.getProperty());
        MongoEntityPersister.setDBObjectValue(query, propertyName, value,
                entity.getMappingContext());
    }
});
In this case the criterion is translated directly into the native MongoDB query object. For Gemfire, again, the implementation is different:

queryHandlers.put(Equals.class, new QueryHandler() {
    public int handle(PersistentEntity entity, Criterion criterion, StringBuilder q,
            List params, int index) {
        Equals eq = (Equals) criterion;
        final String name = eq.getProperty();
        validateProperty(entity, name, Equals.class);
        q.append(calculateName(entity, name));
        return appendOrEmbedValue(q, params, index, eq.getValue(), EQUALS);
    }
});
Here a StringBuilder is used to construct an OQL query from the Query model.
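The recursive handling of junctions described in this section can be sketched in plain Java. This is an illustrative toy query model, not the GORM Query classes; all Toy* names are invented. Conjunctions intersect the result sets of their children, disjunctions union them, and leaf criteria are resolved by a handler, here a linear scan:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Set;
import java.util.TreeSet;

// Toy criterion model: a leaf matches ids via a handler; junctions combine children.
interface ToyCriterion {}

class ToyEquals implements ToyCriterion {
    final String property; final Object value;
    ToyEquals(String property, Object value) { this.property = property; this.value = value; }
}

class ToyJunction implements ToyCriterion {
    final boolean conjunction; // true = AND (intersect), false = OR (union)
    final List<ToyCriterion> criteria;
    ToyJunction(boolean conjunction, List<ToyCriterion> criteria) {
        this.conjunction = conjunction; this.criteria = criteria;
    }
}

class ToyQueryEvaluator {
    final Map<Long, Map<String, Object>> entities; // id -> property map (the "datastore")
    ToyQueryEvaluator(Map<Long, Map<String, Object>> entities) { this.entities = entities; }

    // Recursively evaluates a junction, combining child result sets
    Set<Long> execute(ToyJunction junction) {
        Set<Long> result = null;
        for (ToyCriterion criterion : junction.criteria) {
            Set<Long> ids = criterion instanceof ToyJunction
                    ? execute((ToyJunction) criterion)
                    : handleEquals((ToyEquals) criterion);
            if (result == null) result = new TreeSet<>(ids);
            else if (junction.conjunction) result.retainAll(ids); // AND
            else result.addAll(ids);                              // OR
        }
        return result == null ? Collections.emptySet() : result;
    }

    private Set<Long> handleEquals(ToyEquals eq) {
        Set<Long> ids = new TreeSet<>();
        for (Map.Entry<Long, Map<String, Object>> e : entities.entrySet()) {
            if (Objects.equals(e.getValue().get(eq.property), eq.value)) ids.add(e.getKey());
        }
        return ids;
    }
}

public class ToyQueryDemo {
    public static void main(String[] args) {
        Map<Long, Map<String, Object>> data = new HashMap<>();
        data.put(1L, Map.of("author", "King", "pages", 500));
        data.put(2L, Map.of("author", "King", "pages", 300));
        data.put(3L, Map.of("author", "Tolkien", "pages", 500));

        ToyQueryEvaluator evaluator = new ToyQueryEvaluator(data);
        // author == "King" AND pages == 500
        ToyJunction and = new ToyJunction(true, List.of(
                new ToyEquals("author", "King"), new ToyEquals("pages", 500)));
        System.out.println(evaluator.execute(and)); // prints [1]
    }
}
```

A real provider would replace handleEquals with an index lookup or a native query, but the recursive combination of junction results is the same shape regardless of the backend.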
3.5 GORM Enhancer
Once you have implemented the lower-level APIs you can trivially provide a GORM API to a set of Grails domain classes. For example, consider the following simple domain class:

import grails.persistence.*

@Entity
class Book {
    String title
}
The following setup code enables the GORM API for this class:

// create context
def context = new MongoMappingContext(databaseName)
context.addPersistentEntity(Book)

// create datastore
def mongoDatastore = new MongoDatastore(context)
mongoDatastore.afterPropertiesSet()

// enhance
def enhancer = new MongoGormEnhancer(mongoDatastore,
        new DatastoreTransactionManager(datastore: mongoDatastore))
enhancer.enhance()

// use GORM!
def books = Book.list()
The key part that enables the GORM methods (list(), dynamic finders etc.) is the usage of the MongoGormEnhancer. This class subclasses org.grails.datastore.gorm.GormEnhancer and provides some extensions to GORM specific to MongoDB. A subclass is not required, however, and if you don't need any datastore-specific extensions you can just as easily use the regular GormEnhancer:

def enhancer = new GormEnhancer(mongoDatastore,
        new DatastoreTransactionManager(datastore: mongoDatastore))
enhancer.enhance()
3.6 Adding to GORM APIs
By default the GORM compiler will make all GORM entities implement the GormEntity trait, which provides all of the default GORM methods. However, if you want to extend GORM to provide more methods specific to a given datastore you can do so by extending this trait. For example, Neo4j adds methods for Cypher querying:

trait Neo4jEntity<D> extends GormEntity<D> {

    static Result cypherStatic(String queryString, Map params) {
        def session = AbstractDatastore.retrieveSession(Neo4jDatastore)
        def graphDatabaseService = (GraphDatabaseService) session.nativeInterface
        graphDatabaseService.execute(queryString, params)
    }
}
GORM is then told to use this trait instead of the default via a custom TraitProvider:

package org.grails.datastore.gorm.neo4j

import grails.neo4j.Neo4jEntity
import groovy.transform.CompileStatic
import org.grails.compiler.gorm.GormEntityTraitProvider

@CompileStatic
class Neo4jEntityTraitProvider implements GormEntityTraitProvider {
    final Class entityTrait = Neo4jEntity
}
Then add a src/main/resources/META-INF/services/org.grails.compiler.gorm.GormEntityTraitProvider file specifying the name of your trait provider:

org.grails.datastore.gorm.neo4j.Neo4jEntityTraitProvider
GORM will then automatically apply the trait to any domain class found within grails-app/domain or annotated with the Entity annotation, unless Hibernate is on the classpath, in which case you have to tell GORM to map the domain class with Neo4j:

static mapWith = "neo4j"