NOTE: _Copies of this document may be made for your own use and for distribution to others, provided that you do not charge any fee for such copies and further provided that each copy contains this Copyright Notice, whether distributed in print or electronically._
First, you need to add a dependency on the module. Using Maven, this is done by adding a dependency to your pom:
.Example Maven pom.xml with spring-data-mongodb-cross-store dependency
====
[source,xml]
----
<project ...>

  ...

  <dependencies>
    <dependency>
      <groupId>org.springframework.data</groupId>
      <artifactId>spring-data-mongodb-cross-store</artifactId>
      <version>...</version>
    </dependency>
  </dependencies>

</project>
----
====
Once this is done, we need to enable AspectJ for the project. The cross-store support is implemented using AspectJ aspects, so by enabling compile-time AspectJ support the cross-store features become available to your project. In Maven you would add an additional plugin to the `<build>` section of the pom:
.Example Maven pom.xml with AspectJ plugin enabled
====
[source,xml]
----
<project ...>

  ...

  <build>
    <plugins>
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>aspectj-maven-plugin</artifactId>
        ...
      </plugin>
    </plugins>
  </build>

</project>
----
====
Finally, you need to configure your project to use MongoDB and also configure the aspects that are used. The following XML snippet should be added to your application context:
.Example application context with MongoDB and cross-store aspect support
====
[source,xml]
----
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
  xmlns:mongo="http://www.springframework.org/schema/data/mongo" ...>

  <!-- Mongo, MongoTemplate and cross-store aspect configuration omitted -->

</beans>
----
====
[[mongodb_cross-store-application]]
== Writing the Cross Store Application
We are assuming that you have a working JPA application, so we will only cover the additional steps needed to persist part of your Entity in your Mongo database. First you need to identify the field you want persisted. It should be a domain class and follow the general rules for the Mongo mapping support covered in previous chapters. The field you want persisted in MongoDB should be annotated using the `@RelatedDocument` annotation. That is really all you need to do! The cross-store aspects take care of the rest: marking the field with `@Transient` so it won't be persisted using JPA, keeping track of any changes made to the field value and writing them to the database on successful transaction completion, and loading the document from MongoDB the first time the value is used in your application. Here is an example of a simple Entity that has a field annotated with `@RelatedDocument`.
.Example of Entity with @RelatedDocument
====
[source,java]
----
@Entity
public class Customer {

  @Id
  @GeneratedValue(strategy = GenerationType.IDENTITY)
  private Long id;

  private String firstName;
  private String lastName;

  @RelatedDocument
  private SurveyInfo surveyInfo;

// getters and setters omitted
}
----
====
.Example of domain class to be stored as document
====
[source,java]
----
public class SurveyInfo {

  private Map<String, String> questionsAndAnswers;

  public SurveyInfo() {
    this.questionsAndAnswers = new HashMap<String, String>();
  }

  public SurveyInfo(Map<String, String> questionsAndAnswers) {
    this.questionsAndAnswers = questionsAndAnswers;
  }

  public Map<String, String> getQuestionsAndAnswers() {
    return questionsAndAnswers;
  }

  public void setQuestionsAndAnswers(Map<String, String> questionsAndAnswers) {
    this.questionsAndAnswers = questionsAndAnswers;
  }

  public SurveyInfo addQuestionAndAnswer(String question, String answer) {
    this.questionsAndAnswers.put(question, answer);
    return this;
}
}
----
====
Once the SurveyInfo has been set on the Customer object, the MongoTemplate configured above is used to save the SurveyInfo, along with some metadata about the JPA Entity, in a MongoDB collection named after the fully qualified name of the JPA Entity class. The following code:
.Example of code using the JPA Entity configured for cross-store persistence
====
[source,java]
----
Customer customer = new Customer();
customer.setFirstName("Sven");
customer.setLastName("Olafsen");
SurveyInfo surveyInfo = new SurveyInfo()
  .addQuestionAndAnswer("age", "22")
  .addQuestionAndAnswer("married", "Yes")
  .addQuestionAndAnswer("citizenship", "Norwegian");
customer.setSurveyInfo(surveyInfo);
customerRepository.save(customer);
----
====
Executing the code above results in the following JSON document stored in MongoDB.
.Example of JSON document stored in MongoDB
====
[source,javascript]
----
{ "_id" : ObjectId( "4d9e8b6e3c55287f87d4b79e" ),
  "_entity_id" : 1,
  "_entity_class" : "org.springframework.data.mongodb.examples.custsvc.domain.Customer",
  "_entity_field_name" : "surveyInfo",
  "questionsAndAnswers" : { "married" : "Yes",
    "age" : "22",
    "citizenship" : "Norwegian" },
  "_entity_field_class" : "org.springframework.data.mongodb.examples.custsvc.domain.SurveyInfo" }
----
====
Spring's Mongo namespace enables you to easily enable JMX functionality:
.XML schema to configure MongoDB
====
[source,xml]
----
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
  xmlns:mongo="http://www.springframework.org/schema/data/mongo" ...>

  <!-- Default bean name is 'mongo' -->
  <mongo:mongo host="localhost" port="27017"/>

  <!-- by default look for a Mongo object named 'mongo' -->
  <mongo:jmx/>

</beans>
----
====
You can configure the `MongoMappingConverter` as well as `com.mongodb.Mongo` and `MongoTemplate` either using Java or XML based metadata. Here is an example using Spring's Java based configuration:
.@Configuration class to configure MongoDB mapping support
====
[source,java]
----
@Configuration
public class GeoSpatialAppConfig extends AbstractMongoConfiguration {

  @Bean
  public Mongo mongo() throws Exception {
    return new Mongo("localhost");
  }

  @Override
  public String getDatabaseName() {
    return "database";
  }

  @Override
  public String getMappingBasePackage() {
    return "com.bigbank.domain";
}
}
----
====
`AbstractMongoConfiguration` requires you to implement methods that define a `com.mongodb.Mongo` as well as provide a database name. `AbstractMongoConfiguration` also has a method you can override named `getMappingBasePackage` which tells the converter where to scan for classes annotated with the `@org.springframework.data.mongodb.core.mapping.Document` annotation.
Spring's MongoDB namespace enables you to easily enable mapping functionality in XML:
.XML schema to configure MongoDB mapping support
====
[source,xml]
----
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
  xmlns:mongo="http://www.springframework.org/schema/data/mongo" ...>

  <!-- Mongo configuration omitted -->

  <mongo:mapping-converter base-package="com.bigbank.domain"/>

</beans>
----
====
The `base-package` property tells it where to scan for classes annotated with the `@org.springframework.data.mongodb.core.mapping.Document` annotation.
To take full advantage of the object mapping functionality inside the Spring Data/MongoDB support, you should annotate your mapped objects with the `@org.springframework.data.mongodb.core.mapping.Document` annotation. Although it is not necessary for the mapping framework to have this annotation (your POJOs will be mapped correctly, even without any annotations), it allows the classpath scanner to find and pre-process your domain objects to extract the necessary metadata. If you don't use this annotation, your application will take a slight performance hit the first time you store a domain object because the mapping framework needs to build up its internal metadata model so it knows about the properties of your domain object and how to persist them.
.Example domain object
====
[source,java]
----
package com.mycompany.domain;

@Document
public class Person {

  @Id
  private ObjectId id;

  @Indexed
  private Integer ssn;

  private String firstName;

  @Indexed
private String lastName;
}
----
====
IMPORTANT: The `@Id` annotation tells the mapper which property you want to use for the MongoDB `_id` property and the `@Indexed` annotation tells the mapping framework to call `ensureIndex` on that property of your document, making searches faster.
IMPORTANT: Automatic index creation is only done for types annotated with `@Document`.
[[mapping-usage-annotations]]
=== Mapping annotation overview
The `MappingMongoConverter` can use metadata to drive the mapping of objects to documents. An overview of the annotations is provided below:
* `@Id` - applied at the field level to mark the field used for identity purpose.
* `@Document` - applied at the class level to indicate this class is a candidate for mapping to the database. You can specify the name of the collection where the document will be stored.
* `@DBRef` - applied at the field level to indicate it is to be stored using a `com.mongodb.DBRef`.
* `@Indexed` - applied at the field level to describe how to index the field.
Here's an example that creates a compound index of `lastName` in ascending order and `age` in descending order:
.Example Compound Index Usage
====
[source,java]
----
package com.mycompany.domain;

@Document
@CompoundIndexes({
    @CompoundIndex(name = "age_idx", def = "{'lastName': 1, 'age': -1}")
})
public class Person {

  @Id
  private ObjectId id;
  private Integer age;
  private String firstName;
  private String lastName;
}
----
====
[[mapping-usage-indexes.text-index]]
=== Text Indexes
Creating a text index allows accumulating several fields into a searchable full-text index. It is only possible to have one text index per collection, so all fields marked with `@TextIndexed` are combined into this index. Properties can be weighted to influence the document score for ranking results. The default language for the text index is English; to change the default language, set `@Document(language="spanish")` to any language you want. Using a property called `language` or annotated with `@Language` allows defining a language override on a per-document basis.
.Example Text Index Usage
====
[source,java]
----
@Document(language = "spanish")
class SomeEntity {

    @TextIndexed String foo;

    @TextIndexed(weight = 5) String bar;

    String notIndexed;

    Nested nested;
}

class Nested {

    @TextIndexed(weight = 10) String customer;

String roo;
}
----
====
[[mapping-usage-references]]
=== Using DBRefs
Here's an example of using a DBRef to refer to a specific document that exists independently of the object in which it is referenced (both classes are shown in-line for brevity's sake):
====
[source,java]
----
@Document
public class Account {

  @Id
  private ObjectId id;
  private Float total;
}

@Document
public class Person {

  @Id
  private ObjectId id;

  @Indexed
  private Integer ssn;

  @DBRef
private List<Account> accounts;
}
----
====
There's no need to use something like `@OneToMany` because the mapping framework sees that there is a List of objects and infers a one-to-many relationship. When the object is stored in MongoDB, there will be a list of DBRefs rather than the `Account` objects themselves.
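For illustration, a `Person` document persisted this way holds references of roughly the following shape in its `accounts` array rather than embedded account documents (the ids and collection name here are illustrative):

```javascript
{ "_id" : ObjectId("55f736544b7d7e4b7e6c6ff0"),
  "ssn" : 123456789,
  "accounts" : [ { "$ref" : "account", "$id" : ObjectId("55f736544b7d7e4b7e6c6fef") } ] }
```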
IMPORTANT: The mapping framework does not handle cascading saves. If you change an `Account` object that is referenced by a `Person` object, you must save the Account object separately. Calling `save` on the `Person` object will not automatically save the `Account` objects in the property `accounts`.
To access domain entities stored in MongoDB you can leverage our sophisticated repository support that eases implementing them quite significantly. To do so, simply create an interface for your repository:
.Sample Person entity
====
[source,java]
----
public class Person {

  @Id
  private ObjectId id;
  private String firstname;
  private String lastname;

// … getters and setters omitted
}
----
====
We have a quite simple domain object here. Note that it has a property named `id` of type `ObjectId`. The default serialization mechanism used in `MongoTemplate` (which is backing the repository support) regards properties named `id` as the document id. Currently we support `String`, `ObjectId` and `BigInteger` as id-types.
.Basic repository interface to persist Person entities
====
[source,java]
----
public interface PersonRepository extends PagingAndSortingRepository<Person, Long> {
// additional custom finder methods go here
}
----
====
Right now this interface simply serves typing purposes, but we will add additional methods to it later. In your Spring configuration, simply add:
.General MongoDB repository Spring configuration
====
[source,xml]
----
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
  xmlns:mongo="http://www.springframework.org/schema/data/mongo" ...>

  <mongo:mongo id="mongo" />

  <bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
    <constructor-arg ref="mongo" />
    <constructor-arg value="databaseName" />
  </bean>

  <mongo:repositories base-package="com.acme.*.repositories" />

</beans>
----
====
This namespace element causes the base packages to be scanned for interfaces extending `MongoRepository`, and a Spring bean is created for each one found. By default the repositories get a `MongoTemplate` Spring bean wired that is called `mongoTemplate`, so you only need to configure `mongo-template-ref` explicitly if you deviate from this convention.
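For instance, if your template bean has a different name, the `mongo-template-ref` attribute points the repositories at it (the bean and package names here are illustrative):

```xml
<mongo:repositories base-package="com.acme.*.repositories"
                    mongo-template-ref="anotherMongoTemplate" />
```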
If you would rather use JavaConfig, use the `@EnableMongoRepositories` annotation. The annotation carries the very same attributes as the namespace element. If no base package is configured, the infrastructure will scan the package of the annotated configuration class.
.JavaConfig for repositories
====
[source,java]
----
@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoConfiguration {

  @Override
  protected String getDatabaseName() {
    return "e-store";
  }

  @Override
  public Mongo mongo() throws Exception {
    return new Mongo();
  }

  @Override
  protected String getMappingBasePackage() {
    return "com.oreilly.springdata.mongodb";
}
}
----
====
As our domain repository extends `PagingAndSortingRepository` it provides you with CRUD operations as well as methods for paginated and sorted access to the entities. Working with the repository instance is just a matter of dependency injecting it into a client. So accessing the second page of `Person`s at a page size of 10 would simply look something like this:
.Paging access to Person entities
====
[source,java]
----
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
public class PersonRepositoryTests {

    @Autowired PersonRepository repository;

    @Test
    public void readsFirstPageCorrectly() {

      Page<Person> persons = repository.findAll(new PageRequest(0, 10));
      assertThat(persons.isFirstPage(), is(true));
}
}
----
====
The sample creates an application context with Spring's unit test support which will perform annotation based dependency injection into test cases. Inside the test method we simply use the repository to query the datastore. We hand the repository a `PageRequest` instance that requests the first page of persons at a page size of 10.
Most of the data access operations you usually trigger on a repository result in a query being executed against the MongoDB databases. Defining such a query is just a matter of declaring a method on the repository interface:
.PersonRepository with query methods
====
[source,java]
----
public interface PersonRepository extends PagingAndSortingRepository<Person, String> {

    List<Person> findByLastname(String lastname);

    Page<Person> findByFirstname(String firstname, Pageable pageable);

Person findByShippingAddresses(Address address);
}
----
====
The first method shows a query for all people with the given lastname. The query will be derived by parsing the method name for constraints, which can be concatenated with `And` and `Or`. Thus the method name will result in a query expression of `{"lastname" : lastname}`. The second example shows how pagination is applied to a query. Just equip your method signature with a `Pageable` parameter and let the method return a `Page` instance and we will automatically page the query accordingly. The third example shows that you can query based on properties which are not a primitive type.
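To make the derivation concrete, here is a deliberately simplified sketch of how such a method name could be translated into a query expression. This is an illustration only, not Spring Data's actual parser, which additionally handles `Or`, nested properties and many comparison keywords:

```java
// Simplified sketch: derive a MongoDB query expression from a finder method name.
// Only handles simple properties concatenated with "And".
public class QueryDerivationSketch {

    public static String derive(String methodName) {
        String criteria = methodName.replaceFirst("^findBy", "");
        StringBuilder json = new StringBuilder("{");
        String[] parts = criteria.split("And");
        for (int i = 0; i < parts.length; i++) {
            if (i > 0) {
                json.append(", ");
            }
            // de-capitalize the constraint to obtain the property name
            String property = Character.toLowerCase(parts[i].charAt(0)) + parts[i].substring(1);
            json.append("\"").append(property).append("\" : ").append(property);
        }
        return json.append("}").toString();
    }

    public static void main(String[] args) {
        System.out.println(derive("findByLastname"));
        // → {"lastname" : lastname}
        System.out.println(derive("findByFirstnameAndLastname"));
        // → {"firstname" : firstname, "lastname" : lastname}
    }
}
```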
NOTE: Note that for version 1.0 we currently don't support referring to parameters that are mapped as `DBRef` in the domain class.
Using the return type `List` will retrieve and return all matching documents before actually deleting them. A numeric return type directly removes the matching documents, returning the total number of documents removed.
As you've just seen, there are a few keywords triggering geo-spatial operations within a MongoDB query. The `Near` keyword allows some further modification. Let's have a look at some examples:
.Advanced `Near` queries
====
[source,java]
----
public interface PersonRepository extends MongoRepository<Person, String> {

  // { 'location' : { '$near' : [point.x, point.y], '$maxDistance' : distance } }
  List<Person> findByLocationNear(Point location, Distance distance);
}
----
====
Adding a `Distance` parameter to the query method allows restricting results to those within the given distance. If the `Distance` was set up containing a `Metric`, we will transparently use `$nearSphere` instead of `$near`.
.Using `Distance` with `Metrics`
====
[source,java]
----
Point point = new Point(43.7, 48.8);
Distance distance = new Distance(200, Metrics.KILOMETERS);
// {'location' : {'$nearSphere' : [43.7, 48.8], '$maxDistance' : 0.03135711885774796}}
List<Person> persons = repository.findByLocationNear(point, distance);
----
====
As you can see, using a `Distance` equipped with a `Metric` causes a `$nearSphere` clause to be added instead of a plain `$near`. Beyond that, the actual distance gets calculated according to the `Metric` used.
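The normalization itself is simple arithmetic: with a `Metric` in play, the maximum distance handed to `$nearSphere` is expressed in radians, i.e. the distance divided by the earth radius the metric is based on. A small sketch (`6378.137` is the kilometer-based earth radius underlying `Metrics.KILOMETERS`):

```java
// Sketch of how a Metric-based Distance is normalized to the radians value
// that ends up in the $nearSphere / $maxDistance clause.
public class DistanceNormalization {

    // Earth radius in kilometers, as used by Metrics.KILOMETERS
    static final double EARTH_RADIUS_KM = 6378.137;

    public static double toRadians(double kilometers) {
        return kilometers / EARTH_RADIUS_KM;
    }

    public static void main(String[] args) {
        // new Distance(200, Metrics.KILOMETERS) corresponds to roughly 0.0314 radians
        System.out.println(toRadians(200));
    }
}
```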
An example of using Java based bean metadata to register an instance of a `com.mongodb.Mongo` is shown below
.Registering a com.mongodb.Mongo object using Java based bean metadata
====
[source,java]
----
@Configuration
public class AppConfig {

  /*
   * Use the standard Mongo driver API to create a com.mongodb.Mongo instance.
   */
  public @Bean Mongo mongo() throws UnknownHostException {
    return new Mongo("localhost");
}
}
----
====
This approach allows you to use the standard `com.mongodb.Mongo` API that you may already be used to, but it also pollutes the code with the `UnknownHostException` checked exception. The use of the checked exception is not desirable, as Java based bean metadata uses methods as a means to set object dependencies, making the calling code cluttered.
An example of Java based bean metadata that supports exception translation on `@Repository` annotated classes is shown below:
.Registering a com.mongodb.Mongo object using Spring's MongoFactoryBean and enabling Spring's exception translation support
====
[source,java]
----
@Configuration
public class AppConfig {

  /*
   * Factory bean that creates the com.mongodb.Mongo instance
   */
  public @Bean MongoFactoryBean mongo() {
    MongoFactoryBean mongo = new MongoFactoryBean();
    mongo.setHost("localhost");
    return mongo;
}
}
----
====
To access the `com.mongodb.Mongo` object created by the `MongoFactoryBean` in other `@Configuration` or your own classes, use a "`private @Autowired Mongo mongo;`" field.
To use the Mongo namespace elements you will need to reference the Mongo schema:
.XML schema to configure MongoDB
====
[source,xml]
----
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
  xmlns:mongo="http://www.springframework.org/schema/data/mongo" ...>

  <!-- Default bean name is 'mongo' -->
  <mongo:mongo host="localhost" port="27017"/>

</beans>
----
====
A more advanced configuration with `MongoOptions` is shown below (note these are not recommended values):
.XML schema to configure a com.mongodb.Mongo object with MongoOptions
====
[source,xml]
----
<beans>

  <mongo:mongo host="localhost" port="27017">
    <mongo:options connections-per-host="8"
                   threads-allowed-to-block-for-connection-multiplier="4"
                   connect-timeout="1000"
                   max-wait-time="1500"
                   auto-connect-retry="true"
                   socket-keep-alive="true"
                   socket-timeout="1500"
                   slave-ok="true"
                   write-number="1"
                   write-timeout="0"
                   write-fsync="true"/>
  </mongo:mongo>
</beans>
----
====
A configuration using replica sets is created by setting the `replica-set` attribute of the `<mongo:mongo/>` element to a comma-separated list of `host:port` values, e.g. `replica-set="127.0.0.1:27017,localhost:27018"`.
Activating auditing functionality is just a matter of adding the Spring Data Mongo `auditing` namespace element to your configuration. Since Spring Data MongoDB 1.4, auditing can also be enabled by annotating a configuration class with the `@EnableMongoAuditing` annotation.
.Activating auditing using JavaConfig
====
[source,java]
----
@Configuration
@EnableMongoAuditing
class Config {

  @Bean
  public AuditorAware<AuditableUser> myAuditorProvider() {
      return new AuditorAwareImpl();
}
}
----
====
If you expose a bean of type `AuditorAware` to the `ApplicationContext`, the auditing infrastructure will pick it up automatically and use it to determine the current user to be set on domain types. If you have multiple implementations registered in the `ApplicationContext`, you can select the one to be used by explicitly setting the `auditorAwareRef` attribute of `@EnableMongoAuditing`.
You can use Java to create and register an instance of `MongoTemplate` as shown below.
.Registering a com.mongodb.Mongo object and enabling Spring's exception translation support
====
[source,java]
----
@Configuration
public class AppConfig {

  public @Bean Mongo mongo() throws Exception {
    return new Mongo("localhost");
  }

  public @Bean MongoTemplate mongoTemplate() throws Exception {
    return new MongoTemplate(mongo(), "mydatabase");
}
}
----
====
There are several overloaded constructors of `MongoTemplate`. These are:

* `MongoTemplate(Mongo mongo, String databaseName)` - takes the `com.mongodb.Mongo` object and the default database name to operate against.
* `MongoTemplate(Mongo mongo, String databaseName, UserCredentials userCredentials)` - adds the username and password for authenticating with the database.
* `MongoTemplate(MongoDbFactory mongoDbFactory)` - takes a `MongoDbFactory` object that encapsulates the `com.mongodb.Mongo` object, database name, and username and password.
* `MongoTemplate(MongoDbFactory mongoDbFactory, MongoConverter mongoConverter)` - adds a `MongoConverter` to use for mapping.
To achieve that, the `MappingMongoConverter` uses a `MongoTypeMapper` abstraction with `DefaultMongoTypeMapper` as its main implementation. Its default behaviour is to store the fully qualified classname under `_class` inside the document, for the top-level document as well as for every value that is a complex type and a subtype of the declared property type.
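For example, consider a `Sample` class with a `value` property whose declared type is the supertype `Contact` but which actually holds a `Person` (a subtype of `Contact`). The persisted document then looks roughly like this (the package names are illustrative):

```javascript
{ "_class" : "com.acme.Sample",
  "value" : { "_class" : "com.acme.Person" } }
```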
As you can see, we store the type information for the actual root class persisted as well as for the nested type, as it is complex and a subtype of `Contact`. So if you are now using `mongoTemplate.findAll(Object.class, "sample")`, we are able to find out that the document stored shall be a `Sample` instance. We are also able to find out that the value property shall actually be a `Person`.
In case you want to avoid writing the entire Java class name as type information, but would rather use a key, you can use the `@TypeAlias` annotation on the entity class being persisted. If you need to customize the mapping even more, have a look at the `TypeInformationMapper` interface. An instance of that interface can be configured on the `DefaultMongoTypeMapper`, which can in turn be configured on `MappingMongoConverter`.
.Defining a TypeAlias for an Entity
====
[source,java]
----
@TypeAlias("pers")
class Person {

}
----
====
Note that the resulting document will contain `"pers"` as the value in the `_class` field.
The following example demonstrates how to configure a custom `MongoTypeMapper` in `MappingMongoConverter`.
.Configuring a custom MongoTypeMapper via Spring Java Config
====
[source,java]
----
class CustomMongoTypeMapper extends DefaultMongoTypeMapper {
//implement custom type mapping here
}
----
====
[source,java]
----
@Configuration
class SampleMongoConfiguration extends AbstractMongoConfiguration {

  @Override
  protected String getDatabaseName() {
    return "database";
  }

  @Override
  public Mongo mongo() throws Exception {
    return new Mongo();
  }

  @Bean
  @Override
  public MappingMongoConverter mappingMongoConverter() throws Exception {
    MappingMongoConverter mmc = super.mappingMongoConverter();
    mmc.setTypeMapper(customTypeMapper());
    return mmc;
  }

  @Bean
  public MongoTypeMapper customTypeMapper() {
    return new CustomMongoTypeMapper();
  }
}
----
Note that we are extending the `AbstractMongoConfiguration` class and overriding the bean definition of the `MappingMongoConverter`, where we configure our custom `MongoTypeMapper`.
Here is a basic example of using the save operation and retrieving its contents.
.Inserting and retrieving documents using the MongoTemplate
====
[source,java]
----
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;

Person p = new Person("Bob", 33);
mongoTemplate.insert(p);

Person qp = mongoTemplate.findOne(query(where("age").is(33)), Person.class);
----
====
The insert/save operations available to you are listed below.
For updates we can elect to update the first document found using `MongoOperation`'s method `updateFirst` or we can update all documents that were found to match the query using the method `updateMulti`. Here is an example of an update of all SAVINGS accounts where we are adding a one time $50.00 bonus to the balance using the `$inc` operator.
[source,java]
----
mongoTemplate.updateMulti(new Query(where("accounts.accountType").is(Account.Type.SAVINGS)),
  new Update().inc("accounts.$.balance", 50.00), Account.class);
----
In addition to the `Query` discussed above we provide the update definition using an `Update` object. The `Update` class has methods that match the update modifiers available for MongoDB.
You can express your queries using the `Query` and `Criteria` classes, which have method names that mirror the native MongoDB operator names such as `lt`, `lte`, `is`, and others. The `Query` and `Criteria` classes follow a fluent API style so that you can easily chain together multiple method criteria and queries while having easy-to-understand code. Static imports in Java are used to help remove the need to see the 'new' keyword for creating `Query` and `Criteria` instances, so as to improve readability. If you would like to create `Query` instances from a plain JSON String, use `BasicQuery`.
.Creating a Query instance from a plain JSON String
====
[source,java]
----
BasicQuery query = new BasicQuery("{ age : { $lt : 50 }, accounts.balance : { $gt : 1000.00 }}");
List<Person> result = mongoTemplate.find(query, Person.class);
----
====
GeoSpatial queries are also supported and are described more in the section <<mongo.geospatial,GeoSpatial Queries>>.
We saw how to retrieve a single document using the `findOne` and `findById` methods on `MongoTemplate` in previous sections; these return a single domain object. We can also query for a collection of documents to be returned as a list of domain objects. Assuming that we have a number of `Person` objects with name and age stored as documents in a collection, and that each person has an embedded account document with a balance, we can now run a query using the following code.
==== Querying for documents using the MongoTemplate
All find methods take a `Query` object as a parameter. This object defines the criteria and options used to perform the query. The criteria is specified using a `Criteria` object that has a static factory method named `where` used to instantiate a new `Criteria` object. We recommend using a static import for `org.springframework.data.mongodb.core.query.Criteria.where` and `Query.query` to make the query more readable.
Projection expressions are used to define the fields that are the outcome of a particular aggregation step. Projection expressions can be defined via the `project` method of the `Aggregation` class.
.Projection expression examples
====
[source,java]
----
project("name", "netPrice") // will generate {$project: {name: 1, netPrice: 1}}
project().and("foo").as("bar") // will generate {$project: {bar: $foo}}
project("a","b").and("foo").as("bar") // will generate {$project: {a: 1, b: 1, bar: $foo}}
----
====
Note that more examples for project operations can be found in the `AggregationTests` class.
The following examples demonstrate the usage patterns for the MongoDB Aggregation Framework with Spring Data MongoDB.
[[mongo.aggregation.examples.example1]]
.Aggregation Framework Example 1
In this introductory example we want to aggregate a list of tags to get the occurrence count of a particular tag from a MongoDB collection called `"tags"` sorted by the occurrence count in descending order. This example demonstrates the usage of grouping, sorting, projections (selection) and unwinding (result splitting).
Note that the input collection is explicitly specified as the `"tags"` parameter to the `aggregate` method. If the name of the input collection is not specified explicitly, it is derived from the input class passed as the first parameter to the `newAggregation` method.
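A pipeline along those lines can be sketched as follows; the `TagCount` result class and the count field name `n` are illustrative assumptions, and static imports of `Aggregation.*` and `Sort.Direction.DESC` are presumed:

```java
Aggregation agg = newAggregation(
    project("tags"),                             // select the tags array
    unwind("tags"),                              // split each document into one per tag
    group("tags").count().as("n"),               // count occurrences per tag
    project("n").and("tag").previousOperation(), // expose the group key as "tag"
    sort(DESC, "n")                              // most frequent tags first
);
AggregationResults<TagCount> results = mongoTemplate.aggregate(agg, "tags", TagCount.class);
List<TagCount> tagCount = results.getMappedResults();
```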
[[mongo.aggregation.examples.example2]]
.Aggregation Framework Example 2
This example is based on the http://docs.mongodb.org/manual/tutorial/aggregation-examples/#largest-and-smallest-cities-by-state[Largest and Smallest Cities by State] example from the MongoDB Aggregation Framework documentation. We added additional sorting to produce stable results with different MongoDB versions. Here we want to return the smallest and largest cities by population for each state, using the aggregation framework. This example demonstrates the usage of grouping, sorting and projections (selection).
Note that we derive the name of the input collection from the `ZipInfo` class passed as the first parameter to the `newAggregation` method.
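Under those assumptions, the pipeline might look like the following sketch; the `ZipInfoStats` result class and its field names are illustrative, and static imports of `Aggregation.*`, `Fields.bind` and `Sort.Direction.ASC` are presumed:

```java
TypedAggregation<ZipInfo> agg = newAggregation(ZipInfo.class,
    group("state", "city").sum("population").as("pop"),
    sort(ASC, "pop", "state", "city"),           // stable ordering across MongoDB versions
    group("state")
        .last("city").as("biggestCity")
        .last("pop").as("biggestPop")
        .first("city").as("smallestCity")
        .first("pop").as("smallestPop"),
    project()
        .and("state").previousOperation()
        .and("biggestCity").nested(bind("name", "biggestCity").and("population", "biggestPop"))
        .and("smallestCity").nested(bind("name", "smallestCity").and("population", "smallestPop")),
    sort(ASC, "state")
);
AggregationResults<ZipInfoStats> result = mongoTemplate.aggregate(agg, ZipInfoStats.class);
```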
[[mongo.aggregation.examples.example3]]
.Aggregation Framework Example 3
This example is based on the http://docs.mongodb.org/manual/tutorial/aggregation-examples/#states-with-populations-over-10-million[States with Populations Over 10 Million] example from the MongoDB Aggregation Framework documentation. We added additional sorting to produce stable results with different MongoDB versions. Here we want to return all states with a population greater than 10 million, using the aggregation framework. This example demonstrates the usage of grouping, sorting and matching (filtering).
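A sketch under those assumptions (the `StateStats` result class is illustrative, with static imports of `Aggregation.*`, `Criteria.where` and `Sort.Direction.ASC`):

```java
TypedAggregation<ZipInfo> agg = newAggregation(ZipInfo.class,
    group("state").sum("population").as("totalPop"), // total population per state
    sort(ASC, previousOperation(), "totalPop"),      // stable ordering across MongoDB versions
    match(where("totalPop").gte(10 * 1000 * 1000))   // keep states above 10 million
);
AggregationResults<StateStats> result = mongoTemplate.aggregate(agg, StateStats.class);
```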
It's time to look at some code examples showing how to use the `MongoTemplate`. First we look at creating our first collection.
.Working with collections using the MongoTemplate
====
[source,java]
----
DBCollection collection = null;
if (!mongoTemplate.getCollectionNames().contains("MyNewCollection")) {
    // create the collection only if it does not yet exist
    collection = mongoTemplate.createCollection("MyNewCollection");
}

mongoTemplate.dropCollection("MyNewCollection");
----
====
* *getCollectionNames* Returns a set of collection names.
* *collectionExists* Checks whether a collection with a given name exists.
To intercept an object before it goes through the conversion process (which turns your domain object into a `com.mongodb.DBObject`), you'd register a subclass of `AbstractMongoEventListener` that overrides the `onBeforeConvert` method. When the event is dispatched, your listener will be called and passed the domain object before it goes into the converter.
====
[source,java]
----
public class BeforeConvertListener extends AbstractMongoEventListener<Person> {
  @Override
  public void onBeforeConvert(Person p) {
    // ... audit the object, set timestamps, etc. ...
  }
}
----
====
To intercept an object before it goes into the database, you'd register a subclass of `org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener` that overrides the `onBeforeSave` method. When the event is dispatched, your listener will be called and passed the domain object and the converted `com.mongodb.DBObject`.
====
[source,java]
----
public class BeforeSaveListener extends AbstractMongoEventListener<Person> {
  @Override
  public void onBeforeSave(Person p, DBObject dbo) {
    // ... change values in the DBObject, remove them, etc. ...
  }
}
----
====
Simply declaring these beans in your Spring ApplicationContext will cause them to be invoked whenever the event is dispatched.
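For example, with XML configuration this can be as simple as the following sketch (the package name `com.example` is an assumption):

```xml
<!-- the listener is picked up automatically once it is a bean in the context -->
<bean class="com.example.BeforeSaveListener" />
```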
MongoDB supports storing binary files inside its filesystem, GridFS. Spring Data MongoDB provides a `GridFsOperations` interface as well as the corresponding implementation, `GridFsTemplate`, to easily interact with the filesystem. You can set up a `GridFsTemplate` instance by handing it a `MongoDbFactory` as well as a `MongoConverter`:
.JavaConfig setup for a GridFsTemplate
====
[source,java]
----
class GridFsConfiguration extends AbstractMongoConfiguration {

  // ... further configuration omitted ...

  @Bean
  public GridFsTemplate gridFsTemplate() {
    return new GridFsTemplate(mongoDbFactory(), mappingMongoConverter());
  }
}
----
====
The corresponding XML configuration looks like this:
.XML configuration for a GridFsTemplate
====
[source,xml]
----
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xmlns:mongo="http://www.springframework.org/schema/data/mongo"
  xsi:schemaLocation="http://www.springframework.org/schema/data/mongo
                      http://www.springframework.org/schema/data/mongo/spring-mongo.xsd
                      http://www.springframework.org/schema/beans
                      http://www.springframework.org/schema/beans/spring-beans.xsd">

  <mongo:db-factory id="mongoDbFactory" dbname="database" />
  <mongo:mapping-converter id="converter" />

  <bean class="org.springframework.data.mongodb.gridfs.GridFsTemplate">
    <constructor-arg ref="mongoDbFactory" />
    <constructor-arg ref="converter" />
  </bean>

</beans>
----
====
The template can now be injected and used to perform storage and retrieval operations.
.Using GridFsTemplate to store files
====
[source,java]
----
class GridFsClient {

  @Autowired
  GridFsOperations operations;

  @Test
  public void storeFileToGridFs() {

    FileMetadata metadata = new FileMetadata();
    // populate the metadata object as needed
    operations.store(new FileInputStream("filename.txt"), "filename.txt", metadata);
  }
}
----
====
The `store(…)` operations take an `InputStream`, a filename, and optionally metadata information about the file to store. The metadata can be an arbitrary object, which will be marshalled by the `MongoConverter` configured with the `GridFsTemplate`. Alternatively, you can provide a `DBObject` directly.
Reading files from the filesystem can be achieved either through the `find(…)` or the `getResources(…)` methods. Let's have a look at the `find(…)` methods first. You can find either a single file matching a `Query` or multiple ones. To easily define file queries, we provide the `GridFsCriteria` helper class. It provides static factory methods to encapsulate default metadata fields (e.g. `whereFilename()`, `whereContentType()`) or custom ones through `whereMetaData()`.
.Using GridFsTemplate to query for files
====
[source,java]
----
class GridFsClient {

  @Autowired
  GridFsOperations operations;

  @Test
  public void findFilesInGridFs() {
    List<GridFSDBFile> result = operations.find(query(whereFilename().is("filename.txt")));
  }
}
----
====
NOTE: Currently MongoDB does not support defining sort criteria when retrieving files from GridFS. Thus any sort criteria defined on the `Query` instance handed into the `find(…)` method will be disregarded.
The other option to read files from GridFS is to use the methods introduced by the `ResourcePatternResolver` interface. They allow handing an Ant path into the method and thus retrieving files matching the given pattern.
.Using GridFsTemplate to read files
====
[source,java]
----
class GridFsClient {

  @Autowired
  GridFsOperations operations;

  @Test
  public void readFilesFromGridFs() {
    GridFsResource[] txtFiles = operations.getResources("*.txt");
  }
}
----
====
As `GridFsOperations` extends `ResourcePatternResolver`, the `GridFsTemplate` can, for example, be plugged into an `ApplicationContext` to read Spring configuration files from MongoDB.