
DATAMONGO-2509 - Update Reference Documentation.

Related to: DATAMONGO-365, DATAMONGO-384, DATAMONGO-2192, DATAMONGO-2407

Original pull request: #853.
Christoph Strobl authored 6 years ago, committed by Mark Paluch
parent commit 288e04b2d3
  1. README.adoc (5)
  2. src/main/asciidoc/index.adoc (1)
  3. src/main/asciidoc/reference/cross-store.adoc (260)
  4. src/main/asciidoc/reference/mapping.adoc (17)
  5. src/main/asciidoc/reference/mongo-custom-conversions.adoc (6)
  6. src/main/asciidoc/reference/mongo-repositories.adoc (5)
  7. src/main/asciidoc/reference/mongodb.adoc (170)

README.adoc (5 changes)

@@ -52,11 +52,6 @@ public class MyService {
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoClientConfiguration {
@Override
public MongoClient mongoClient() {
return MongoClients.create();
}
@Override
protected String getDatabaseName() {
return "springdata";

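With this change, `AbstractMongoClientConfiguration` supplies a default `MongoClient`, which is why the `mongoClient()` override is dropped from the README snippet. A minimal sketch of the resulting configuration class (class and package names are assumptions for illustration):

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

@Configuration
@EnableMongoRepositories
class ApplicationConfig extends AbstractMongoClientConfiguration {

  // Only the database name remains mandatory; the client defaults to
  // MongoClients.create() unless configureClientSettings(...) is overridden.
  @Override
  protected String getDatabaseName() {
    return "springdata";
  }
}
```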
src/main/asciidoc/index.adoc (1 change)

@@ -32,7 +32,6 @@ include::reference/mongo-auditing.adoc[leveloffset=+1]
include::reference/mapping.adoc[leveloffset=+1]
include::reference/sharding.adoc[leveloffset=+1]
include::reference/kotlin.adoc[leveloffset=+1]
include::reference/cross-store.adoc[leveloffset=+1]
include::reference/jmx.adoc[leveloffset=+1]
include::reference/mongo-3.adoc[leveloffset=+1]

src/main/asciidoc/reference/cross-store.adoc (260 changes)

@@ -1,260 +0,0 @@
[[mongo.cross.store]]
= Cross Store Support
WARNING: This feature has been deprecated and will be removed without replacement.
Sometimes you need to store data in multiple data stores, and these data stores need to be of different types. One might be relational while the other is a document store. For this use case, we created a separate module in the MongoDB support that handles what we call "`cross-store support`". The current implementation is based on JPA as the driver for the relational database and we let select fields in the Entities be stored in a Mongo database. In addition to letting you store your data in two stores, we also coordinate persistence operations for the non-transactional MongoDB store with the transaction life-cycle for the relational database.
[[mongodb_cross-store-configuration]]
== Cross Store Configuration
Assuming that you have a working JPA application and would like to add some cross-store persistence for MongoDB, what do you have to add to your configuration?
First, you need to add a dependency on the cross-store module. If you use Maven, you can add the following dependency to your pom:
.Example Maven pom.xml with `spring-data-mongodb-cross-store` dependency
====
[source,xml]
----
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
...
<!-- Spring Data -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-cross-store</artifactId>
<version>${spring.data.mongo.version}</version>
</dependency>
...
</project>
----
====
Once you have added the dependency, you need to enable AspectJ for the project. The cross-store support is implemented with AspectJ aspects so, if you enable compile-time AspectJ support, the cross-store features become available to your project. In Maven, you would add an additional plugin to the `<build>` section of the pom, as follows:
.Example Maven pom.xml with AspectJ plugin enabled
====
[source,xml]
----
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
...
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>aspectj-maven-plugin</artifactId>
<version>1.0</version>
<dependencies>
<!-- NB: You must use Maven 2.0.9 or above or these are ignored (see MNG-2972) -->
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${aspectj.version}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjtools</artifactId>
<version>${aspectj.version}</version>
</dependency>
</dependencies>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
<configuration>
<outxml>true</outxml>
<aspectLibraries>
<aspectLibrary>
<groupId>org.springframework</groupId>
<artifactId>spring-aspects</artifactId>
</aspectLibrary>
<aspectLibrary>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb-cross-store</artifactId>
</aspectLibrary>
</aspectLibraries>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
...
</plugins>
</build>
...
</project>
----
====
Finally, you need to configure your project to use MongoDB and also configure which aspects are used. You should add the following XML snippet to your application context:
.Example application context with MongoDB and cross-store aspect support
====
[source,xml]
----
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:jdbc="http://www.springframework.org/schema/jdbc"
xmlns:jpa="http://www.springframework.org/schema/data/jpa"
xmlns:mongo="http://www.springframework.org/schema/data/mongo"
xsi:schemaLocation="http://www.springframework.org/schema/data/mongo
https://www.springframework.org/schema/data/mongo/spring-mongo.xsd
http://www.springframework.org/schema/jdbc
https://www.springframework.org/schema/jdbc/spring-jdbc-3.0.xsd
http://www.springframework.org/schema/beans
https://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/data/jpa
https://www.springframework.org/schema/data/jpa/spring-jpa-1.0.xsd">
...
<!-- Mongo config -->
<mongo:mongo-client host="localhost" port="27017"/>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg name="mongoClient" ref="mongoClient"/>
<constructor-arg name="databaseName" value="test"/>
<constructor-arg name="defaultCollectionName" value="cross-store"/>
</bean>
<bean class="org.springframework.data.mongodb.core.MongoExceptionTranslator"/>
<!-- Mongo cross-store aspect config -->
<bean class="org.springframework.data.persistence.document.mongo.MongoDocumentBacking"
factory-method="aspectOf">
<property name="changeSetPersister" ref="mongoChangeSetPersister"/>
</bean>
<bean id="mongoChangeSetPersister"
class="org.springframework.data.persistence.document.mongo.MongoChangeSetPersister">
<property name="mongoTemplate" ref="mongoTemplate"/>
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
...
</beans>
----
====
[[mongodb_cross-store-application]]
== Writing the Cross Store Application
We assume that you have a working JPA application, so we cover only the additional steps needed to persist part of your entity in your Mongo database. To do so, you need to identify the field you want to persist. It should be a domain class and follow the general rules for the Mongo mapping support covered in previous chapters. The field you want to persist in MongoDB should be annotated with the `@RelatedDocument` annotation. That is really all you need to do. The cross-store aspects take care of the rest, including:
* Marking the field with `@Transient` so that it will not be persisted by JPA
* Keeping track of any changes made to the field value and writing them to the database on successful transaction completion
* Loading the document from MongoDB the first time the value is used in your application.
The following example shows an entity that has a field annotated with `@RelatedDocument`:
.Example of Entity with @RelatedDocument
====
[source,java]
----
@Entity
public class Customer {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
private String firstName;
private String lastName;
@RelatedDocument
private SurveyInfo surveyInfo;
// getters and setters omitted
}
----
====
The following example shows a domain class that is to be stored as a `Document`:
.Example of a domain class to be stored as a Document
====
[source,java]
----
public class SurveyInfo {
private Map<String, String> questionsAndAnswers;
public SurveyInfo() {
this.questionsAndAnswers = new HashMap<String, String>();
}
public SurveyInfo(Map<String, String> questionsAndAnswers) {
this.questionsAndAnswers = questionsAndAnswers;
}
public Map<String, String> getQuestionsAndAnswers() {
return questionsAndAnswers;
}
public void setQuestionsAndAnswers(Map<String, String> questionsAndAnswers) {
this.questionsAndAnswers = questionsAndAnswers;
}
public SurveyInfo addQuestionAndAnswer(String question, String answer) {
this.questionsAndAnswers.put(question, answer);
return this;
}
}
----
====
In the preceding example, once the `SurveyInfo` has been set on the `Customer` object, the `MongoTemplate` that was configured previously is used to save the `SurveyInfo` (along with some metadata about the JPA Entity) in a MongoDB collection named after the fully qualified name of the JPA Entity class. The following code shows how to configure a JPA entity for cross-store persistence with MongoDB:
.Example of code using the JPA Entity configured for cross-store persistence
====
[source,java]
----
Customer customer = new Customer();
customer.setFirstName("Sven");
customer.setLastName("Olafsen");
SurveyInfo surveyInfo = new SurveyInfo()
.addQuestionAndAnswer("age", "22")
.addQuestionAndAnswer("married", "Yes")
.addQuestionAndAnswer("citizenship", "Norwegian");
customer.setSurveyInfo(surveyInfo);
customerRepository.save(customer);
----
====
Running the preceding code results in the following JSON document being stored in MongoDB:
.Example of JSON document stored in MongoDB
====
[source,javascript]
----
{ "_id" : ObjectId( "4d9e8b6e3c55287f87d4b79e" ),
"_entity_id" : 1,
"_entity_class" : "org.springframework.data.mongodb.examples.custsvc.domain.Customer",
"_entity_field_name" : "surveyInfo",
"questionsAndAnswers" : { "married" : "Yes",
"age" : "22",
"citizenship" : "Norwegian" },
"_entity_field_class" : "org.springframework.data.mongodb.examples.custsvc.domain.SurveyInfo" }
----
====

src/main/asciidoc/reference/mapping.adoc (17 changes)

@@ -269,15 +269,8 @@ You can configure the `MappingMongoConverter` as well as `com.mongodb.client.Mon
====
[source,java]
----
import com.mongodb.client.MongoClients;
@Configuration
public class GeoSpatialAppConfig extends AbstractMongoClientConfiguration {
@Bean
public MongoClient mongoClient() {
return MongoClients.create("mongodb://localhost:27017");
}
public class MongoConfig extends AbstractMongoClientConfiguration {
@Override
public String getDatabaseName() {
@@ -287,12 +280,12 @@ public class GeoSpatialAppConfig extends AbstractMongoClientConfiguration {
// the following are optional
@Override
public String getMappingBasePackage() {
public String getMappingBasePackage() { <1>
return "com.bigbank.domain";
}
@Override
void configureConverters(MongoConverterConfigurationAdapter adapter) {
void configureConverters(MongoConverterConfigurationAdapter adapter) { <2>
adapter.registerConverter(new org.springframework.data.mongodb.test.PersonReadConverter());
adapter.registerConverter(new org.springframework.data.mongodb.test.PersonWriteConverter());
@@ -304,6 +297,8 @@ public class GeoSpatialAppConfig extends AbstractMongoClientConfiguration {
}
}
----
<1> The mapping base package defines the root path used to scan for entities that pre-initialize the `MappingContext`. By default, the configuration class's package is used.
<2> Configure additional custom converters for specific domain types that replace the default mapping procedure for those types with your custom implementation.
====
`AbstractMongoClientConfiguration` requires you to implement methods that define a `com.mongodb.client.MongoClient` as well as provide a database name. `AbstractMongoClientConfiguration` also has a method named `getMappingBasePackage(…)` that you can override to tell the converter where to scan for classes annotated with the `@Document` annotation.
@@ -390,6 +385,8 @@ public class Person {
IMPORTANT: The `@Id` annotation tells the mapper which property you want to use for the MongoDB `_id` property, and the `@Indexed` annotation tells the mapping framework to call `createIndex(…)` on that property of your document, making searches faster.
Automatic index creation is only done for types annotated with `@Document`.
WARNING: Auto index creation is turned **OFF** by default and needs to be enabled via the configuration (see <<mapping.index-creation>>).
[[mapping.index-creation]]
=== Index Creation

src/main/asciidoc/reference/mongo-custom-conversions.adoc (6 changes)

@@ -94,12 +94,6 @@ class MyMongoConfiguration extends AbstractMongoClientConfiguration {
return "database";
}
@Override
@Bean
public MongoClient mongoClient() {
return MongoClients.create();
}
@Override
protected void configureConverters(MongoConverterConfigurationAdapter adapter) {
adapter.registerConverter(new com.example.PersonReadConverter());

src/main/asciidoc/reference/mongo-repositories.adoc (5 changes)

@@ -64,11 +64,6 @@ class ApplicationConfig extends AbstractMongoClientConfiguration {
return "e-store";
}
@Override
public MongoClient mongoClient() {
return MongoClients.create();
}
@Override
protected String getMappingBasePackage() {
return "com.oreilly.springdata.mongodb";

src/main/asciidoc/reference/mongodb.adoc (170 changes)

@@ -349,17 +349,13 @@ public class ApplicationContextEventTestsAppConfig extends AbstractMongoClientCo
}
@Override
@Bean
public MongoClient mongoClient() {
protected void configureClientSettings(Builder builder) {
MongoClientSettings settings = MongoClientSettings.builder()
builder
.credential(MongoCredential.createCredential("name", "db", "pwd".toCharArray()))
.applyToClusterSettings(settings -> {
settings.hosts(singletonList(new ServerAddress("127.0.0.1", 27017)));
})
.build();
return MongoClients.create(settings);
});
}
}
----
@@ -371,23 +367,6 @@ The following example shows encoded credentials:
`m0ng0@dmin:mo_res:bw6},Qsdxx@admin@database` -> `m0ng0%40dmin:mo_res%3Abw6%7D%2CQsdxx%40admin@database`
See https://tools.ietf.org/html/rfc3986#section-2.2[section 2.2 of RFC 3986] for further details.
[source,java]
----
@Configuration
public class MongoClientConfiguration extends AbstractMongoClientConfiguration {
@Override
protected String getDatabaseName() {
return "database";
}
@Override
public MongoClient mongoClient() {
return MongoClients.create("mongodb://localhost:27017/?replicaSet=rs0&w=majority");
}
}
----
[[mongo.mongo-db-factory-xml]]
=== Registering a `MongoDatabaseFactory` Instance by Using XML-based Metadata
@@ -737,6 +716,25 @@ class Person {
Note that the resulting document contains `pers` as the value in the `_class` Field.
[WARNING]
====
Type aliases only work if the mapping context is aware of the actual type. The required entity metadata is acquired either on first save or has to be provided via the configuration's initial entity set. By default, the configuration base package is scanned for potential candidates.
[source,java]
----
@Configuration
public class AppConfig extends AbstractMongoClientConfiguration {
@Override
protected Set<Class<?>> getInitialEntitySet() {
return Collections.singleton(Person.class);
}
// ...
}
----
====
==== Configuring Custom Type Mapping
The following example shows how to configure a custom `MongoTypeMapper` in `MappingMongoConverter`:
@@ -761,11 +759,6 @@ class SampleMongoConfiguration extends AbstractMongoClientConfiguration {
return "database";
}
@Override
public MongoClient mongoClient() {
return MongoClients.create();
}
@Bean
@Override
public MappingMongoConverter mappingMongoConverter() throws Exception {
@@ -935,7 +928,10 @@ Related to performing an `updateFirst` operation, you can also perform an "`upse
[source]
----
template.upsert(query(where("ssn").is(1111).and("firstName").is("Joe").and("Fraizer").is("Update")), update("address", addr), Person.class);
template.update(Person.class)
.matching(query(where("ssn").is(1111).and("firstName").is("Joe").and("Fraizer").is("Update")))
.apply(update("address", addr))
.upsert();
----
WARNING: `upsert` does not support ordering. Please use <<mongo-template.find-and-upsert, findAndModify>> to apply `Sort`.
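Since `upsert` ignores ordering, a `Sort` has to go through `findAndModify` instead. A hedged sketch using the fluent API shown above (the query and update values are illustrative):

```java
// Sort the candidates so findAndModify picks the oldest matching Person;
// upsert() would silently ignore this ordering.
Query query = query(where("firstName").is("Joe")).with(Sort.by(Sort.Direction.DESC, "age"));
Update update = update("address", addr);

Person modified = template.update(Person.class)
    .matching(query)                                            // Sort is honored here
    .apply(update)
    .withOptions(FindAndModifyOptions.options().returnNew(true))
    .findAndModifyValue();
```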
@@ -960,32 +956,48 @@ The following example inserts a few `Person` objects into the container and perf
[source,java]
----
mongoTemplate.insert(new Person("Tom", 21));
mongoTemplate.insert(new Person("Dick", 22));
mongoTemplate.insert(new Person("Harry", 23));
template.insert(new Person("Tom", 21));
template.insert(new Person("Dick", 22));
template.insert(new Person("Harry", 23));
Query query = new Query(Criteria.where("firstName").is("Harry"));
Update update = new Update().inc("age", 1);
Person p = mongoTemplate.findAndModify(query, update, Person.class); // returns the old person object
assertThat(p.getFirstName(), is("Harry"));
assertThat(p.getAge(), is(23));
p = mongoTemplate.findOne(query, Person.class);
assertThat(p.getAge(), is(24));
Person oldValue = template.update(Person.class)
.matching(query)
.apply(update)
.findAndModifyValue(); // returns the old person object
assertThat(oldValue.getFirstName(), is("Harry"));
assertThat(oldValue.getAge(), is(23));
Person newValue = template.query(Person.class)
.matching(query)
.findOneValue();
assertThat(newValue.getAge(), is(24));
// Now return the newly updated document when updating
p = template.findAndModify(query, update, new FindAndModifyOptions().returnNew(true), Person.class);
assertThat(p.getAge(), is(25));
Person newestValue = template.update(Person.class)
.matching(query)
.apply(update)
.withOptions(FindAndModifyOptions.options().returnNew(true)) // Now return the newly updated document when updating
.findAndModifyValue();
assertThat(newestValue.getAge(), is(25));
----
The `FindAndModifyOptions` method lets you set the options of `returnNew`, `upsert`, and `remove`. An example extending from the previous code snippet follows:
[source,java]
----
Query query2 = new Query(Criteria.where("firstName").is("Mary"));
p = mongoTemplate.findAndModify(query2, update, new FindAndModifyOptions().returnNew(true).upsert(true), Person.class);
assertThat(p.getFirstName(), is("Mary"));
assertThat(p.getAge(), is(1));
Person upserted = template.update(Person.class)
.matching(new Query(Criteria.where("firstName").is("Mary")))
.apply(update)
.withOptions(FindAndModifyOptions.options().upsert(true).returnNew(true))
.findAndModifyValue();
assertThat(upserted.getFirstName(), is("Mary"));
assertThat(upserted.getAge(), is(1));
----
[[mongo-template.aggregation-update]]
@@ -1166,10 +1178,11 @@ Earlier, we saw how to retrieve a single document by using the `findOne` and `fi
import static org.springframework.data.mongodb.core.query.Criteria.where;
import static org.springframework.data.mongodb.core.query.Query.query;
// ...
List<Person> result = mongoTemplate.find(query(where("age").lt(50)
.and("accounts.balance").gt(1000.00d)), Person.class);
List<Person> result = template.query(Person.class)
.matching(query(where("age").lt(50).and("accounts.balance").gt(1000.00d)))
.all();
----
====
@@ -1709,13 +1722,35 @@ db.foo.createIndex(
)
----
A query searching for `coffee cake`, sorted by relevance according to the `weights`, can be defined and executed as follows:
A query searching for `coffee cake` can be defined and executed as follows:
.Full Text Query
====
[source,java]
----
Query query = TextQuery.searching(new TextCriteria().matchingAny("coffee", "cake")).sortByScore();
Query query = TextQuery
.searching(new TextCriteria().matchingAny("coffee", "cake"));
List<Document> page = template.find(query, Document.class);
----
====
To sort results by relevance according to the `weights` use `TextQuery.sortByScore`.
.Full Text Query - Sort by Score
====
[source,java]
----
Query query = TextQuery
.searching(new TextCriteria().matchingAny("coffee", "cake"))
.sortByScore() <1>
.includeScore(); <2>
List<Document> page = template.find(query, Document.class);
----
<1> Use the score property for sorting results by relevance, which triggers `.sort({'score': {'$meta': 'textScore'}})`.
<2> Use `TextQuery.includeScore()` to include the calculated relevance in the resulting `Document`.
====
You can exclude search terms by prefixing the term with `-` or by using `notMatching`, as shown in the following example (note that the two lines have the same effect and are thus redundant):
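The two equivalent forms of exclusion described above can be sketched as follows (search terms are illustrative):

```java
// Exclude "cake" by prefixing the term with "-" ...
Query negatedByPrefix = TextQuery.searching(
    new TextCriteria().matching("coffee").matching("-cake"));

// ... or, equivalently, via notMatching(...).
Query negatedByMethod = TextQuery.searching(
    new TextCriteria().matching("coffee").notMatching("cake"));
```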
@@ -1941,6 +1976,8 @@ JsonSchemaProperty.named("birthdate").ofType(Type.dateType());
JsonSchemaProperty.named("birthdate").with(JsonSchemaObject.of(Type.dateType()).description("Must be a date"));
----
The Schema builder also provides support for https://docs.mongodb.com/manual/core/security-client-side-encryption/[Client-Side Field Level Encryption]. Please refer to <<mongo.jsonSchema.encrypted-fields>> for more information.
`CollectionOptions` provides the entry point to schema support for collections, as the following example shows:
.Create collection with `$jsonSchema`
@@ -2155,11 +2192,9 @@ directly there are several methods for those options.
Query query = query(where("firstname").is("luke"))
.comment("find luke") <1>
.batchSize(100) <2>
.slaveOk(); <3>
----
<1> The comment propagated to the MongoDB profile log.
<2> The number of documents to return in each response batch.
<3> Allows querying a replica slave.
====
On the repository level the `@Meta` annotation provides means to add query options in a declarative way.
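A hedged sketch of the declarative variant, mirroring the comment and batch size from the `Query` example above (the repository interface is hypothetical):

```java
import org.springframework.data.mongodb.repository.Meta;
import org.springframework.data.repository.CrudRepository;

interface PersonRepository extends CrudRepository<Person, String> {

  // Declares the same query options as the imperative example:
  // a profiler-log comment and a cursor batch size of 100.
  @Meta(comment = "find luke", cursorBatchSize = 100)
  List<Person> findByFirstname(String firstname);
}
```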
@@ -2175,6 +2210,35 @@ List<Person> findByFirstname(String firstname);
include::../{spring-data-commons-docs}/query-by-example.adoc[leveloffset=+1]
include::query-by-example.adoc[leveloffset=+1]
[[mongo.query.count]]
== Counting Documents
In versions of Spring Data MongoDB before 3.x, the count operation was executed against MongoDB's internal collection statistics.
With the introduction of <<mongo.transactions>>, this was no longer possible, because statistics do not correctly reflect potential changes during a transaction, requiring an aggregation-based count approach.
So, in 2.x, `MongoOperations.count()` would use the collection statistics if no transaction was in progress, and the aggregation variant if one was.
As of Spring Data MongoDB 3.x, any `count` operation, be it with or without a filter query, uses the aggregation-based count approach via MongoDB's `countDocuments`.
[NOTE]
====
MongoDB's native `countDocuments` method, and the executed `$match` aggregation, do not support `$near` and `$nearSphere` but require `$geoWithin` along with `$center` or `$centerSphere`, which does not support `$minDistance` (see https://jira.mongodb.org/browse/SERVER-37043).
Therefore, a given `Query` is rewritten for `count` operations using `Reactive`-/`MongoTemplate` to bypass the issue, as shown below.
[source,javascript]
----
{ location : { $near : [-73.99171, 40.738868], $maxDistance : 1.1 } } <1>
{ location : { $geoWithin : { $center: [ [-73.99171, 40.738868], 1.1] } } } <2>
{ location : { $near : [-73.99171, 40.738868], $minDistance : 0.1, $maxDistance : 1.1 } } <3>
{$and :[ { $nor :[ { location :{ $geoWithin :{ $center :[ [-73.99171, 40.738868 ], 0.01] } } } ]}, { location :{ $geoWithin :{ $center :[ [-73.99171, 40.738868 ], 1.1] } } } ] } <4>
----
<1> Count source query using `$near`.
<2> Rewritten query now using `$geoWithin` with `$center`.
<3> Count source query using `$near` with `$minDistance` and `$maxDistance`.
<4> Rewritten query, now a combination of `$nor` and `$geoWithin` criteria to work around the unsupported `$minDistance`.
====
[[mongo.mapreduce]]
== Map-Reduce Operations
@@ -2309,7 +2373,7 @@ Note that you can specify additional limit and sort values on the query, but you
====
https://docs.mongodb.com/master/release-notes/4.2-compatibility/[MongoDB 4.2] removed support for the `eval` command used
by `ScriptOperations`. +
There is no replacement for the removed functionallity.
There is no replacement for the removed functionality.
====
MongoDB allows executing JavaScript functions on the server by either directly sending the script or calling a stored one. `ScriptOperations` can be accessed through `MongoTemplate` and provides a basic abstraction for JavaScript usage. The following example shows how to use the `ScriptOperations` class:
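A hedged sketch of direct script execution (requires MongoDB before 4.2, since the underlying `eval` command was removed in 4.2; the script body is illustrative):

```java
// Obtain the script abstraction from the template.
ScriptOperations scriptOps = template.scriptOps();

// Send a JavaScript function to the server and execute it directly
// with the given argument.
ExecutableMongoScript echoScript =
    new ExecutableMongoScript("function(x) { return x; }");
Object result = scriptOps.execute(echoScript, "directly execute script");
```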
