
Polishing.

Refine API naming towards merge/property instead of combine/specify. Tweak documentation. Introduce Resolution.ofValue(…) for easier creation.

See #3870
Original pull request: #3986.
pull/3999/head
Mark Paluch 4 years ago
commit d133ef19dd
No known key found for this signature in database
GPG Key ID: 4406B84C1661DCD1
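
At a glance, a minimal sketch of the renamed entry points introduced by this commit (`property(…)`/`withTypes(…)` replacing `specify(…)`/`types(…)`); the surrounding class and domain types are hypothetical and only mirror the example in mongo-json-schema.adoc further down:

[source,java]
----
import org.bson.Document;
import org.springframework.data.mongodb.core.MongoJsonSchemaCreator;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;

class RenamedApiSketch {

	// hypothetical domain types, mirroring the reference documentation example below
	static class Root { Object value; }
	static class A { String aValue; }
	static class B { String bValue; }

	Document renderSchema() {
		// formerly: specify("value").types(A.class, B.class)
		MongoJsonSchema schema = MongoJsonSchemaCreator.create()
				.property("value").withTypes(A.class, B.class)
				.createSchemaFor(Root.class);
		return schema.schemaDocument();
	}
}
----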
  1. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/MappingMongoJsonSchemaCreator.java (13 lines changed)
  2. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/MongoJsonSchemaCreator.java (21 lines changed)
  3. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/JsonSchemaProperty.java (6 lines changed)
  4. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MergedJsonSchema.java (11 lines changed)
  5. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MergedJsonSchemaProperty.java (10 lines changed)
  6. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MongoJsonSchema.java (91 lines changed)
  7. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/TypeUnifyingMergeFunction.java (41 lines changed)
  8. spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/MappingMongoJsonSchemaCreatorUnitTests.java (87 lines changed)
  9. spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/schema/TypeUnifyingMergeFunctionUnitTests.java (14 lines changed)
  10. src/main/asciidoc/reference/mongo-json-schema.adoc (38 lines changed)

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/MappingMongoJsonSchemaCreator.java (13 lines changed)

@@ -94,13 +94,8 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
}
@Override
public PropertySpecifier specify(String path) {
return new PropertySpecifier() {
@Override
public MongoJsonSchemaCreator types(Class<?>... types) {
return specifyTypesFor(path, types);
}
};
public PropertySpecifier property(String path) {
return types -> withTypesFor(path, types);
}
/**
@@ -111,7 +106,7 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
* @return new instance of {@link MongoJsonSchemaCreator}.
* @since 3.4
*/
public MongoJsonSchemaCreator specifyTypesFor(String path, Class<?>... types) {
public MongoJsonSchemaCreator withTypesFor(String path, Class<?>... types) {
LinkedMultiValueMap<String, Class<?>> clone = mergeProperties.clone();
for (Class<?> type : types) {
@@ -213,7 +208,7 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
}
}
return targetProperties.size() == 1 ? targetProperties.iterator().next()
: JsonSchemaProperty.combined(targetProperties);
: JsonSchemaProperty.merged(targetProperties);
}
}

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/MongoJsonSchemaCreator.java (21 lines changed)

@@ -79,17 +79,17 @@ public interface MongoJsonSchemaCreator {
MongoJsonSchema createSchemaFor(Class<?> type);
/**
* Create a combined {@link MongoJsonSchema} out of the individual schemas of the given types by combining their
* Create a merged {@link MongoJsonSchema} out of the individual schemas of the given types by merging their
* properties into one large {@link MongoJsonSchema schema}.
*
*
* @param types must not be {@literal null} nor contain {@literal null}.
* @return new instance of {@link MongoJsonSchema}.
* @since 3.4
*/
default MongoJsonSchema combineSchemaFor(Class<?>... types) {
default MongoJsonSchema mergedSchemaFor(Class<?>... types) {
MongoJsonSchema[] schemas = Arrays.stream(types).map(this::createSchemaFor).toArray(MongoJsonSchema[]::new);
return MongoJsonSchema.combined(schemas);
return MongoJsonSchema.merge(schemas);
}
/**
@@ -108,32 +108,32 @@ public interface MongoJsonSchemaCreator {
* @return new instance of {@link PropertySpecifier}.
* @since 3.4
*/
PropertySpecifier specify(String path);
PropertySpecifier property(String path);
/**
* The context in which a specific {@link #getProperty()} is encountered during schema creation.
*
*
* @since 3.3
*/
interface JsonSchemaPropertyContext {
/**
* The path to a given field/property in dot notation.
*
*
* @return never {@literal null}.
*/
String getPath();
/**
* The current property.
*
*
* @return never {@literal null}.
*/
MongoPersistentProperty getProperty();
/**
* Obtain the {@link MongoPersistentEntity} for a given property.
*
*
* @param property must not be {@literal null}.
* @param <T>
* @return {@literal null} if the property is not an entity. It is nevertheless recommend to check
@@ -234,7 +234,6 @@ public interface MongoJsonSchemaCreator {
}
/**
* @since 3.4
* @author Christoph Strobl
* @since 3.4
*/
@@ -246,6 +245,6 @@ public interface MongoJsonSchemaCreator {
* @param types must not be {@literal null}.
* @return the source
*/
MongoJsonSchemaCreator types(Class<?>... types);
MongoJsonSchemaCreator withTypes(Class<?>... types);
}
}
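
As a rough usage sketch of the renamed default method above (`mergedSchemaFor(…)` replacing `combineSchemaFor(…)`); the class hierarchy below is a hypothetical stand-in for the A/B/C types used in the unit tests:

[source,java]
----
import org.bson.Document;
import org.springframework.data.mongodb.core.MongoJsonSchemaCreator;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;

class MergedSchemaSketch {

	// hypothetical hierarchy whose properties do not overlap
	static class Base { String baseValue; }
	static class Left extends Base { String leftValue; }
	static class Right extends Base { String rightValue; }

	Document mergedSchema() {
		// formerly: combineSchemaFor(Left.class, Right.class)
		MongoJsonSchema schema = MongoJsonSchemaCreator.create().mergedSchemaFor(Left.class, Right.class);
		return schema.schemaDocument();
	}
}
----

Because the properties of the individual schemas do not clash, no ConflictResolutionFunction is required here.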

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/JsonSchemaProperty.java (6 lines changed)

@@ -236,14 +236,14 @@ public interface JsonSchemaProperty extends JsonSchemaObject {
}
/**
* Combines multiple {@link JsonSchemaProperty} with potentially different attributes into one.
* Merges multiple {@link JsonSchemaProperty} with potentially different attributes into one.
*
* @param properties must not be {@literal null}.
* @return new instance of {@link JsonSchemaProperty}.
* @since 3.4
*/
static JsonSchemaProperty combined(Collection<JsonSchemaProperty> properties) {
return new CombinedJsonSchemaProperty(properties);
static JsonSchemaProperty merged(Collection<JsonSchemaProperty> properties) {
return new MergedJsonSchemaProperty(properties);
}
/**

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/CombinedJsonSchema.java → spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MergedJsonSchema.java (11 lines changed)

@@ -24,21 +24,22 @@ import java.util.function.BiFunction;
import org.bson.Document;
/**
* {@link MongoJsonSchema} implementation that is capable of combining properties of different schemas into one.
* {@link MongoJsonSchema} implementation that is capable of merging properties from different schemas into a single
* one.
*
* @author Christoph Strobl
* @since 3.4
*/
class CombinedJsonSchema implements MongoJsonSchema {
class MergedJsonSchema implements MongoJsonSchema {
private final List<MongoJsonSchema> schemaList;
private final BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction;
CombinedJsonSchema(List<MongoJsonSchema> schemaList, ConflictResolutionFunction conflictResolutionFunction) {
MergedJsonSchema(List<MongoJsonSchema> schemaList, ConflictResolutionFunction conflictResolutionFunction) {
this(schemaList, new TypeUnifyingMergeFunction(conflictResolutionFunction));
}
CombinedJsonSchema(List<MongoJsonSchema> schemaList,
MergedJsonSchema(List<MongoJsonSchema> schemaList,
BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction) {
this.schemaList = new ArrayList<>(schemaList);
@@ -46,7 +47,7 @@ class CombinedJsonSchema implements MongoJsonSchema {
}
@Override
public MongoJsonSchema combineWith(Collection<MongoJsonSchema> sources) {
public MongoJsonSchema mergeWith(Collection<MongoJsonSchema> sources) {
schemaList.addAll(sources);
return this;

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/CombinedJsonSchemaProperty.java → spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MergedJsonSchemaProperty.java (10 lines changed)

@@ -30,24 +30,24 @@ import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictReso
* @author Christoph Strobl
* @since 3.4
*/
class CombinedJsonSchemaProperty implements JsonSchemaProperty {
class MergedJsonSchemaProperty implements JsonSchemaProperty {
private final Iterable<JsonSchemaProperty> properties;
private final BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction;
CombinedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties) {
MergedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties) {
this(properties, (k, a, b) -> {
throw new IllegalStateException(
String.format("Error resolving conflict for %s. No conflict resolution function defined.", k));
String.format("Error resolving conflict for '%s'. No conflict resolution function defined.", k));
});
}
CombinedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties,
MergedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties,
ConflictResolutionFunction conflictResolutionFunction) {
this(properties, new TypeUnifyingMergeFunction(conflictResolutionFunction));
}
CombinedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties,
MergedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties,
BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction) {
this.properties = properties;

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MongoJsonSchema.java (91 lines changed)

@@ -25,6 +25,7 @@ import java.util.Set;
import org.bson.Document;
import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject.ObjectJsonSchemaObject;
import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/**
* Interface defining MongoDB-specific JSON schema object. New objects can be built with {@link #builder()}, for
@@ -79,7 +80,7 @@ public interface MongoJsonSchema {
/**
* Create the {@link Document} defining the schema. <br />
* Property and field names need to be mapped to the domain type ones by running the {@link Document} through a
* Property and field names need to be mapped to the domain type property by running the {@link Document} through a
* {@link org.springframework.data.mongodb.core.convert.JsonSchemaMapper} to apply field name customization.
*
* @return never {@literal null}.
@@ -108,69 +109,69 @@ public interface MongoJsonSchema {
}
/**
* Create a new {@link MongoJsonSchema} combining properties from the given sources.
* Create a new {@link MongoJsonSchema} merging properties from the given sources.
*
* @param sources must not be {@literal null}.
* @return new instance of {@link MongoJsonSchema}.
* @since 3.4
*/
static MongoJsonSchema combined(MongoJsonSchema... sources) {
return combined((path, a, b) -> {
throw new IllegalStateException(
String.format("Failure combining schema for path %s holding values a) %s and b) %s.", path.dotPath(), a, b));
static MongoJsonSchema merge(MongoJsonSchema... sources) {
return merge((path, left, right) -> {
throw new IllegalStateException(String.format("Cannot merge schema for path '%s' holding values '%s' and '%s'.",
path.dotPath(), left, right));
}, sources);
}
/**
* Create a new {@link MongoJsonSchema} combining properties from the given sources.
* Create a new {@link MongoJsonSchema} merging properties from the given sources.
*
* @param sources must not be {@literal null}.
* @return new instance of {@link MongoJsonSchema}.
* @since 3.4
*/
static MongoJsonSchema combined(ConflictResolutionFunction mergeFunction, MongoJsonSchema... sources) {
return new CombinedJsonSchema(Arrays.asList(sources), mergeFunction);
static MongoJsonSchema merge(ConflictResolutionFunction mergeFunction, MongoJsonSchema... sources) {
return new MergedJsonSchema(Arrays.asList(sources), mergeFunction);
}
/**
* Create a new {@link MongoJsonSchema} combining properties from the given sources.
* Create a new {@link MongoJsonSchema} merging properties from the given sources.
*
* @param sources must not be {@literal null}.
* @return new instance of {@link MongoJsonSchema}.
* @since 3.4
*/
default MongoJsonSchema combineWith(MongoJsonSchema... sources) {
return combineWith(Arrays.asList(sources));
default MongoJsonSchema mergeWith(MongoJsonSchema... sources) {
return mergeWith(Arrays.asList(sources));
}
/**
* Create a new {@link MongoJsonSchema} combining properties from the given sources.
* Create a new {@link MongoJsonSchema} merging properties from the given sources.
*
* @param sources must not be {@literal null}.
* @return new instance of {@link MongoJsonSchema}.
* @since 3.4
*/
default MongoJsonSchema combineWith(Collection<MongoJsonSchema> sources) {
return combineWith(sources, (path, a, b) -> {
throw new IllegalStateException(
String.format("Failure combining schema for path %s holding values a) %s and b) %s.", path.dotPath(), a, b));
default MongoJsonSchema mergeWith(Collection<MongoJsonSchema> sources) {
return mergeWith(sources, (path, left, right) -> {
throw new IllegalStateException(String.format("Cannot merge schema for path '%s' holding values '%s' and '%s'.",
path.dotPath(), left, right));
});
}
/**
* Create a new {@link MongoJsonSchema} combining properties from the given sources.
* Create a new {@link MongoJsonSchema} merging properties from the given sources.
*
* @param sources must not be {@literal null}.
* @return new instance of {@link MongoJsonSchema}.
* @since 3.4
*/
default MongoJsonSchema combineWith(Collection<MongoJsonSchema> sources,
default MongoJsonSchema mergeWith(Collection<MongoJsonSchema> sources,
ConflictResolutionFunction conflictResolutionFunction) {
List<MongoJsonSchema> schemaList = new ArrayList<>(sources.size() + 1);
schemaList.add(this);
schemaList.addAll(new ArrayList<>(sources));
return new CombinedJsonSchema(schemaList, conflictResolutionFunction);
return new MergedJsonSchema(schemaList, conflictResolutionFunction);
}
/**
@@ -183,8 +184,8 @@ public interface MongoJsonSchema {
}
/**
* A resolution function that may be called on conflicting paths. Eg. when trying to merge properties with different
* values into one.
* A resolution function that is called on conflicting paths when trying to merge properties with different values
* into a single value.
*
* @author Christoph Strobl
* @since 3.4
@@ -193,12 +194,14 @@ public interface MongoJsonSchema {
interface ConflictResolutionFunction {
/**
* Resolve the conflict for two values under the same {@code path}.
*
* @param path the {@link Path} leading to the conflict.
* @param a can be {@literal null}.
* @param b can be {@literal null}.
* @param left can be {@literal null}.
* @param right can be {@literal null}.
* @return never {@literal null}.
*/
Resolution resolveConflict(Path path, @Nullable Object a, @Nullable Object b);
Resolution resolveConflict(Path path, @Nullable Object left, @Nullable Object right);
/**
* @author Christoph Strobl
@@ -218,7 +221,7 @@ public interface MongoJsonSchema {
}
/**
* The result after processing a conflict when combining schemas. May indicate to {@link #SKIP skip} the entry
* The result after processing a conflict when merging schemas. May indicate to {@link #SKIP skip} the entry
* entirely.
*
* @author Christoph Strobl
@@ -260,6 +263,42 @@ public interface MongoJsonSchema {
static Resolution skip() {
return SKIP;
}
/**
* Construct a resolution for a {@link Path} using the given {@code value}.
*
* @param path the conflicting path.
* @param value the value to apply.
* @return
*/
static Resolution ofValue(Path path, Object value) {
Assert.notNull(path, "Path must not be null");
return ofValue(path.currentElement(), value);
}
/**
* Construct a resolution from a {@code key} and {@code value}.
*
* @param key name of the path segment, typically {@link Path#currentElement()}
* @param value the value to apply.
* @return
*/
static Resolution ofValue(String key, Object value) {
return new Resolution() {
@Override
public String getKey() {
return key;
}
@Override
public Object getValue() {
return value;
}
};
}
}
}
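
A hedged sketch of the new Resolution.ofValue(…) factory together with the renamed merge(…) entry point, modeled on the conflict-resolution unit test further down; the two conflicting schemas are constructed ad hoc for illustration:

[source,java]
----
import org.bson.Document;
import org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type;
import org.springframework.data.mongodb.core.schema.JsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Resolution;

class ConflictResolutionSketch {

	Document resolveToObject() {

		MongoJsonSchema stringSchema = MongoJsonSchema.builder()
				.property(JsonSchemaProperty.named("value").ofType(Type.jsonTypeOf("string"))).build();
		MongoJsonSchema numberSchema = MongoJsonSchema.builder()
				.property(JsonSchemaProperty.named("value").ofType(Type.jsonTypeOf("number"))).build();

		// Without a ConflictResolutionFunction the merge would fail with an
		// IllegalStateException for the clashing 'value' entry; Resolution.ofValue(…)
		// keeps the conflicting path and applies a fixed replacement value instead.
		MongoJsonSchema merged = MongoJsonSchema.merge(
				(path, left, right) -> Resolution.ofValue(path, "object"), stringSchema, numberSchema);

		return merged.schemaDocument();
	}
}
----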

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/TypeUnifyingMergeFunction.java (41 lines changed)

@@ -25,11 +25,14 @@ import org.bson.Document;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Path;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Resolution;
import org.springframework.lang.Nullable;
import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils;
/**
* Merge function considering BSON type hints. Conflicts are resolved through a {@link ConflictResolutionFunction}.
*
* @author Christoph Strobl
* @since 3.4
*/
@@ -42,15 +45,16 @@ class TypeUnifyingMergeFunction implements BiFunction<Map<String, Object>, Map<S
}
@Override
public Document apply(Map<String, Object> a, Map<String, Object> b) {
return merge(SimplePath.root(), a, b);
public Document apply(Map<String, Object> left, Map<String, Object> right) {
return merge(SimplePath.root(), left, right);
}
Document merge(SimplePath path, Map<String, Object> a, Map<String, Object> b) {
@SuppressWarnings("unchecked")
Document merge(SimplePath path, Map<String, Object> left, Map<String, Object> right) {
Document target = new Document(a);
Document target = new Document(left);
for (String key : b.keySet()) {
for (String key : right.keySet()) {
SimplePath currentPath = path.append(key);
if (isTypeKey(key)) {
@@ -58,39 +62,39 @@ class TypeUnifyingMergeFunction implements BiFunction<Map<String, Object>, Map<S
Object unifiedExistingType = getUnifiedExistingType(key, target);
if (unifiedExistingType != null) {
if (!ObjectUtils.nullSafeEquals(unifiedExistingType, b.get(key))) {
resolveConflict(currentPath, a, b, target);
if (!ObjectUtils.nullSafeEquals(unifiedExistingType, right.get(key))) {
resolveConflict(currentPath, left, right, target);
}
continue;
}
}
if (!target.containsKey(key)) {
target.put(key, b.get(key));
target.put(key, right.get(key));
continue;
}
Object existingEntry = target.get(key);
Object newEntry = b.get(key);
Object newEntry = right.get(key);
if (existingEntry instanceof Map && newEntry instanceof Map) {
target.put(key, merge(currentPath, (Map) existingEntry, (Map) newEntry));
target.put(key, merge(currentPath, (Map<String, Object>) existingEntry, (Map<String, Object>) newEntry));
} else if (!ObjectUtils.nullSafeEquals(existingEntry, newEntry)) {
resolveConflict(currentPath, a, b, target);
resolveConflict(currentPath, left, right, target);
}
}
return target;
}
private void resolveConflict(Path path, Map<String, Object> a, Map<String, Object> b, Document target) {
applyConflictResolution(path, target, conflictResolutionFunction.resolveConflict(path, a, b));
private void resolveConflict(Path path, Map<String, Object> left, Map<String, Object> right, Document target) {
applyConflictResolution(path, target, conflictResolutionFunction.resolveConflict(path, left, right));
}
private void applyConflictResolution(Path path, Document target, Resolution resolution) {
if (Resolution.SKIP.equals(resolution) || resolution.getValue() == null) {
target.remove(path.currentElement());
return ;
return;
}
if (isTypeKey(resolution.getKey())) {
@@ -115,19 +119,20 @@ class TypeUnifyingMergeFunction implements BiFunction<Map<String, Object>, Map<S
return key;
}
@Nullable
private static Object getUnifiedExistingType(String key, Document source) {
return source.get(getTypeKeyToUse(key, source));
}
/**
* Trivial {@link List} based {@link Path} implementation.
*
*
* @author Christoph Strobl
* @since 3.4
*/
static class SimplePath implements Path {
private List<String> path;
private final List<String> path;
SimplePath(List<String> path) {
this.path = path;
@@ -137,10 +142,6 @@ class TypeUnifyingMergeFunction implements BiFunction<Map<String, Object>, Map<S
return new SimplePath(Collections.emptyList());
}
static SimplePath of(List<String> path) {
return new SimplePath(new ArrayList<>(path));
}
static SimplePath of(List<String> path, String next) {
List<String> fullPath = new ArrayList<>(path.size() + 1);
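
For context, a short sketch of the type unification this merge function performs, modeled on the bsonTypeVsJustTypeValueResolutionIsDoneByDefault test further down; the 'type' and 'bsonType' hints for the same underlying type are unified without any ConflictResolutionFunction:

[source,java]
----
import org.bson.Document;
import org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type;
import org.springframework.data.mongodb.core.schema.JsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;

class TypeUnificationSketch {

	Document unify() {

		MongoJsonSchema usingType = MongoJsonSchema.builder()
				.property(JsonSchemaProperty.named("value").ofType(Type.jsonTypeOf("string"))).build();
		MongoJsonSchema usingBsonType = MongoJsonSchema.builder()
				.property(JsonSchemaProperty.named("value").ofType(Type.bsonTypeOf("string"))).build();

		// 'type: string' and 'bsonType: string' describe the same type,
		// so the merge function unifies them instead of raising a conflict.
		return MongoJsonSchema.merge(usingType, usingBsonType).schemaDocument();
	}
}
----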

spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/MappingMongoJsonSchemaCreatorUnitTests.java (87 lines changed)

@@ -41,7 +41,6 @@ import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type;
import org.springframework.data.mongodb.core.schema.JsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Resolution;
import org.springframework.data.spel.spi.EvaluationContextExtension;
import org.springframework.data.spel.spi.Function;
@@ -52,14 +51,14 @@ import org.springframework.data.spel.spi.Function;
* @author Christoph Strobl
* @author Mark Paluch
*/
public class MappingMongoJsonSchemaCreatorUnitTests {
class MappingMongoJsonSchemaCreatorUnitTests {
MappingMongoConverter converter;
MongoMappingContext mappingContext;
MappingMongoJsonSchemaCreator schemaCreator;
private MappingMongoConverter converter;
private MongoMappingContext mappingContext;
private MappingMongoJsonSchemaCreator schemaCreator;
@BeforeEach
public void setUp() {
void setUp() {
mappingContext = new MongoMappingContext();
converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext);
@@ -67,7 +66,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
}
@Test // DATAMONGO-1849
public void simpleTypes() {
void simpleTypes() {
MongoJsonSchema schema = schemaCreator.createSchemaFor(VariousFieldTypes.class);
@@ -75,21 +74,21 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
}
@Test // DATAMONGO-1849
public void withRemappedIdType() {
void withRemappedIdType() {
MongoJsonSchema schema = schemaCreator.createSchemaFor(WithExplicitMongoIdTypeMapping.class);
assertThat(schema.toDocument().get("$jsonSchema", Document.class)).isEqualTo(WITH_EXPLICIT_MONGO_ID_TYPE_MAPPING);
}
@Test // DATAMONGO-1849
public void cyclic() {
void cyclic() {
MongoJsonSchema schema = schemaCreator.createSchemaFor(Cyclic.class);
assertThat(schema.toDocument().get("$jsonSchema", Document.class)).isEqualTo(CYCLIC);
}
@Test // DATAMONGO-1849
public void converterRegistered() {
void converterRegistered() {
MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext);
MongoCustomConversions mcc = new MongoCustomConversions(
@@ -105,7 +104,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
}
@Test // GH-3800
public void csfle/*encryptedFieldsOnly*/() {
void csfle/*encryptedFieldsOnly*/() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.filter(MongoJsonSchemaCreator.encryptedOnly()) // filter non encrypted fields
@@ -116,7 +115,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
}
@Test // GH-3800
public void csfleCyclic/*encryptedFieldsOnly*/() {
void csfleCyclic/*encryptedFieldsOnly*/() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.filter(MongoJsonSchemaCreator.encryptedOnly()) // filter non encrypted fields
@@ -127,7 +126,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
}
@Test // GH-3800
public void csfleWithKeyFromProperties() {
void csfleWithKeyFromProperties() {
GenericApplicationContext applicationContext = new GenericApplicationContext();
applicationContext.registerBean("encryptionExtension", EncryptionExtension.class, () -> new EncryptionExtension());
@@ -145,7 +144,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
}
@Test // GH-3800
public void csfleWithKeyFromMethod() {
void csfleWithKeyFromMethod() {
GenericApplicationContext applicationContext = new GenericApplicationContext();
applicationContext.registerBean("encryptionExtension", EncryptionExtension.class, () -> new EncryptionExtension());
@@ -168,8 +167,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
void shouldAllowToSpecifyPolymorphicTypesForProperty() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.specify("objectValue").types(A.class, B.class)
.createSchemaFor(SomeTestObject.class);
.property("objectValue").withTypes(A.class, B.class).createSchemaFor(SomeTestObject.class);
Document targetSchema = schema.schemaDocument();
assertThat(targetSchema) //
@@ -182,13 +180,12 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
void shouldAllowToSpecifyNestedPolymorphicTypesForProperty() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.specify("value.objectValue").types(A.class, B.class) //
.property("value.objectValue").withTypes(A.class, B.class) //
.createSchemaFor(WrapperAroundA.class);
Document targetSchema = schema.schemaDocument();
assertThat(schema.schemaDocument()) //
.containsEntry("properties.value.properties.objectValue.properties.aNonEncrypted", new Document("type", "string")) //
.containsEntry("properties.value.properties.objectValue.properties.aNonEncrypted",
new Document("type", "string")) //
.containsEntry("properties.value.properties.objectValue.properties.aEncrypted", ENCRYPTED_BSON_STRING) //
.containsEntry("properties.value.properties.objectValue.properties.bEncrypted", ENCRYPTED_BSON_STRING);
@@ -198,8 +195,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
void shouldAllowToSpecifyGenericTypesForProperty() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.specify("genericValue").types(A.class, B.class)
.createSchemaFor(SomeTestObject.class);
.property("genericValue").withTypes(A.class, B.class).createSchemaFor(SomeTestObject.class);
assertThat(schema.schemaDocument()) //
.containsEntry("properties.genericValue.properties.aNonEncrypted", new Document("type", "string")) //
@@ -211,7 +207,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
void encryptionFilterShouldCaptureSpecifiedPolymorphicTypesForProperty() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.specify("objectValue").types(A.class, B.class) //
.property("objectValue").withTypes(A.class, B.class) //
.filter(MongoJsonSchemaCreator.encryptedOnly()) //
.createSchemaFor(SomeTestObject.class);
@@ -224,8 +220,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
@Test // GH-3870
void allowsToCreateCombinedSchemaWhenPropertiesDoNotOverlap() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create()
.combineSchemaFor(A.class, B.class, C.class);
MongoJsonSchema schema = MongoJsonSchemaCreator.create().mergedSchemaFor(A.class, B.class, C.class);
assertThat(schema.schemaDocument()) //
.containsEntry("properties.aNonEncrypted", new Document("type", "string")) //
@@ -242,9 +237,9 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
MongoJsonSchema schemaAButDifferent = MongoJsonSchemaCreator.create() //
.createSchemaFor(PropertyClashWithA.class);
MongoJsonSchema targetSchema = schemaA.combineWith(schemaAButDifferent);
MongoJsonSchema targetSchema = schemaA.mergeWith(schemaAButDifferent);
assertThatExceptionOfType(IllegalStateException.class).isThrownBy(() -> targetSchema.schemaDocument());
assertThatExceptionOfType(IllegalStateException.class).isThrownBy(targetSchema::schemaDocument);
}
@Test // GH-3870
@@ -255,18 +250,8 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
MongoJsonSchema schemaAButDifferent = MongoJsonSchemaCreator.create() //
.createSchemaFor(PropertyClashWithA.class);
MongoJsonSchema schema = schemaA.combineWith(Collections.singleton(schemaAButDifferent), (path, a, b) -> new Resolution() {
@Override
public String getKey() {
return path.currentElement();
}
@Override
public Object getValue() {
return "object";
}
});
MongoJsonSchema schema = schemaA.mergeWith(Collections.singleton(schemaAButDifferent),
(path, a, b) -> Resolution.ofValue(path, "object"));
assertThat(schema.schemaDocument()) //
.containsEntry("properties.aNonEncrypted", new Document("type", "object"));
@@ -276,13 +261,11 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
void bsonTypeVsJustTypeValueResolutionIsDoneByDefault() {
MongoJsonSchema schemaUsingType = MongoJsonSchema.builder()
.property(JsonSchemaProperty.named("value").ofType(Type.jsonTypeOf("string")))
.build();
.property(JsonSchemaProperty.named("value").ofType(Type.jsonTypeOf("string"))).build();
MongoJsonSchema schemaUsingBsonType = MongoJsonSchema.builder()
.property(JsonSchemaProperty.named("value").ofType(Type.bsonTypeOf("string")))
.build();
.property(JsonSchemaProperty.named("value").ofType(Type.bsonTypeOf("string"))).build();
MongoJsonSchema targetSchema = MongoJsonSchema.combined(schemaUsingType, schemaUsingBsonType);
MongoJsonSchema targetSchema = MongoJsonSchema.merge(schemaUsingType, schemaUsingBsonType);
assertThat(targetSchema.schemaDocument()) //
.containsEntry("properties.value", new Document("type", "string"));
@@ -292,7 +275,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
// --> ENUM
static final String JUST_SOME_ENUM = "{ 'type' : 'string', 'enum' : ['ONE', 'TWO'] }";
private static final String JUST_SOME_ENUM = "{ 'type' : 'string', 'enum' : ['ONE', 'TWO'] }";
enum JustSomeEnum {
ONE, TWO
@@ -657,14 +640,15 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
}
}
static final Document ENCRYPTED_BSON_STRING = Document.parse("{'encrypt': { 'bsonType': 'string','algorithm': 'AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic'} }");
private static final Document ENCRYPTED_BSON_STRING = Document
.parse("{'encrypt': { 'bsonType': 'string','algorithm': 'AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic'} }");
static class SomeTestObject<T> {
T genericValue;
Object objectValue;
}
static class RootWithGenerics<S,T> {
static class RootWithGenerics<S, T> {
S sValue;
T tValue;
}
@@ -686,18 +670,15 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
String aNonEncrypted;
@Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic")
String aEncrypted;
@Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic") String aEncrypted;
}
static class B {
@Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic")
String bEncrypted;
@Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic") String bEncrypted;
}
static class C extends A {
@Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic")
String cEncrypted;
@Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic") String cEncrypted;
}
static class PropertyClashWithA {

spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/schema/TypeUnifyingMergeFunctionUnitTests.java (14 lines changed)

@@ -33,6 +33,8 @@ import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictReso
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Resolution;
/**
* Unit tests for {@link TypeUnifyingMergeFunction}.
*
* @author Christoph Strobl
*/
@ExtendWith(MockitoExtension.class)
@@ -101,17 +103,7 @@ public class TypeUnifyingMergeFunctionUnitTests {
ArgumentCaptor<Object> aValueCaptor = ArgumentCaptor.forClass(Object.class);
ArgumentCaptor<Object> bValueCaptor = ArgumentCaptor.forClass(Object.class);
when(crf.resolveConflict(any(), aValueCaptor.capture(), bValueCaptor.capture())).thenReturn(new Resolution() {
@Override
public String getKey() {
return "nested";
}
@Override
public Object getValue() {
return "from-function";
}
});
when(crf.resolveConflict(any(), aValueCaptor.capture(), bValueCaptor.capture())).thenReturn(Resolution.ofValue("nested", "from-function"));
Map<String, Object> a = new LinkedHashMap<>();
a.put("nested", Collections.singletonMap("a", "a-value"));

src/main/asciidoc/reference/mongo-json-schema.adoc (38 lines changed)

@@ -192,25 +192,25 @@ unless there is more specific information available via the `@MongoId` annotatio
The above example demonstrated how to derive the schema from a very precise typed source.
Using polymorphic elements within the domain model can lead to inaccurate schema representation for `Object` and generic `<T>` types, which are likely to represented as `{ type : 'object' }` without further specification.
`MongoJsonSchemaCreator.specify(...)` allows to define additional types that should be considered when rendering the schema.
`MongoJsonSchemaCreator.property(…)` allows defining additional details such as nested document types that should be considered when rendering the schema.
.Specify additional types for properties
====
[source,java]
----
public class Root {
class Root {
Object value;
}
public class A {
class A {
String aValue;
}
public class B {
class B {
String bValue;
}
MongoJsonSchemaCreator.create()
.specify("value").types(A.class, B.class) <1>
.property("value").withTypes(A.class, B.class) <1>
----
[source,json]
@@ -220,7 +220,7 @@ MongoJsonSchemaCreator.create()
'properties' : {
'value' : {
'type' : 'object',
'properties' : { <1>
'properties' : { <1>
'aValue' : { 'type' : 'string' },
'bValue' : { 'type' : 'string' }
}
@@ -228,30 +228,30 @@ MongoJsonSchemaCreator.create()
}
}
----
<1> Properties of the given types are combined into one element.
<1> Properties of the given types are merged into one element.
====
MongoDBs schema free approach allows to store documents of different structure in one collection.
MongoDBs schema-free approach allows storing documents of different structure in one collection.
Those may be modeled having a common base class.
Regardless of the chosen approach `MongoJsonSchemaCreator.combine(...)` is can help circumvent the need of combining multiple schema into one.
Regardless of the chosen approach, `MongoJsonSchemaCreator.merge(…)` can help circumvent the need of merging multiple schema into one.
.Combining multiple Schemas
.Merging multiple Schemas into a single Schema definition
====
[source,java]
----
public abstract class Root {
abstract class Root {
String rootValue;
}
public class A extends Root {
class A extends Root {
String aValue;
}
public class B extends Root {
class B extends Root {
String bValue;
}
MongoJsonSchemaCreator.combined(A.class, B.class) <1>
MongoJsonSchemaCreator.mergedSchemaFor(A.class, B.class) <1>
----
[source,json]
@@ -271,17 +271,17 @@ MongoJsonSchemaCreator.combined(A.class, B.class) <1>
[NOTE]
====
Equally named properties need to refer to the same json schema in order to be combined.
The following example shows a definition that cannot be combined automatically because of a data type mismatch.
In this case a `ConflictResolutionFunction` has to be provided to `MongoJsonSchemaCreator`.
Properties with the same name need to refer to the same JSON schema in order to be combined.
The following example shows a definition that cannot be merged automatically because of a data type mismatch.
In this case a `ConflictResolutionFunction` must be provided to `MongoJsonSchemaCreator`.
[source,java]
----
public class A extends Root {
class A extends Root {
String value;
}
public class B extends Root {
class B extends Root {
Integer value;
}
----
