
Polishing.

Refine API naming towards merge/property instead of combine/specify. Tweak documentation. Introduce Resolution.ofValue(…) for easier creation.

See #3870
Original pull request: #3986.
Mark Paluch · 4 years ago · commit d133ef19dd · pull/3999/head
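
In caller terms, the renaming maps the creator API onto merge/property-flavored names. A minimal usage sketch of the renamed entry points (Root, A, and B stand in for the placeholder domain types used in the updated reference documentation further down; only the method names come from this commit):

    MongoJsonSchema typed = MongoJsonSchemaCreator.create()
            .property("value").withTypes(A.class, B.class)   // was: .specify("value").types(A.class, B.class)
            .createSchemaFor(Root.class);

    MongoJsonSchema merged = MongoJsonSchemaCreator.create()
            .mergedSchemaFor(A.class, B.class);              // was: .combineSchemaFor(A.class, B.class)
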
  1. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/MappingMongoJsonSchemaCreator.java (13 changed lines)
  2. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/MongoJsonSchemaCreator.java (21 changed lines)
  3. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/JsonSchemaProperty.java (6 changed lines)
  4. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MergedJsonSchema.java (11 changed lines)
  5. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MergedJsonSchemaProperty.java (10 changed lines)
  6. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MongoJsonSchema.java (91 changed lines)
  7. spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/TypeUnifyingMergeFunction.java (41 changed lines)
  8. spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/MappingMongoJsonSchemaCreatorUnitTests.java (87 changed lines)
  9. spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/schema/TypeUnifyingMergeFunctionUnitTests.java (14 changed lines)
  10. src/main/asciidoc/reference/mongo-json-schema.adoc (38 changed lines)

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/MappingMongoJsonSchemaCreator.java (13 changed lines)

@ -94,13 +94,8 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
} }
@Override @Override
public PropertySpecifier specify(String path) { public PropertySpecifier property(String path) {
return new PropertySpecifier() { return types -> withTypesFor(path, types);
@Override
public MongoJsonSchemaCreator types(Class<?>... types) {
return specifyTypesFor(path, types);
}
};
} }
/** /**
@ -111,7 +106,7 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
* @return new instance of {@link MongoJsonSchemaCreator}. * @return new instance of {@link MongoJsonSchemaCreator}.
* @since 3.4 * @since 3.4
*/ */
public MongoJsonSchemaCreator specifyTypesFor(String path, Class<?>... types) { public MongoJsonSchemaCreator withTypesFor(String path, Class<?>... types) {
LinkedMultiValueMap<String, Class<?>> clone = mergeProperties.clone(); LinkedMultiValueMap<String, Class<?>> clone = mergeProperties.clone();
for (Class<?> type : types) { for (Class<?> type : types) {
@ -213,7 +208,7 @@ class MappingMongoJsonSchemaCreator implements MongoJsonSchemaCreator {
} }
} }
return targetProperties.size() == 1 ? targetProperties.iterator().next() return targetProperties.size() == 1 ? targetProperties.iterator().next()
: JsonSchemaProperty.combined(targetProperties); : JsonSchemaProperty.merged(targetProperties);
} }
} }

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/MongoJsonSchemaCreator.java (21 changed lines)

@ -79,17 +79,17 @@ public interface MongoJsonSchemaCreator {
MongoJsonSchema createSchemaFor(Class<?> type); MongoJsonSchema createSchemaFor(Class<?> type);
/** /**
* Create a combined {@link MongoJsonSchema} out of the individual schemas of the given types by combining their * Create a merged {@link MongoJsonSchema} out of the individual schemas of the given types by merging their
* properties into one large {@link MongoJsonSchema schema}. * properties into one large {@link MongoJsonSchema schema}.
* *
* @param types must not be {@literal null} nor contain {@literal null}. * @param types must not be {@literal null} nor contain {@literal null}.
* @return new instance of {@link MongoJsonSchema}. * @return new instance of {@link MongoJsonSchema}.
* @since 3.4 * @since 3.4
*/ */
default MongoJsonSchema combineSchemaFor(Class<?>... types) { default MongoJsonSchema mergedSchemaFor(Class<?>... types) {
MongoJsonSchema[] schemas = Arrays.stream(types).map(this::createSchemaFor).toArray(MongoJsonSchema[]::new); MongoJsonSchema[] schemas = Arrays.stream(types).map(this::createSchemaFor).toArray(MongoJsonSchema[]::new);
return MongoJsonSchema.combined(schemas); return MongoJsonSchema.merge(schemas);
} }
/** /**
@ -108,32 +108,32 @@ public interface MongoJsonSchemaCreator {
* @return new instance of {@link PropertySpecifier}. * @return new instance of {@link PropertySpecifier}.
* @since 3.4 * @since 3.4
*/ */
PropertySpecifier specify(String path); PropertySpecifier property(String path);
/** /**
* The context in which a specific {@link #getProperty()} is encountered during schema creation. * The context in which a specific {@link #getProperty()} is encountered during schema creation.
* *
* @since 3.3 * @since 3.3
*/ */
interface JsonSchemaPropertyContext { interface JsonSchemaPropertyContext {
/** /**
* The path to a given field/property in dot notation. * The path to a given field/property in dot notation.
* *
* @return never {@literal null}. * @return never {@literal null}.
*/ */
String getPath(); String getPath();
/** /**
* The current property. * The current property.
* *
* @return never {@literal null}. * @return never {@literal null}.
*/ */
MongoPersistentProperty getProperty(); MongoPersistentProperty getProperty();
/** /**
* Obtain the {@link MongoPersistentEntity} for a given property. * Obtain the {@link MongoPersistentEntity} for a given property.
* *
* @param property must not be {@literal null}. * @param property must not be {@literal null}.
* @param <T> * @param <T>
* @return {@literal null} if the property is not an entity. It is nevertheless recommend to check * @return {@literal null} if the property is not an entity. It is nevertheless recommend to check
@ -234,7 +234,6 @@ public interface MongoJsonSchemaCreator {
} }
/** /**
* @since 3.4
* @author Christoph Strobl * @author Christoph Strobl
* @since 3.4 * @since 3.4
*/ */
@ -246,6 +245,6 @@ public interface MongoJsonSchemaCreator {
* @param types must not be {@literal null}. * @param types must not be {@literal null}.
* @return the source * @return the source
*/ */
MongoJsonSchemaCreator types(Class<?>... types); MongoJsonSchemaCreator withTypes(Class<?>... types);
} }
} }
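
For illustration, the renamed default method is equivalent to creating the individual schemas and merging them (a sketch with placeholder types A and B):

    MongoJsonSchemaCreator creator = MongoJsonSchemaCreator.create();

    // mergedSchemaFor(A.class, B.class) builds one schema per type and delegates to MongoJsonSchema.merge(…)
    MongoJsonSchema merged = MongoJsonSchema.merge(
            creator.createSchemaFor(A.class),
            creator.createSchemaFor(B.class));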

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/JsonSchemaProperty.java (6 changed lines)

@ -236,14 +236,14 @@ public interface JsonSchemaProperty extends JsonSchemaObject {
} }
/** /**
* Combines multiple {@link JsonSchemaProperty} with potentially different attributes into one. * Merges multiple {@link JsonSchemaProperty} with potentially different attributes into one.
* *
* @param properties must not be {@literal null}. * @param properties must not be {@literal null}.
* @return new instance of {@link JsonSchemaProperty}. * @return new instance of {@link JsonSchemaProperty}.
* @since 3.4 * @since 3.4
*/ */
static JsonSchemaProperty combined(Collection<JsonSchemaProperty> properties) { static JsonSchemaProperty merged(Collection<JsonSchemaProperty> properties) {
return new CombinedJsonSchemaProperty(properties); return new MergedJsonSchemaProperty(properties);
} }
/** /**
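
A short sketch of the renamed factory (the property name and types are illustrative; the named(…).ofType(…) builder usage mirrors the unit tests further down, and java.util.Arrays plus JsonSchemaObject.Type imports are assumed):

    JsonSchemaProperty jsonTyped = JsonSchemaProperty.named("value").ofType(Type.jsonTypeOf("string"));
    JsonSchemaProperty bsonTyped = JsonSchemaProperty.named("value").ofType(Type.bsonTypeOf("string"));

    // was: JsonSchemaProperty.combined(…)
    JsonSchemaProperty merged = JsonSchemaProperty.merged(Arrays.asList(jsonTyped, bsonTyped));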

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/CombinedJsonSchema.java → spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MergedJsonSchema.java (11 changed lines)

@ -24,21 +24,22 @@ import java.util.function.BiFunction;
import org.bson.Document; import org.bson.Document;
/** /**
* {@link MongoJsonSchema} implementation that is capable of combining properties of different schemas into one. * {@link MongoJsonSchema} implementation that is capable of merging properties from different schemas into a single
* one.
* *
* @author Christoph Strobl * @author Christoph Strobl
* @since 3.4 * @since 3.4
*/ */
class CombinedJsonSchema implements MongoJsonSchema { class MergedJsonSchema implements MongoJsonSchema {
private final List<MongoJsonSchema> schemaList; private final List<MongoJsonSchema> schemaList;
private final BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction; private final BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction;
CombinedJsonSchema(List<MongoJsonSchema> schemaList, ConflictResolutionFunction conflictResolutionFunction) { MergedJsonSchema(List<MongoJsonSchema> schemaList, ConflictResolutionFunction conflictResolutionFunction) {
this(schemaList, new TypeUnifyingMergeFunction(conflictResolutionFunction)); this(schemaList, new TypeUnifyingMergeFunction(conflictResolutionFunction));
} }
CombinedJsonSchema(List<MongoJsonSchema> schemaList, MergedJsonSchema(List<MongoJsonSchema> schemaList,
BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction) { BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction) {
this.schemaList = new ArrayList<>(schemaList); this.schemaList = new ArrayList<>(schemaList);
@ -46,7 +47,7 @@ class CombinedJsonSchema implements MongoJsonSchema {
} }
@Override @Override
public MongoJsonSchema combineWith(Collection<MongoJsonSchema> sources) { public MongoJsonSchema mergeWith(Collection<MongoJsonSchema> sources) {
schemaList.addAll(sources); schemaList.addAll(sources);
return this; return this;

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/CombinedJsonSchemaProperty.java → spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MergedJsonSchemaProperty.java (10 changed lines)

@ -30,24 +30,24 @@ import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictReso
* @author Christoph Strobl * @author Christoph Strobl
* @since 3.4 * @since 3.4
*/ */
class CombinedJsonSchemaProperty implements JsonSchemaProperty { class MergedJsonSchemaProperty implements JsonSchemaProperty {
private final Iterable<JsonSchemaProperty> properties; private final Iterable<JsonSchemaProperty> properties;
private final BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction; private final BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction;
CombinedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties) { MergedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties) {
this(properties, (k, a, b) -> { this(properties, (k, a, b) -> {
throw new IllegalStateException( throw new IllegalStateException(
String.format("Error resolving conflict for %s. No conflict resolution function defined.", k)); String.format("Error resolving conflict for '%s'. No conflict resolution function defined.", k));
}); });
} }
CombinedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties, MergedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties,
ConflictResolutionFunction conflictResolutionFunction) { ConflictResolutionFunction conflictResolutionFunction) {
this(properties, new TypeUnifyingMergeFunction(conflictResolutionFunction)); this(properties, new TypeUnifyingMergeFunction(conflictResolutionFunction));
} }
CombinedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties, MergedJsonSchemaProperty(Iterable<JsonSchemaProperty> properties,
BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction) { BiFunction<Map<String, Object>, Map<String, Object>, Document> mergeFunction) {
this.properties = properties; this.properties = properties;

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/MongoJsonSchema.java (91 changed lines)

@ -25,6 +25,7 @@ import java.util.Set;
import org.bson.Document; import org.bson.Document;
import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject.ObjectJsonSchemaObject; import org.springframework.data.mongodb.core.schema.TypedJsonSchemaObject.ObjectJsonSchemaObject;
import org.springframework.lang.Nullable; import org.springframework.lang.Nullable;
import org.springframework.util.Assert;
/** /**
* Interface defining MongoDB-specific JSON schema object. New objects can be built with {@link #builder()}, for * Interface defining MongoDB-specific JSON schema object. New objects can be built with {@link #builder()}, for
@ -79,7 +80,7 @@ public interface MongoJsonSchema {
/** /**
* Create the {@link Document} defining the schema. <br /> * Create the {@link Document} defining the schema. <br />
* Property and field names need to be mapped to the domain type ones by running the {@link Document} through a * Property and field names need to be mapped to the domain type property by running the {@link Document} through a
* {@link org.springframework.data.mongodb.core.convert.JsonSchemaMapper} to apply field name customization. * {@link org.springframework.data.mongodb.core.convert.JsonSchemaMapper} to apply field name customization.
* *
* @return never {@literal null}. * @return never {@literal null}.
@ -108,69 +109,69 @@ public interface MongoJsonSchema {
} }
/** /**
* Create a new {@link MongoJsonSchema} combining properties from the given sources. * Create a new {@link MongoJsonSchema} merging properties from the given sources.
* *
* @param sources must not be {@literal null}. * @param sources must not be {@literal null}.
* @return new instance of {@link MongoJsonSchema}. * @return new instance of {@link MongoJsonSchema}.
* @since 3.4 * @since 3.4
*/ */
static MongoJsonSchema combined(MongoJsonSchema... sources) { static MongoJsonSchema merge(MongoJsonSchema... sources) {
return combined((path, a, b) -> { return merge((path, left, right) -> {
throw new IllegalStateException( throw new IllegalStateException(String.format("Cannot merge schema for path '%s' holding values '%s' and '%s'.",
String.format("Failure combining schema for path %s holding values a) %s and b) %s.", path.dotPath(), a, b)); path.dotPath(), left, right));
}, sources); }, sources);
} }
/** /**
* Create a new {@link MongoJsonSchema} combining properties from the given sources. * Create a new {@link MongoJsonSchema} merging properties from the given sources.
* *
* @param sources must not be {@literal null}. * @param sources must not be {@literal null}.
* @return new instance of {@link MongoJsonSchema}. * @return new instance of {@link MongoJsonSchema}.
* @since 3.4 * @since 3.4
*/ */
static MongoJsonSchema combined(ConflictResolutionFunction mergeFunction, MongoJsonSchema... sources) { static MongoJsonSchema merge(ConflictResolutionFunction mergeFunction, MongoJsonSchema... sources) {
return new CombinedJsonSchema(Arrays.asList(sources), mergeFunction); return new MergedJsonSchema(Arrays.asList(sources), mergeFunction);
} }
/** /**
* Create a new {@link MongoJsonSchema} combining properties from the given sources. * Create a new {@link MongoJsonSchema} merging properties from the given sources.
* *
* @param sources must not be {@literal null}. * @param sources must not be {@literal null}.
* @return new instance of {@link MongoJsonSchema}. * @return new instance of {@link MongoJsonSchema}.
* @since 3.4 * @since 3.4
*/ */
default MongoJsonSchema combineWith(MongoJsonSchema... sources) { default MongoJsonSchema mergeWith(MongoJsonSchema... sources) {
return combineWith(Arrays.asList(sources)); return mergeWith(Arrays.asList(sources));
} }
/** /**
* Create a new {@link MongoJsonSchema} combining properties from the given sources. * Create a new {@link MongoJsonSchema} merging properties from the given sources.
* *
* @param sources must not be {@literal null}. * @param sources must not be {@literal null}.
* @return new instance of {@link MongoJsonSchema}. * @return new instance of {@link MongoJsonSchema}.
* @since 3.4 * @since 3.4
*/ */
default MongoJsonSchema combineWith(Collection<MongoJsonSchema> sources) { default MongoJsonSchema mergeWith(Collection<MongoJsonSchema> sources) {
return combineWith(sources, (path, a, b) -> { return mergeWith(sources, (path, left, right) -> {
throw new IllegalStateException( throw new IllegalStateException(String.format("Cannot merge schema for path '%s' holding values '%s' and '%s'.",
String.format("Failure combining schema for path %s holding values a) %s and b) %s.", path.dotPath(), a, b)); path.dotPath(), left, right));
}); });
} }
/** /**
* Create a new {@link MongoJsonSchema} combining properties from the given sources. * Create a new {@link MongoJsonSchema} merging properties from the given sources.
* *
* @param sources must not be {@literal null}. * @param sources must not be {@literal null}.
* @return new instance of {@link MongoJsonSchema}. * @return new instance of {@link MongoJsonSchema}.
* @since 3.4 * @since 3.4
*/ */
default MongoJsonSchema combineWith(Collection<MongoJsonSchema> sources, default MongoJsonSchema mergeWith(Collection<MongoJsonSchema> sources,
ConflictResolutionFunction conflictResolutionFunction) { ConflictResolutionFunction conflictResolutionFunction) {
List<MongoJsonSchema> schemaList = new ArrayList<>(sources.size() + 1); List<MongoJsonSchema> schemaList = new ArrayList<>(sources.size() + 1);
schemaList.add(this); schemaList.add(this);
schemaList.addAll(new ArrayList<>(sources)); schemaList.addAll(new ArrayList<>(sources));
return new CombinedJsonSchema(schemaList, conflictResolutionFunction); return new MergedJsonSchema(schemaList, conflictResolutionFunction);
} }
/** /**
@ -183,8 +184,8 @@ public interface MongoJsonSchema {
} }
/** /**
* A resolution function that may be called on conflicting paths. Eg. when trying to merge properties with different * A resolution function that is called on conflicting paths when trying to merge properties with different values
* values into one. * into a single value.
* *
* @author Christoph Strobl * @author Christoph Strobl
* @since 3.4 * @since 3.4
@ -193,12 +194,14 @@ public interface MongoJsonSchema {
interface ConflictResolutionFunction { interface ConflictResolutionFunction {
/** /**
* Resolve the conflict for two values under the same {@code path}.
*
* @param path the {@link Path} leading to the conflict. * @param path the {@link Path} leading to the conflict.
* @param a can be {@literal null}. * @param left can be {@literal null}.
* @param b can be {@literal null}. * @param right can be {@literal null}.
* @return never {@literal null}. * @return never {@literal null}.
*/ */
Resolution resolveConflict(Path path, @Nullable Object a, @Nullable Object b); Resolution resolveConflict(Path path, @Nullable Object left, @Nullable Object right);
/** /**
* @author Christoph Strobl * @author Christoph Strobl
@ -218,7 +221,7 @@ public interface MongoJsonSchema {
} }
/** /**
* The result after processing a conflict when combining schemas. May indicate to {@link #SKIP skip} the entry * The result after processing a conflict when merging schemas. May indicate to {@link #SKIP skip} the entry
* entirely. * entirely.
* *
* @author Christoph Strobl * @author Christoph Strobl
@ -260,6 +263,42 @@ public interface MongoJsonSchema {
static Resolution skip() { static Resolution skip() {
return SKIP; return SKIP;
} }
/**
* Construct a resolution for a {@link Path} using the given {@code value}.
*
* @param path the conflicting path.
* @param value the value to apply.
* @return
*/
static Resolution ofValue(Path path, Object value) {
Assert.notNull(path, "Path must not be null");
return ofValue(path.currentElement(), value);
}
/**
* Construct a resolution from a {@code key} and {@code value}.
*
* @param key name of the path segment, typically {@link Path#currentElement()}
* @param value the value to apply.
* @return
*/
static Resolution ofValue(String key, Object value) {
return new Resolution() {
@Override
public String getKey() {
return key;
}
@Override
public Object getValue() {
return value;
}
};
}
} }
} }
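
A hedged usage sketch of the new Resolution.ofValue(…) factory in combination with the renamed mergeWith(…) (schemaA and schemaB are assumed to be existing MongoJsonSchema instances; the lambda mirrors the updated unit test further down):

    // mergeWith(…) without a ConflictResolutionFunction throws an IllegalStateException on conflicting paths;
    // Resolution.ofValue(…) makes supplying one a one-liner instead of an anonymous Resolution implementation.
    MongoJsonSchema resolved = schemaA.mergeWith(Collections.singleton(schemaB),
            (path, left, right) -> Resolution.ofValue(path, "object"));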

spring-data-mongodb/src/main/java/org/springframework/data/mongodb/core/schema/TypeUnifyingMergeFunction.java (41 changed lines)

@ -25,11 +25,14 @@ import org.bson.Document;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction; import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Path; import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Path;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Resolution; import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Resolution;
import org.springframework.lang.Nullable;
import org.springframework.util.CollectionUtils; import org.springframework.util.CollectionUtils;
import org.springframework.util.ObjectUtils; import org.springframework.util.ObjectUtils;
import org.springframework.util.StringUtils; import org.springframework.util.StringUtils;
/** /**
* Merge function considering BSON type hints. Conflicts are resolved through a {@link ConflictResolutionFunction}.
*
* @author Christoph Strobl * @author Christoph Strobl
* @since 3.4 * @since 3.4
*/ */
@ -42,15 +45,16 @@ class TypeUnifyingMergeFunction implements BiFunction<Map<String, Object>, Map<S
} }
@Override @Override
public Document apply(Map<String, Object> a, Map<String, Object> b) { public Document apply(Map<String, Object> left, Map<String, Object> right) {
return merge(SimplePath.root(), a, b); return merge(SimplePath.root(), left, right);
} }
Document merge(SimplePath path, Map<String, Object> a, Map<String, Object> b) { @SuppressWarnings("unchecked")
Document merge(SimplePath path, Map<String, Object> left, Map<String, Object> right) {
Document target = new Document(a); Document target = new Document(left);
for (String key : b.keySet()) { for (String key : right.keySet()) {
SimplePath currentPath = path.append(key); SimplePath currentPath = path.append(key);
if (isTypeKey(key)) { if (isTypeKey(key)) {
@ -58,39 +62,39 @@ class TypeUnifyingMergeFunction implements BiFunction<Map<String, Object>, Map<S
Object unifiedExistingType = getUnifiedExistingType(key, target); Object unifiedExistingType = getUnifiedExistingType(key, target);
if (unifiedExistingType != null) { if (unifiedExistingType != null) {
if (!ObjectUtils.nullSafeEquals(unifiedExistingType, b.get(key))) { if (!ObjectUtils.nullSafeEquals(unifiedExistingType, right.get(key))) {
resolveConflict(currentPath, a, b, target); resolveConflict(currentPath, left, right, target);
} }
continue; continue;
} }
} }
if (!target.containsKey(key)) { if (!target.containsKey(key)) {
target.put(key, b.get(key)); target.put(key, right.get(key));
continue; continue;
} }
Object existingEntry = target.get(key); Object existingEntry = target.get(key);
Object newEntry = b.get(key); Object newEntry = right.get(key);
if (existingEntry instanceof Map && newEntry instanceof Map) { if (existingEntry instanceof Map && newEntry instanceof Map) {
target.put(key, merge(currentPath, (Map) existingEntry, (Map) newEntry)); target.put(key, merge(currentPath, (Map<String, Object>) existingEntry, (Map<String, Object>) newEntry));
} else if (!ObjectUtils.nullSafeEquals(existingEntry, newEntry)) { } else if (!ObjectUtils.nullSafeEquals(existingEntry, newEntry)) {
resolveConflict(currentPath, a, b, target); resolveConflict(currentPath, left, right, target);
} }
} }
return target; return target;
} }
private void resolveConflict(Path path, Map<String, Object> a, Map<String, Object> b, Document target) { private void resolveConflict(Path path, Map<String, Object> left, Map<String, Object> right, Document target) {
applyConflictResolution(path, target, conflictResolutionFunction.resolveConflict(path, a, b)); applyConflictResolution(path, target, conflictResolutionFunction.resolveConflict(path, left, right));
} }
private void applyConflictResolution(Path path, Document target, Resolution resolution) { private void applyConflictResolution(Path path, Document target, Resolution resolution) {
if (Resolution.SKIP.equals(resolution) || resolution.getValue() == null) { if (Resolution.SKIP.equals(resolution) || resolution.getValue() == null) {
target.remove(path.currentElement()); target.remove(path.currentElement());
return ; return;
} }
if (isTypeKey(resolution.getKey())) { if (isTypeKey(resolution.getKey())) {
@ -115,19 +119,20 @@ class TypeUnifyingMergeFunction implements BiFunction<Map<String, Object>, Map<S
return key; return key;
} }
@Nullable
private static Object getUnifiedExistingType(String key, Document source) { private static Object getUnifiedExistingType(String key, Document source) {
return source.get(getTypeKeyToUse(key, source)); return source.get(getTypeKeyToUse(key, source));
} }
/** /**
* Trivial {@link List} based {@link Path} implementation. * Trivial {@link List} based {@link Path} implementation.
* *
* @author Christoph Strobl * @author Christoph Strobl
* @since 3.4 * @since 3.4
*/ */
static class SimplePath implements Path { static class SimplePath implements Path {
private List<String> path; private final List<String> path;
SimplePath(List<String> path) { SimplePath(List<String> path) {
this.path = path; this.path = path;
@ -137,10 +142,6 @@ class TypeUnifyingMergeFunction implements BiFunction<Map<String, Object>, Map<S
return new SimplePath(Collections.emptyList()); return new SimplePath(Collections.emptyList());
} }
static SimplePath of(List<String> path) {
return new SimplePath(new ArrayList<>(path));
}
static SimplePath of(List<String> path, String next) { static SimplePath of(List<String> path, String next) {
List<String> fullPath = new ArrayList<>(path.size() + 1); List<String> fullPath = new ArrayList<>(path.size() + 1);
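
For illustration, the merge function unifies 'type' and 'bsonType' hints that carry the same value instead of reporting a conflict (a sketch assuming same-package access, as in the unit tests below; the behavior matches bsonTypeVsJustTypeValueResolutionIsDoneByDefault):

    TypeUnifyingMergeFunction mergeFunction = new TypeUnifyingMergeFunction((path, left, right) -> Resolution.skip());

    Document result = mergeFunction.apply(new Document("type", "string"), new Document("bsonType", "string"));
    // result: {"type": "string"} — the equivalent type hints do not trigger the conflict resolution function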

spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/MappingMongoJsonSchemaCreatorUnitTests.java (87 changed lines)

@ -41,7 +41,6 @@ import org.springframework.data.mongodb.core.mapping.MongoMappingContext;
import org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type; import org.springframework.data.mongodb.core.schema.JsonSchemaObject.Type;
import org.springframework.data.mongodb.core.schema.JsonSchemaProperty; import org.springframework.data.mongodb.core.schema.JsonSchemaProperty;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema; import org.springframework.data.mongodb.core.schema.MongoJsonSchema;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction;
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Resolution; import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Resolution;
import org.springframework.data.spel.spi.EvaluationContextExtension; import org.springframework.data.spel.spi.EvaluationContextExtension;
import org.springframework.data.spel.spi.Function; import org.springframework.data.spel.spi.Function;
@ -52,14 +51,14 @@ import org.springframework.data.spel.spi.Function;
* @author Christoph Strobl * @author Christoph Strobl
* @author Mark Paluch * @author Mark Paluch
*/ */
public class MappingMongoJsonSchemaCreatorUnitTests { class MappingMongoJsonSchemaCreatorUnitTests {
MappingMongoConverter converter; private MappingMongoConverter converter;
MongoMappingContext mappingContext; private MongoMappingContext mappingContext;
MappingMongoJsonSchemaCreator schemaCreator; private MappingMongoJsonSchemaCreator schemaCreator;
@BeforeEach @BeforeEach
public void setUp() { void setUp() {
mappingContext = new MongoMappingContext(); mappingContext = new MongoMappingContext();
converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext); converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext);
@ -67,7 +66,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
} }
@Test // DATAMONGO-1849 @Test // DATAMONGO-1849
public void simpleTypes() { void simpleTypes() {
MongoJsonSchema schema = schemaCreator.createSchemaFor(VariousFieldTypes.class); MongoJsonSchema schema = schemaCreator.createSchemaFor(VariousFieldTypes.class);
@ -75,21 +74,21 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
} }
@Test // DATAMONGO-1849 @Test // DATAMONGO-1849
public void withRemappedIdType() { void withRemappedIdType() {
MongoJsonSchema schema = schemaCreator.createSchemaFor(WithExplicitMongoIdTypeMapping.class); MongoJsonSchema schema = schemaCreator.createSchemaFor(WithExplicitMongoIdTypeMapping.class);
assertThat(schema.toDocument().get("$jsonSchema", Document.class)).isEqualTo(WITH_EXPLICIT_MONGO_ID_TYPE_MAPPING); assertThat(schema.toDocument().get("$jsonSchema", Document.class)).isEqualTo(WITH_EXPLICIT_MONGO_ID_TYPE_MAPPING);
} }
@Test // DATAMONGO-1849 @Test // DATAMONGO-1849
public void cyclic() { void cyclic() {
MongoJsonSchema schema = schemaCreator.createSchemaFor(Cyclic.class); MongoJsonSchema schema = schemaCreator.createSchemaFor(Cyclic.class);
assertThat(schema.toDocument().get("$jsonSchema", Document.class)).isEqualTo(CYCLIC); assertThat(schema.toDocument().get("$jsonSchema", Document.class)).isEqualTo(CYCLIC);
} }
@Test // DATAMONGO-1849 @Test // DATAMONGO-1849
public void converterRegistered() { void converterRegistered() {
MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext); MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mappingContext);
MongoCustomConversions mcc = new MongoCustomConversions( MongoCustomConversions mcc = new MongoCustomConversions(
@ -105,7 +104,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
} }
@Test // GH-3800 @Test // GH-3800
public void csfle/*encryptedFieldsOnly*/() { void csfle/*encryptedFieldsOnly*/() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() // MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.filter(MongoJsonSchemaCreator.encryptedOnly()) // filter non encrypted fields .filter(MongoJsonSchemaCreator.encryptedOnly()) // filter non encrypted fields
@ -116,7 +115,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
} }
@Test // GH-3800 @Test // GH-3800
public void csfleCyclic/*encryptedFieldsOnly*/() { void csfleCyclic/*encryptedFieldsOnly*/() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() // MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.filter(MongoJsonSchemaCreator.encryptedOnly()) // filter non encrypted fields .filter(MongoJsonSchemaCreator.encryptedOnly()) // filter non encrypted fields
@ -127,7 +126,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
} }
@Test // GH-3800 @Test // GH-3800
public void csfleWithKeyFromProperties() { void csfleWithKeyFromProperties() {
GenericApplicationContext applicationContext = new GenericApplicationContext(); GenericApplicationContext applicationContext = new GenericApplicationContext();
applicationContext.registerBean("encryptionExtension", EncryptionExtension.class, () -> new EncryptionExtension()); applicationContext.registerBean("encryptionExtension", EncryptionExtension.class, () -> new EncryptionExtension());
@ -145,7 +144,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
} }
@Test // GH-3800 @Test // GH-3800
public void csfleWithKeyFromMethod() { void csfleWithKeyFromMethod() {
GenericApplicationContext applicationContext = new GenericApplicationContext(); GenericApplicationContext applicationContext = new GenericApplicationContext();
applicationContext.registerBean("encryptionExtension", EncryptionExtension.class, () -> new EncryptionExtension()); applicationContext.registerBean("encryptionExtension", EncryptionExtension.class, () -> new EncryptionExtension());
@ -168,8 +167,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
void shouldAllowToSpecifyPolymorphicTypesForProperty() { void shouldAllowToSpecifyPolymorphicTypesForProperty() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() // MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.specify("objectValue").types(A.class, B.class) .property("objectValue").withTypes(A.class, B.class).createSchemaFor(SomeTestObject.class);
.createSchemaFor(SomeTestObject.class);
Document targetSchema = schema.schemaDocument(); Document targetSchema = schema.schemaDocument();
assertThat(targetSchema) // assertThat(targetSchema) //
@ -182,13 +180,12 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
void shouldAllowToSpecifyNestedPolymorphicTypesForProperty() { void shouldAllowToSpecifyNestedPolymorphicTypesForProperty() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() // MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.specify("value.objectValue").types(A.class, B.class) // .property("value.objectValue").withTypes(A.class, B.class) //
.createSchemaFor(WrapperAroundA.class); .createSchemaFor(WrapperAroundA.class);
Document targetSchema = schema.schemaDocument();
assertThat(schema.schemaDocument()) // assertThat(schema.schemaDocument()) //
.containsEntry("properties.value.properties.objectValue.properties.aNonEncrypted", new Document("type", "string")) // .containsEntry("properties.value.properties.objectValue.properties.aNonEncrypted",
new Document("type", "string")) //
.containsEntry("properties.value.properties.objectValue.properties.aEncrypted", ENCRYPTED_BSON_STRING) // .containsEntry("properties.value.properties.objectValue.properties.aEncrypted", ENCRYPTED_BSON_STRING) //
.containsEntry("properties.value.properties.objectValue.properties.bEncrypted", ENCRYPTED_BSON_STRING); .containsEntry("properties.value.properties.objectValue.properties.bEncrypted", ENCRYPTED_BSON_STRING);
@ -198,8 +195,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
void shouldAllowToSpecifyGenericTypesForProperty() { void shouldAllowToSpecifyGenericTypesForProperty() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() // MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.specify("genericValue").types(A.class, B.class) .property("genericValue").withTypes(A.class, B.class).createSchemaFor(SomeTestObject.class);
.createSchemaFor(SomeTestObject.class);
assertThat(schema.schemaDocument()) // assertThat(schema.schemaDocument()) //
.containsEntry("properties.genericValue.properties.aNonEncrypted", new Document("type", "string")) // .containsEntry("properties.genericValue.properties.aNonEncrypted", new Document("type", "string")) //
@ -211,7 +207,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
void encryptionFilterShouldCaptureSpecifiedPolymorphicTypesForProperty() { void encryptionFilterShouldCaptureSpecifiedPolymorphicTypesForProperty() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() // MongoJsonSchema schema = MongoJsonSchemaCreator.create() //
.specify("objectValue").types(A.class, B.class) // .property("objectValue").withTypes(A.class, B.class) //
.filter(MongoJsonSchemaCreator.encryptedOnly()) // .filter(MongoJsonSchemaCreator.encryptedOnly()) //
.createSchemaFor(SomeTestObject.class); .createSchemaFor(SomeTestObject.class);
@ -224,8 +220,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
@Test // GH-3870 @Test // GH-3870
void allowsToCreateCombinedSchemaWhenPropertiesDoNotOverlap() { void allowsToCreateCombinedSchemaWhenPropertiesDoNotOverlap() {
MongoJsonSchema schema = MongoJsonSchemaCreator.create() MongoJsonSchema schema = MongoJsonSchemaCreator.create().mergedSchemaFor(A.class, B.class, C.class);
.combineSchemaFor(A.class, B.class, C.class);
assertThat(schema.schemaDocument()) // assertThat(schema.schemaDocument()) //
.containsEntry("properties.aNonEncrypted", new Document("type", "string")) // .containsEntry("properties.aNonEncrypted", new Document("type", "string")) //
@ -242,9 +237,9 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
MongoJsonSchema schemaAButDifferent = MongoJsonSchemaCreator.create() // MongoJsonSchema schemaAButDifferent = MongoJsonSchemaCreator.create() //
.createSchemaFor(PropertyClashWithA.class); .createSchemaFor(PropertyClashWithA.class);
MongoJsonSchema targetSchema = schemaA.combineWith(schemaAButDifferent); MongoJsonSchema targetSchema = schemaA.mergeWith(schemaAButDifferent);
assertThatExceptionOfType(IllegalStateException.class).isThrownBy(() -> targetSchema.schemaDocument()); assertThatExceptionOfType(IllegalStateException.class).isThrownBy(targetSchema::schemaDocument);
} }
@Test // GH-3870 @Test // GH-3870
@ -255,18 +250,8 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
MongoJsonSchema schemaAButDifferent = MongoJsonSchemaCreator.create() // MongoJsonSchema schemaAButDifferent = MongoJsonSchemaCreator.create() //
.createSchemaFor(PropertyClashWithA.class); .createSchemaFor(PropertyClashWithA.class);
MongoJsonSchema schema = schemaA.combineWith(Collections.singleton(schemaAButDifferent), (path, a, b) -> new Resolution() { MongoJsonSchema schema = schemaA.mergeWith(Collections.singleton(schemaAButDifferent),
(path, a, b) -> Resolution.ofValue(path, "object"));
@Override
public String getKey() {
return path.currentElement();
}
@Override
public Object getValue() {
return "object";
}
});
assertThat(schema.schemaDocument()) // assertThat(schema.schemaDocument()) //
.containsEntry("properties.aNonEncrypted", new Document("type", "object")); .containsEntry("properties.aNonEncrypted", new Document("type", "object"));
@ -276,13 +261,11 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
void bsonTypeVsJustTypeValueResolutionIsDoneByDefault() { void bsonTypeVsJustTypeValueResolutionIsDoneByDefault() {
MongoJsonSchema schemaUsingType = MongoJsonSchema.builder() MongoJsonSchema schemaUsingType = MongoJsonSchema.builder()
.property(JsonSchemaProperty.named("value").ofType(Type.jsonTypeOf("string"))) .property(JsonSchemaProperty.named("value").ofType(Type.jsonTypeOf("string"))).build();
.build();
MongoJsonSchema schemaUsingBsonType = MongoJsonSchema.builder() MongoJsonSchema schemaUsingBsonType = MongoJsonSchema.builder()
.property(JsonSchemaProperty.named("value").ofType(Type.bsonTypeOf("string"))) .property(JsonSchemaProperty.named("value").ofType(Type.bsonTypeOf("string"))).build();
.build();
MongoJsonSchema targetSchema = MongoJsonSchema.combined(schemaUsingType, schemaUsingBsonType); MongoJsonSchema targetSchema = MongoJsonSchema.merge(schemaUsingType, schemaUsingBsonType);
assertThat(targetSchema.schemaDocument()) // assertThat(targetSchema.schemaDocument()) //
.containsEntry("properties.value", new Document("type", "string")); .containsEntry("properties.value", new Document("type", "string"));
@ -292,7 +275,7 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
// --> ENUM // --> ENUM
static final String JUST_SOME_ENUM = "{ 'type' : 'string', 'enum' : ['ONE', 'TWO'] }"; private static final String JUST_SOME_ENUM = "{ 'type' : 'string', 'enum' : ['ONE', 'TWO'] }";
enum JustSomeEnum { enum JustSomeEnum {
ONE, TWO ONE, TWO
@ -657,14 +640,15 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
} }
} }
static final Document ENCRYPTED_BSON_STRING = Document.parse("{'encrypt': { 'bsonType': 'string','algorithm': 'AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic'} }"); private static final Document ENCRYPTED_BSON_STRING = Document
.parse("{'encrypt': { 'bsonType': 'string','algorithm': 'AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic'} }");
static class SomeTestObject<T> { static class SomeTestObject<T> {
T genericValue; T genericValue;
Object objectValue; Object objectValue;
} }
static class RootWithGenerics<S,T> { static class RootWithGenerics<S, T> {
S sValue; S sValue;
T tValue; T tValue;
} }
@ -686,18 +670,15 @@ public class MappingMongoJsonSchemaCreatorUnitTests {
String aNonEncrypted; String aNonEncrypted;
@Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic") @Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic") String aEncrypted;
String aEncrypted;
} }
static class B { static class B {
@Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic") @Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic") String bEncrypted;
String bEncrypted;
} }
static class C extends A { static class C extends A {
@Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic") @Encrypted(algorithm = "AEAD_AES_256_CBC_HMAC_SHA_512-Deterministic") String cEncrypted;
String cEncrypted;
} }
static class PropertyClashWithA { static class PropertyClashWithA {

spring-data-mongodb/src/test/java/org/springframework/data/mongodb/core/schema/TypeUnifyingMergeFunctionUnitTests.java (14 changed lines)

@ -33,6 +33,8 @@ import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictReso
import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Resolution; import org.springframework.data.mongodb.core.schema.MongoJsonSchema.ConflictResolutionFunction.Resolution;
/** /**
* Unit tests for {@link TypeUnifyingMergeFunction}.
*
* @author Christoph Strobl * @author Christoph Strobl
*/ */
@ExtendWith(MockitoExtension.class) @ExtendWith(MockitoExtension.class)
@ -101,17 +103,7 @@ public class TypeUnifyingMergeFunctionUnitTests {
ArgumentCaptor<Object> aValueCaptor = ArgumentCaptor.forClass(Object.class); ArgumentCaptor<Object> aValueCaptor = ArgumentCaptor.forClass(Object.class);
ArgumentCaptor<Object> bValueCaptor = ArgumentCaptor.forClass(Object.class); ArgumentCaptor<Object> bValueCaptor = ArgumentCaptor.forClass(Object.class);
when(crf.resolveConflict(any(), aValueCaptor.capture(), bValueCaptor.capture())).thenReturn(new Resolution() { when(crf.resolveConflict(any(), aValueCaptor.capture(), bValueCaptor.capture())).thenReturn(Resolution.ofValue("nested", "from-function"));
@Override
public String getKey() {
return "nested";
}
@Override
public Object getValue() {
return "from-function";
}
});
Map<String, Object> a = new LinkedHashMap<>(); Map<String, Object> a = new LinkedHashMap<>();
a.put("nested", Collections.singletonMap("a", "a-value")); a.put("nested", Collections.singletonMap("a", "a-value"));

src/main/asciidoc/reference/mongo-json-schema.adoc (38 changed lines)

@ -192,25 +192,25 @@ unless there is more specific information available via the `@MongoId` annotatio
The above example demonstrated how to derive the schema from a very precise typed source. The above example demonstrated how to derive the schema from a very precise typed source.
Using polymorphic elements within the domain model can lead to inaccurate schema representation for `Object` and generic `<T>` types, which are likely to represented as `{ type : 'object' }` without further specification. Using polymorphic elements within the domain model can lead to inaccurate schema representation for `Object` and generic `<T>` types, which are likely to represented as `{ type : 'object' }` without further specification.
`MongoJsonSchemaCreator.specify(...)` allows to define additional types that should be considered when rendering the schema. `MongoJsonSchemaCreator.property(…)` allows defining additional details such as nested document types that should be considered when rendering the schema.
.Specify additional types for properties .Specify additional types for properties
==== ====
[source,java] [source,java]
---- ----
public class Root { class Root {
Object value; Object value;
} }
public class A { class A {
String aValue; String aValue;
} }
public class B { class B {
String bValue; String bValue;
} }
MongoJsonSchemaCreator.create() MongoJsonSchemaCreator.create()
.specify("value").types(A.class, B.class) <1> .property("value").withTypes(A.class, B.class) <1>
---- ----
[source,json] [source,json]
@ -220,7 +220,7 @@ MongoJsonSchemaCreator.create()
'properties' : { 'properties' : {
'value' : { 'value' : {
'type' : 'object', 'type' : 'object',
'properties' : { <1> 'properties' : { <1>
'aValue' : { 'type' : 'string' }, 'aValue' : { 'type' : 'string' },
'bValue' : { 'type' : 'string' } 'bValue' : { 'type' : 'string' }
} }
@ -228,30 +228,30 @@ MongoJsonSchemaCreator.create()
} }
} }
---- ----
<1> Properties of the given types are combined into one element. <1> Properties of the given types are merged into one element.
==== ====
MongoDBs schema free approach allows to store documents of different structure in one collection. MongoDBs schema-free approach allows storing documents of different structure in one collection.
Those may be modeled having a common base class. Those may be modeled having a common base class.
Regardless of the chosen approach `MongoJsonSchemaCreator.combine(...)` is can help circumvent the need of combining multiple schema into one. Regardless of the chosen approach, `MongoJsonSchemaCreator.merge(…)` can help circumvent the need of merging multiple schema into one.
.Combining multiple Schemas .Merging multiple Schemas into a single Schema definition
==== ====
[source,java] [source,java]
---- ----
public abstract class Root { abstract class Root {
String rootValue; String rootValue;
} }
public class A extends Root { class A extends Root {
String aValue; String aValue;
} }
public class B extends Root { class B extends Root {
String bValue; String bValue;
} }
MongoJsonSchemaCreator.combined(A.class, B.class) <1> MongoJsonSchemaCreator.mergedSchemaFor(A.class, B.class) <1>
---- ----
[source,json] [source,json]
@ -271,17 +271,17 @@ MongoJsonSchemaCreator.combined(A.class, B.class) <1>
[NOTE] [NOTE]
==== ====
Equally named properties need to refer to the same json schema in order to be combined. Properties with the same name need to refer to the same JSON schema in order to be combined.
The following example shows a definition that cannot be combined automatically because of a data type mismatch. The following example shows a definition that cannot be merged automatically because of a data type mismatch.
In this case a `ConflictResolutionFunction` has to be provided to `MongoJsonSchemaCreator`. In this case a `ConflictResolutionFunction` must be provided to `MongoJsonSchemaCreator`.
[source,java] [source,java]
---- ----
public class A extends Root { class A extends Root {
String value; String value;
} }
public class B extends Root { class B extends Root {
Integer value; Integer value;
} }
---- ----
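
For such a mismatch, one possible resolution (not part of the committed documentation) is to merge the two schemas with a ConflictResolutionFunction that picks a value for the clashing path, for example via the new Resolution.ofValue(…) — shown here on the MongoJsonSchema level, mirroring the unit tests in this commit:

    MongoJsonSchema schemaA = MongoJsonSchemaCreator.create().createSchemaFor(A.class);
    MongoJsonSchema schemaB = MongoJsonSchemaCreator.create().createSchemaFor(B.class);

    MongoJsonSchema resolved = schemaA.mergeWith(Collections.singleton(schemaB),
            (path, left, right) -> Resolution.ofValue(path, "object")); // force the clashing 'value' property to 'object'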
