mirror of https://github.com/sbt/sbt.git
Major / important fix in dependency management handling
Version and scope from dependency management now override those from the dependency section no matter what. For Spark 1.3 (a bit old now, but unit-tested in CentralTests), this makes spark-core depend on hadoop 1.x by default (because of its dependency management) instead of hadoop 2.2, whose versions appear in its dependency sections (but are now overridden by those in dependency management). Enabling the hadoop-2.2 profile restores the former hadoop 2.2 versions. This commit breaks the Spark-related test in CentralTests; it is repaired by the next two commits.
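The override rule described above can be sketched as follows. This is a minimal self-contained illustration, not the project's actual code: `Dep` and `applyDepMgmt` are hypothetical names standing in for the real resolution types. Version and config from dependency management win unconditionally when non-empty, while exclusions only fill in when the dependency declares none.

```scala
// Hypothetical simplified model of a dependency (not the real API).
final case class Dep(
  version: String,
  config: String,
  exclusions: Set[String],
  optional: Boolean
)

// Apply a dependency-management entry `mgmt`, if any, to `dep0`.
def applyDepMgmt(dep0: Dep, mgmt: Option[Dep]): Dep =
  mgmt.fold(dep0) { m =>
    var dep = dep0
    if (m.version.nonEmpty)      // version overridden no matter what
      dep = dep.copy(version = m.version)
    if (m.config.nonEmpty)       // scope/config overridden no matter what
      dep = dep.copy(config = m.config)
    if (dep.exclusions.isEmpty)  // exclusions only fill a gap
      dep = dep.copy(exclusions = m.exclusions)
    if (m.optional)
      dep = dep.copy(optional = m.optional)
    dep
  }
```

With a management entry pinning version `1.0.4`, a dependency declared at `2.2.0` resolves to `1.0.4`, mirroring the spark-core / hadoop behavior described above.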
This commit is contained in:
parent
ffd58b51b0
commit
5146be5c39
@@ -239,13 +239,23 @@ object Resolution {
        var dep = dep0

        for ((mgmtConfig, mgmtDep) <- dict.get(DepMgmt.key(dep0))) {
-         if (dep.version.isEmpty)
+         if (mgmtDep.version.nonEmpty)
            dep = dep.copy(version = mgmtDep.version)
-         if (config.isEmpty)
+         if (mgmtConfig.nonEmpty)
            config = mgmtConfig

+         // FIXME The version and scope/config from dependency management, if any, are substituted
+         // no matter what. The same is not done for the exclusions and optionality, for a lack of
+         // way of distinguishing empty exclusions from no exclusion section and optional set to
+         // false from no optional section in the dependency management for now.
+
          if (dep.exclusions.isEmpty)
            dep = dep.copy(exclusions = mgmtDep.exclusions)

          if (mgmtDep.optional)
            dep = dep.copy(optional = mgmtDep.optional)
        }

        (config, dep)