Def.make could take 10099ms for 100 subprojects. This change displays
logs, likely for projects with more than 10 subprojects, which might
pause for a few seconds during load.
When running an sbt script, this change lets users on UNIX and Windows
platforms use native file extensions (no extension or .sh on UNIX,
.bat or .cmd on Windows). The code copies the file to the sbt
boot/hash/src_managed directory with a .scala extension.
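For illustration, a script named `hello.sh` on UNIX (or `hello.bat`/`hello.cmd`
on Windows) might look like the sketch below. The `/*** ... */` header is the
usual sbt script convention for embedding build settings; the `scalas` shebang
refers to the conscript-installed script runner, and the exact invocation
depends on the installation, so treat those details as assumptions.

```scala
#!/usr/bin/env scalas
// hello.sh -- the body is ordinary Scala. sbt copies this file into the
// boot/hash/src_managed directory as a .scala source before compiling
// and running it.

/***
scalaVersion := "2.11.8"
*/

println("Hello from an sbt script")
```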
It turns out that we can leverage the `FakeResolver` that has been
implemented for use with the static launcher, and resolve a "fake
compiler bridge" using it, rather than copying it from the resources.
This also has the advantage of not requiring changes to the build
definition.
This commit introduces a new "static" launcher that does not use Ivy to
gather all the artifacts that it requires, but rather expects them to be
immediately available.
To be able to use sbt without Internet access, we add a new
`ComponentCompiler` that is able to retrieve the bridge sources from the
resources on the classpath and compile them.
sbt's shell provided completion only for keys that were relative to a
defined project, but didn't provide completion for keys that belong only
to the build definition.
This commit fixes this issue by defining a new kind of `Parser` (from
which completions are generated) which runs its input simultaneously on
distinct parsers. We now define a parser for project-level keys and
another parser for build-level keys. These two parsers are eventually
combined, and we get the completions of both parsers.
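The actual change introduces a dedicated combinator for this, but the effect
can be sketched with sbt's standard parser combinators, where alternatives
also merge their completions; the key names below are purely illustrative.

```scala
import sbt.complete.DefaultParsers._
import sbt.complete.Parser

// Illustrative key names only.
val projectKeys: Parser[String] = token("compile" | "test")
val buildKeys: Parser[String]   = token("sbtVersion" | "onLoad")

// Running the input against both alternatives means the suggested
// completions are the union of what each parser offers.
val allKeys: Parser[String] = projectKeys | buildKeys

// e.g. Parser.completions(allKeys, "", 1) lists suggestions from both sides.
```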
Fixes sbt/sbt#2460
Fixes #2464 and #2465
appResolvers is the set of resolvers specified in the launcher
configuration. This list fluctuates depending on the version of sbt, and
sbt 0.13.10 meant to stabilize it by weeding out JCenter even when the
launcher configuration includes it, but that failed because the filter
was applied to the wrong list. This should correct it.
We are holding back a bug fix for global settings that were not
importing auto imports, because fixing this via sbt/sbt#2399 breaks
source compatibility: sbt/sbt#2415
I've split out the relevant portion of the scripted test into
global-settings, and marked it pending.
This addresses a 0.13.10 regression, which currently warns users about
Maven incompatibility on a private configuration. This adds a config
class so the build user can control the level of the warning as well as
the target configurations to be monitored.
By default, we are only going to look at `Compile` and `Runtime`.
Previously, the autoimports for globally defined plugins were not
imported for global configuration files, although they were imported for
project configuration files.
This patch causes an additional plugin discovery phase to happen during
global config evaluation, so that auto-plugins can be detected and their
imports subsequently included.
Adds `trackInternalDependencies` and `exportToInternal` settings. These
can be used to control whether to trigger compilation of dependent
subprojects when you call `compile`. Both keys take one of three
values: `TrackLevel.NoTracking`, `TrackLevel.TrackIfMissing`, and
`TrackLevel.TrackAlways`. By default they are both set to
`TrackLevel.TrackAlways`.
When `trackInternalDependencies` is set to `TrackLevel.TrackIfMissing`,
sbt will no longer try to compile internal (inter-project) dependencies
automatically, unless there are no `*.class` files (or JAR file when
`exportJars` is `true`) in the output directory. When the setting is
set to `TrackLevel.NoTracking`, the compilation of internal
dependencies will be skipped. Note that the classpath will still be
appended, and the dependency graph will still show them as dependencies.
The motivation is to save the I/O overhead of checking for changes in a
build with many subprojects during development. Here's how to set all
subprojects to `TrackIfMissing`:
```scala
lazy val root = (project in file(".")).
  aggregate(....).
  settings(
    inThisBuild(Seq(
      trackInternalDependencies := TrackLevel.TrackIfMissing,
      exportJars := true
    ))
  )
```
The `exportToInternal` setting allows the dependee subprojects to opt
out of the internal tracking, which might be useful if you want to
track most subprojects except for a few. The intersection of the
`trackInternalDependencies` and `exportToInternal` settings will be
used to determine the actual track level. Here's an example of opting
out one project:
```scala
lazy val dontTrackMe = (project in file("dontTrackMe")).
  settings(
    exportToInternal := TrackLevel.NoTracking
  )
```
As described in #2336, I noticed that when using a 0.13 nightly from
Bintray, sbt was unable to locate the compiler source.
Since `updateSbtClassifiers` is already set up to download sbt's own
sources, the `ivyConfiguration` should be reused. However, `compilers`
is a derived task, which is unable to depend on a scoped key.
To work around this, I had to create a new key called
`bootIvyConfiguration`. This should now use the metabuild's resolvers
to download the compiler bridge source.
Adds a new setting `useJCenter`, which is set to `false` by default.
When set to `true`, JCenter will be placed as the first external
resolver to find library dependencies.
The implementation of `externalResolvers` is changed to incorporate the
setting by calling `Resolver.reorganizeAppResolvers`. These changes
were required because `externalResolvers` uses whatever is in the
launcher configuration, which the build user may not have upgraded.
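For example, a build that wants JCenter searched first could set, in build.sbt:

```scala
// Opt in to resolving library dependencies from JCenter first
// (the setting is false by default).
useJCenter := true
```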
Even though it's not really used, updateClassifiers constructs a
dependency graph based on the result from update.
The direct cause of #2264 came from the fact that the `allModules`
returned from ConfigurationReport did not include dependency
configurations. For example, it returned "compile" instead of
"compile->runtime". I identified that in #2264, and it was fixed by
@Duhemm in f49fb33e6d.
Martin identified that the fix still does not address the fact that
updateClassifiers hardcodes the classifiers to be tried. This commit
adds fallback behavior: for Ivy-published modules it will use the
explicit list of artifacts, and for others it will fall back to the
hardcoded list of classifiers.
Note that they won't be downloaded again, because the component compiler
will look for a previously-compiled version of the compiler bridge
before trying to fetch the sources again. If they've already been
downloaded, then they have been compiled and a compiled version of the
compiler bridge already exists.
In order to restore reproducibility of builds, we no longer cascade over
the possibly available versions of the compiler bridge sources (a
specific version of the bridge sources may be unavailable one day but
exist the next), but rather let the build definition configure
which module to use.
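For example, later sbt versions expose a `scalaCompilerBridgeSource` setting
for this purpose; the module coordinates below are illustrative only, not the
actual defaults.

```scala
// Pin the compiler bridge source module explicitly so the build does not
// depend on whichever bridge versions happen to be resolvable on a given day.
// Coordinates are illustrative.
scalaCompilerBridgeSource :=
  ("org.scala-sbt" % "compiler-interface" % "0.13.13" % "component").sources()
```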
Fixes sbt/sbt#2196
This makes the functionality that `SettingsDefinition` provides in
`Project#settings` (specifying either a bare `Setting[_]` or a
`Seq[Setting[_]]`) available outside of a project's settings, for
instance when defining a val.
In short, it allows:
```scala
val modelSettings = Def.settings(
  sharedSettings,
  libraryDependencies += foo
)
```
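The resulting `Seq[Setting[_]]` can then be passed to a project as usual,
for example:

```scala
lazy val model = (project in file("model")).settings(modelSettings)
```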