This commit refactors things so that the nio APIs are located primarily
in the nio package. Because the nio keys are a first-class sbt feature,
I had to add `import sbt.nio._` and `import sbt.nio.Keys._` to the auto-imports in
BuildUtil.scala.
In my recent changes to watch, I have been moving towards a world in
which sbt manages the file inputs and outputs at the task level. The
main idea is that we want to enable a user to specify the inputs and
outputs of a task and have sbt track those inputs across
multiple task evaluations. sbt should be able to automatically trigger a
build when the inputs change, and it should also be able to avoid task
evaluation if none of the inputs have changed.
The former case of having sbt automatically watch the file inputs of a
task has been present since watch was refactored. In this commit, I
make it possible for the user to retrieve the lists of new, modified and
deleted files. The user can then avoid task evaluation if none of the
inputs have changed.
To implement this, I inject a number of new settings during project
load if the fileInputs setting is defined for a task. The injected
settings are:
* `allPathsAndAttributes` -- retrieves all of the paths described by the fileInputs for the task, along with their attributes
* `fileStamps` -- retrieves the file stamps for the files returned by `allPathsAndAttributes`
Using these two injected tasks, I also inject a number of derived tasks,
such as allFiles, which returns all of the regular files returned by
allPathsAndAttributes, and changedFiles, which returns all of the regular
files that have been modified since the last run.
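For reference, the injected keys might look roughly like this (the names come from this commit, but the types and signatures below are illustrative assumptions rather than sbt's actual definitions):

```scala
import java.nio.file.Path
import sbt._

// Illustrative shapes only -- the real keys are injected by sbt at project load, and
// their precise attribute/stamp types live in sbt/io and zinc.
object InjectedKeysSketch {
  final case class Attributes(isRegularFile: Boolean, isDirectory: Boolean)
  final case class Stamp(hash: String)

  val allPathsAndAttributes = taskKey[Seq[(Path, Attributes)]](
    "All paths described by fileInputs for the task, along with their attributes")
  val fileStamps = taskKey[Seq[(Path, Stamp)]](
    "Stamps for the files returned by allPathsAndAttributes")
  val allFiles = taskKey[Seq[Path]]("All regular files returned by allPathsAndAttributes")
  val changedFiles = taskKey[Seq[Path]]("Regular files modified since the last run")
}
```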
Using these injected settings, the user is able to write tasks that
avoid evaluation if the inputs haven't changed.
foo / fileInputs += baseDirectory.value.toGlob / ** / "*.scala"
foo := {
  foo.previous match {
    case Some(p) if (foo / changedFiles).value.isEmpty => p
    case _ => fooImpl((foo / allFiles).value)
  }
}
To make this whole mechanism work, I add a private task key:
val fileAttributeMap = taskKey[java.util.HashMap[Path, Stamp]]("...")
This keeps track of the stamps for all of the files that are managed by
sbt. The fileStamps task first looks for the stamp in the attribute map
and updates the cache only if the stamp is not present. This
ensures that a given file is stamped at most once per task
evaluation run, no matter how the file inputs are specified. Moreover, in
a continuous build, I'm able to reuse the attribute map, which can
significantly reduce latency because the default file stamping
implementation used by zinc is fairly expensive (it can take anywhere
between 300 and 1500ms to stamp 5000 8kB source files on my mac).
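A minimal sketch of that stamp-once-per-run behavior, assuming a shared map keyed by Path (the real Stamp type and stamping function come from zinc and are abstracted here as parameters):

```scala
import java.nio.file.Path
import java.util.concurrent.ConcurrentHashMap

// Each file is stamped at most once per evaluation run; a continuous build can keep
// reusing the same cache instance and only invalidate the paths reported by the watcher.
final class FileStampCache[Stamp](doStamp: Path => Stamp) {
  private val map = new ConcurrentHashMap[Path, Stamp]
  def stamp(path: Path): Stamp = map.computeIfAbsent(path, p => doStamp(p))
  def invalidate(path: Path): Unit = map.remove(path)
}
```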
I also renamed some of the watch-related keys to be a bit clearer.
The newest version of io repackages a number of classes into the
sbt.nio.* packages. It also changes some of the semantics of the
glob-related APIs. This commit updates all of the usages of the updated
APIs within sbt, but should introduce no functional difference.
I decided that it makes sense to move all of the new watch code out of
the Watched companion object, since the Watched trait itself is now
deprecated. I don't really like having the new code in Watched.scala
mixed with the legacy code, so I pulled it all out and moved it into the
Watch object. Since we have to put all of the logic for the Continuous
object in main in order to access the sbt.Keys object, it makes sense to
move the logic out of main-command and into command so that most of the
watch-related logic is in the same subproject.
In order to speed up the start-up time of the test and run tasks, I'm
introducing a ClassLoaderCache that can be used to avoid reloading the
classes in the project dependencies (which include the scala library).
I made the API as minimal as possible so that we can iterate on the
implementation without breaking binary compatibility. This feature will
be gated behind a feature flag, so I'm not concerned with the cached class
loaders being usable in every user configuration. Over time, I hope
that the CachedClassLoaders will be a drop-in replacement for the
existing one-off class loaders*.
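As a rough sketch of the "minimal API" idea (this is illustrative, not the actual interface in sbt):

```scala
import java.io.File

// Hypothetical minimal surface: callers only ask for a loader for a classpath/parent pair;
// creation, reuse and eviction stay behind the interface so the implementation can evolve
// without breaking binary compatibility.
trait ClassLoaderCache extends AutoCloseable {
  def apply(classpath: Seq[File], parent: ClassLoader): ClassLoader
}
```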
The LayeredClassLoader was adapted from the NativeCopyLoader. The main
difference is that the NativeCopyLoader extracts named shared libraries
into the task temp directory to ensure that the ephemeral libraries are
deleted after each task run. This is a problem if we are caching the
ClassLoader, so for LayeredClassLoader I track the native libraries that
are extracted by the loader and delete them either when the loader is
explicitly closed or in a shutdown hook.
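A simplified sketch of that cleanup strategy (not the actual LayeredClassLoader; the extraction logic is elided and only the tracking and deletion are shown):

```scala
import java.net.{ URL, URLClassLoader }
import java.nio.file.{ Files, Path }
import scala.collection.mutable

// Tracks the shared libraries extracted by the loader and deletes them when the loader
// is explicitly closed, or at JVM shutdown if it never is.
class NativeTrackingLoader(urls: Array[URL], parent: ClassLoader)
    extends URLClassLoader(urls, parent) {
  private val extracted = mutable.Set.empty[Path]
  private val hook = new Thread(() => deleteAll())
  Runtime.getRuntime.addShutdownHook(hook)

  // called by the (elided) findLibrary override after it extracts a native library
  protected def record(lib: Path): Unit = extracted.synchronized { extracted += lib }

  private def deleteAll(): Unit = extracted.synchronized {
    extracted.foreach(Files.deleteIfExists(_))
    extracted.clear()
  }

  override def close(): Unit = {
    deleteAll()
    try Runtime.getRuntime.removeShutdownHook(hook)
    catch { case _: IllegalStateException => () } // JVM already shutting down
    super.close()
  }
}
```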
* This of course means both that we must layer the class loaders
appropriately, so that the project code is in a layer above the cached
loaders, and that we must correctly invalidate the cache when the project
or its dependencies are updated.
I am going to add a classloader cache to improve the startup performance
of the run and test tasks. To prevent the classloader cache from having
unbounded size, I'm adding a simple LRUCache implementation to sbt. An
important characteristic of the implementation is that when
entries are evicted, we run a callback to clean up the entry. This allows
us to automatically clean up any resources created by the entry.
This is a pretty naive implementation that uses an array of entries that
it manipulates as elements are removed or accessed. In general, I expect
these caches to be quite small (<= 4 elements), so the storage overhead and
performance of the simple implementation should be quite good. If
performance ever becomes an issue, we can specialize LRUCache.apply to
use a different implementation for caches with large limits.
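To illustrate the behavior (not the actual sbt code, which manipulates a plain array of entries), a minimal LRU cache with an eviction callback might look like this:

```scala
import scala.collection.mutable

// Entries are kept in access order; when the limit is exceeded, the least recently used
// entry is evicted and the callback runs so its resources (e.g. a classloader) can be closed.
final class LRUCache[K, V](limit: Int, onEvict: (K, V) => Unit) {
  private val entries = mutable.LinkedHashMap.empty[K, V]

  def get(key: K): Option[V] = entries.synchronized {
    entries.remove(key).map { v => entries.put(key, v); v } // re-insert as most recent
  }

  def put(key: K, value: V): Unit = entries.synchronized {
    entries.remove(key).foreach(onEvict(key, _)) // a replaced value is cleaned up too
    entries.put(key, value)
    while (entries.size > limit) {
      val (k, v) = entries.head // least recently used
      entries.remove(k)
      onEvict(k, v)
    }
  }
}
```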
It drives me crazy that in IntelliJ, when I use the Go to Class action,
TestBuild.Keys comes up before Keys. Given how central Keys is to sbt,
it doesn't seem like a good idea to alias that particular class name.
Fixes #4293
Ref #4231, #4065
This fixes the regression in sbt 1.2.0 that displays a lot of warnings about configurations.
The warning was added in #4231 in an attempt to fix #4065. It actually highlights somewhat loose usage of configurations among the builds in the wild, as well as the limitations of the current slash syntax implementation.
I think we can remove this warning for now, and try to fix #4065 by making the slash syntax more robust. In particular, we need to remember the mapping between configuration names and their Scala identifiers across the entire build, and use that in the shell.
Fixes https://github.com/sbt/sbt/issues/1502
This adds the `--addPluginSbtFile=<file>` command, which adds the given .sbt file to the plugin build.
Using this mechanism, editors or IDEs can start a build with a required plugin.
```
$ cat /tmp/extra.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.7")
$ sbt --addPluginSbtFile=/tmp/extra.sbt
...
sbt:helloworld> plugins
In file:/xxxx/hellotest/
...
sbtassembly.AssemblyPlugin: enabled in root
```
This is an implementation of the `textDocument/definition` request.
It supports types only, and only when the type is found in the Zinc Analysis. When the source(s) are found, the editor opens the potential source(s).
This simple implementation does not use semantic data.
During the processing of `textDocument/didSave`, we start collecting the locations of Analysis files via `lspCollectAnalyses`.
Later on, when the user asks for `textDocument/definition`, sbt server invokes a Future call to `lspDefinition`, which directly reads those files to locate the definition of a class.
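The flow can be pictured roughly like this (a sketch with the zinc specifics abstracted away; `readAnalysis` and `locate` are hypothetical stand-ins, not sbt or zinc APIs):

```scala
import java.net.URI
import java.nio.file.Path
import scala.collection.mutable
import scala.concurrent.{ ExecutionContext, Future }

final class DefinitionIndex[A](readAnalysis: Path => A, locate: (A, String) => Seq[URI]) {
  private val analysisLocations = mutable.Set.empty[Path]

  // textDocument/didSave: remember where the Analysis files live
  def collectAnalyses(location: Path): Unit =
    analysisLocations.synchronized { analysisLocations += location }

  // textDocument/definition: read the collected Analysis files off the calling thread and
  // resolve the symbol under the cursor to the sources that define it
  def definition(symbol: String)(implicit ec: ExecutionContext): Future[Seq[URI]] =
    Future {
      val locations = analysisLocations.synchronized(analysisLocations.toSeq)
      locations.flatMap(p => locate(readAnalysis(p), symbol))
    }
}
```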
```
Provided by:
ProjectRef(uri("...."), "root") / Test / test
Dependencies:
Test / executeTests
Test / test / streams
Test / state
Test / test / testResultLogger
```