Add support for managed task inputs

In my recent changes to watch, I have been moving towards a world in
which sbt manages the file inputs and outputs at the task level. The
main idea is that we want to enable a user to specify the inputs and
outputs of a task and have sbt track those inputs across multiple task
evaluations. Sbt should be able to automatically trigger a build when
the inputs change, and it should also be able to skip task evaluation
if none of the inputs have changed.

The former case of having sbt automatically watch the file inputs of a
task has been present since watch was refactored. In this commit, I
make it possible for the user to retrieve the lists of new, modified and
deleted files. The user can then avoid task evaluation if none of the
inputs have changed.

To implement this, I inject a number of new settings during project
load if the fileInputs setting is defined for a task. The injected
settings are:

allPathsAndAttributes -- this retrieves all of the paths described by
  the fileInputs for the task along with their attributes
fileStamps -- this retrieves all of the file stamps for the files
  returned by allPathsAndAttributes

Using these two injected tasks, I also inject a number of derived tasks,
such as allFiles, which returns all of the regular files returned by
allPathsAndAttributes, and changedFiles, which returns all of the
regular files that have been modified since the last run.
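The changedFiles computation boils down to diffing the stamp maps from
the previous and current runs. Here is a minimal self-contained sketch
of that diff; FileChangeSketch, Stamp and FileChanges are illustrative
stand-ins for this commit message, not sbt's actual sbt.nio types:

```scala
import java.nio.file.{ Path, Paths }

object FileChangeSketch {
  // Stand-in for sbt.nio.FileStamp: any comparable fingerprint works.
  type Stamp = String

  final case class FileChanges(created: Set[Path], modified: Set[Path], deleted: Set[Path]) {
    def hasChanges: Boolean = created.nonEmpty || modified.nonEmpty || deleted.nonEmpty
  }

  // Diff the stamps recorded on the previous run against the current run:
  // new keys are creations, missing keys are deletions, and shared keys
  // with differing stamps are modifications.
  def changes(previous: Map[Path, Stamp], current: Map[Path, Stamp]): FileChanges =
    FileChanges(
      created = current.keySet -- previous.keySet,
      modified = (current.keySet intersect previous.keySet).filter(p => previous(p) != current(p)),
      deleted = previous.keySet -- current.keySet
    )
}
```

Returning sets keeps the result independent of traversal order. A task
whose stamp diff has hasChanges == false can safely return its previous
value.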

Using these injected settings, the user is able to write tasks that
avoid evaluation if the inputs haven't changed.

foo / fileInputs += baseDirectory.value.toGlob / ** / "*.scala"
foo := {
  foo.previous match {
    case Some(p) if (foo / changedFiles).value.isEmpty => p
    case _ => fooImpl((foo / allFiles).value)
  }
}

To make this whole mechanism work, I add a private task key:
val fileAttributeMap = taskKey[java.util.HashMap[Path, Stamp]]("...")
This keeps track of the stamps for all of the files that are managed by
sbt. The fileStamps task first looks for the stamp in the attribute
map and updates the cache only if it is not present. This ensures
that a given file is stamped at most once per task evaluation run, no
matter how the file inputs are specified. Moreover, in
a continuous build, I'm able to reuse the attribute map which can
significantly reduce latency because the default file stamping
implementation used by zinc is fairly expensive (it can take anywhere
between 300-1500ms to stamp 5000 8kB source files on my Mac).
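The once-per-run guarantee is a memoize-on-miss lookup against that
map. A minimal sketch of the pattern follows; StampCache and its MD5
stamper are illustrative stand-ins, since the real logic lives behind
the private fileAttributeMap key and zinc's stampers:

```scala
import java.nio.file.{ Files, Path }
import java.security.MessageDigest
import java.util.concurrent.ConcurrentHashMap

object StampCache {
  type Stamp = String

  // Plays the role of the private fileAttributeMap task key.
  private val cache = new ConcurrentHashMap[Path, Stamp]()

  // Counts actual stamping work so the effect of the cache is observable.
  @volatile var stampCount: Int = 0

  // Hashing file contents is the expensive step that should happen at
  // most once per evaluation run, however many globs cover the file.
  private def computeStamp(path: Path): Stamp = {
    stampCount += 1
    MessageDigest
      .getInstance("MD5")
      .digest(Files.readAllBytes(path))
      .map("%02x".format(_))
      .mkString
  }

  // Look for the stamp in the map first; stamp only on a miss.
  def stamp(path: Path): Stamp =
    cache.computeIfAbsent(path, (p: Path) => computeStamp(p))

  // Between evaluation runs (outside a continuous build) the map would
  // be invalidated so that file changes are picked up.
  def clear(): Unit = cache.clear()
}
```

In a continuous build, skipping clear() between iterations is what
makes reusing the attribute map cheap: unchanged files are never
re-hashed.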

I also renamed some of the watch-related keys to make them clearer.
This commit is contained in:
Ethan Atkins 2019-04-23 17:00:13 -07:00
parent ba1f690bba
commit 72df8f674c
71 changed files with 1129 additions and 796 deletions

@@ -718,6 +718,18 @@ lazy val sbtIgnoredProblems = {
exclude[ReversedMissingMethodProblem]("sbt.Import.sbt$Import$_setter_$WatchSource_="),
exclude[ReversedMissingMethodProblem]("sbt.Import.WatchSource"),
exclude[ReversedMissingMethodProblem]("sbt.Import.AnyPath"),
exclude[ReversedMissingMethodProblem]("sbt.Import.sbt$Import$_setter_$**_="),
exclude[ReversedMissingMethodProblem]("sbt.Import.sbt$Import$_setter_$*_="),
exclude[ReversedMissingMethodProblem]("sbt.Import.sbt$Import$_setter_$AnyPath_="),
exclude[ReversedMissingMethodProblem]("sbt.Import.sbt$Import$_setter_$Glob_="),
exclude[ReversedMissingMethodProblem]("sbt.Import.sbt$Import$_setter_$RecursiveGlob_="),
exclude[ReversedMissingMethodProblem]("sbt.Import.sbt$Import$_setter_$RelativeGlob_="),
exclude[ReversedMissingMethodProblem]("sbt.Import.*"),
exclude[ReversedMissingMethodProblem]("sbt.Import.**"),
exclude[ReversedMissingMethodProblem]("sbt.Import.RecursiveGlob"),
exclude[ReversedMissingMethodProblem]("sbt.Import.Glob"),
exclude[ReversedMissingMethodProblem]("sbt.Import.RelativeGlob"),
// Dropped in favour of kind-projector's polymorphic lambda literals
exclude[DirectMissingMethodProblem]("sbt.Import.Param"),
exclude[DirectMissingMethodProblem]("sbt.package.Param"),

@@ -1,25 +0,0 @@
/*
* sbt
* Copyright 2011 - 2018, Lightbend, Inc.
* Copyright 2008 - 2010, Mark Harrah
* Licensed under Apache License 2.0 (see LICENSE)
*/
package sbt.internal.util.appmacro
import scala.reflect.macros.blackbox
object MacroDefaults {
/**
* Macro to generate the default file tree repository. It must be defined as an untyped tree because
* sbt.Keys is not available in this project. This is meant for internal use only, but must be
* public because it's a macro.
* @param c the macro context
* @return the tree expressing the default file tree repository.
*/
def dynamicInputs(c: blackbox.Context): c.Tree = {
import c.universe._
q"sbt.internal.Continuous.dynamicInputs.value: @sbtUnchecked"
}
}

@@ -12,6 +12,7 @@ import jline.console.history.{ FileHistory, MemoryHistory }
import java.io.{ File, FileDescriptor, FileInputStream, FilterInputStream, InputStream }
import complete.Parser
import jline.Terminal
import scala.concurrent.duration._
import scala.annotation.tailrec
@@ -119,7 +120,7 @@ private[sbt] object JLine {
// When calling this, ensure that enableEcho has been or will be called.
// TerminalFactory.get will initialize the terminal to disable echo.
private[sbt] def terminal = jline.TerminalFactory.get
private[sbt] def terminal: Terminal = jline.TerminalFactory.get
private def withTerminal[T](f: jline.Terminal => T): T =
synchronized {

@@ -9,7 +9,6 @@ package sbt
import java.io.{ File, PrintWriter }
import java.net.{ URI, URL, URLClassLoader }
import java.nio.file.{ Path => NioPath }
import java.util.Optional
import java.util.concurrent.{ Callable, TimeUnit }
@@ -28,16 +27,15 @@ import sbt.Project.{
}
import sbt.Scope.{ GlobalScope, ThisScope, fillTaskAxis }
import sbt.internal.CommandStrings.ExportStream
import sbt.internal.TransitiveGlobs._
import sbt.internal._
import sbt.internal.inc.JavaInterfaceUtil._
import sbt.internal.inc.{ ZincLmUtil, ZincUtil }
import sbt.internal.io.{ Source, WatchState }
import sbt.internal.librarymanagement.{ CustomHttp => _, _ }
import sbt.internal.librarymanagement.mavenint.{
PomExtraDependencyAttributes,
SbtPomExtraProperties
}
import sbt.internal.librarymanagement.{ CustomHttp => _, _ }
import sbt.internal.server.{
Definition,
LanguageServerProtocol,
@@ -66,9 +64,8 @@ import sbt.librarymanagement.CrossVersion.{ binarySbtVersion, binaryScalaVersion
import sbt.librarymanagement._
import sbt.librarymanagement.ivy._
import sbt.librarymanagement.syntax._
import sbt.nio.FileStamp
import sbt.nio.Keys._
import sbt.nio.file.{ FileTreeView, Glob }
import sbt.nio.file.FileTreeView
import sbt.nio.file.syntax._
import sbt.std.TaskExtra._
import sbt.testing.{ AnnotatedFingerprint, Framework, Runner, SubclassFingerprint }
@@ -87,7 +84,6 @@ import scala.xml.NodeSeq
// incremental compiler
import sbt.SlashSyntax0._
import sbt.internal.GlobLister._
import sbt.internal.inc.{
Analysis,
AnalyzingCompiler,
@@ -148,11 +144,16 @@ object Defaults extends BuildCommon {
private[sbt] lazy val globalCore: Seq[Setting[_]] = globalDefaults(
defaultTestTasks(test) ++ defaultTestTasks(testOnly) ++ defaultTestTasks(testQuick) ++ Seq(
excludeFilter :== HiddenFileFilter,
pathToFileStamp :== sbt.nio.FileStamp.hash,
classLoaderCache := ClassLoaderCache(4),
fileInputs :== Nil,
fileStamper :== sbt.nio.FileStamper.Hash,
watchForceTriggerOnAnyChange :== true,
watchTriggers :== Nil,
sbt.nio.Keys.fileAttributeMap := {
new java.util.HashMap[NioPath, (Option[FileStamp.Hash], Option[FileStamp.LastModified])]()
state.value
.get(sbt.nio.Keys.persistentFileAttributeMap)
.getOrElse(new sbt.nio.Keys.FileAttributeMap)
},
) ++ TaskRepository
.proxy(GlobalScope / classLoaderCache, ClassLoaderCache(4)) ++ globalIvyCore ++ globalJvmCore
@@ -283,7 +284,6 @@ object Defaults extends BuildCommon {
extraLoggers :== { _ =>
Nil
},
pollingGlobs :== Nil,
watchSources :== Nil, // Although this is deprecated, it can't be removed or it breaks += for legacy builds.
skip :== false,
taskTemporaryDirectory := { val dir = IO.createTemporaryDirectory; dir.deleteOnExit(); dir },
@@ -310,10 +310,6 @@ object Defaults extends BuildCommon {
parallelExecution :== true,
fileTreeView :== FileTreeView.default,
Continuous.dynamicInputs := Continuous.dynamicInputsImpl.value,
externalHooks := {
val repository = fileTreeView.value
compileOptions => Some(ExternalHooks(compileOptions, repository))
},
logBuffered :== false,
commands :== Nil,
showSuccess :== true,
@@ -354,9 +350,7 @@ object Defaults extends BuildCommon {
watchAntiEntropyRetentionPeriod :== Watch.defaultAntiEntropyRetentionPeriod,
watchLogLevel :== Level.Info,
watchOnEnter :== Watch.defaultOnEnter,
watchOnMetaBuildEvent :== Watch.ifChanged(Watch.Reload),
watchOnInputEvent :== Watch.trigger,
watchOnTriggerEvent :== Watch.trigger,
watchOnFileInputEvent :== Watch.trigger,
watchDeletionQuarantinePeriod :== Watch.defaultDeletionQuarantinePeriod,
watchService :== Watched.newWatchService,
watchStartMessage :== Watch.defaultStartWatch,
@@ -415,16 +409,23 @@ object Defaults extends BuildCommon {
)
},
unmanagedSources / fileInputs := {
val filter =
(includeFilter in unmanagedSources).value -- (excludeFilter in unmanagedSources).value
val include = (includeFilter in unmanagedSources).value
val filter = (excludeFilter in unmanagedSources).value match {
// Hidden files are already filtered out by the FileStamps method
case NothingFilter | HiddenFileFilter => include
case exclude => include -- exclude
}
val baseSources = if (sourcesInBase.value) baseDirectory.value * filter :: Nil else Nil
unmanagedSourceDirectories.value.map(_ ** filter) ++ baseSources
},
unmanagedSources := (unmanagedSources / fileInputs).value
.all(fileTreeView.value)
.map(FileStamp.stampedFile),
unmanagedSources := (unmanagedSources / fileStamps).value.map(_._1.toFile),
managedSourceDirectories := Seq(sourceManaged.value),
managedSources := generate(sourceGenerators).value,
managedSources := {
val stamper = sbt.nio.Keys.stamper.value
val res = generate(sourceGenerators).value
res.foreach(f => stamper(f.toPath))
res
},
sourceGenerators :== Nil,
sourceGenerators / fileOutputs := Seq(managedDirectory.value ** AllPassFilter),
sourceDirectories := Classpaths
@@ -441,12 +442,15 @@ object Defaults extends BuildCommon {
.concatSettings(unmanagedResourceDirectories, managedResourceDirectories)
.value,
unmanagedResources / fileInputs := {
val filter =
(includeFilter in unmanagedResources).value -- (excludeFilter in unmanagedResources).value
val include = (includeFilter in unmanagedResources).value
val filter = (excludeFilter in unmanagedResources).value match {
// Hidden files are already filtered out by the FileStamps method
case NothingFilter | HiddenFileFilter => include
case exclude => include -- exclude
}
unmanagedResourceDirectories.value.map(_ ** filter)
},
unmanagedResources :=
(unmanagedResources / fileInputs).value.all(fileTreeView.value).map(FileStamp.stampedFile),
unmanagedResources := (unmanagedResources / allPaths).value.map(_.toFile),
resourceGenerators :== Nil,
resourceGenerators += Def.task {
PluginDiscovery.writeDescriptors(discoveredSbtPlugins.value, resourceManaged.value)
@@ -607,7 +611,12 @@ object Defaults extends BuildCommon {
else ""
s"inc_compile$extra.zip"
},
compileIncSetup := compileIncSetupTask.value,
compileIncSetup := {
val base = compileIncSetupTask.value
val incOptions =
base.incrementalCompilerOptions.withExternalHooks(ExternalHooks.default.value)
base.withIncrementalCompilerOptions(incOptions)
},
console := consoleTask.value,
collectAnalyses := Definition.collectAnalysesTask.map(_ => ()).value,
consoleQuick := consoleQuickTask.value,
@@ -645,9 +654,7 @@ object Defaults extends BuildCommon {
watchTransitiveSources := watchTransitiveSourcesTask.value,
watch := watchSetting.value,
fileOutputs += target.value ** AllPassFilter,
transitiveGlobs := InputGraph.task.value,
transitiveInputs := InputGraph.inputsTask.value,
transitiveTriggers := InputGraph.triggersTask.value,
TransitiveDynamicInputs.transitiveDynamicInputs := InputGraph.task.value,
)
def generate(generators: SettingKey[Seq[Task[Seq[File]]]]): Initialize[Task[Seq[File]]] =
@@ -1240,7 +1247,10 @@ object Defaults extends BuildCommon {
exclude: ScopedTaskable[FileFilter]
): Initialize[Task[Seq[File]]] = Def.task {
val filter = include.toTask.value -- exclude.toTask.value
dirs.toTask.value.map(_ ** filter).all(fileTreeView.value).map(FileStamp.stampedFile)
val view = fileTreeView.value
view.list(dirs.toTask.value.map(_ ** filter)).collect {
case (p, a) if !a.isDirectory => p.toFile
}
}
def artifactPathSetting(art: SettingKey[Artifact]): Initialize[File] =
Def.setting {
@@ -1606,7 +1616,14 @@ object Defaults extends BuildCommon {
val contents = AnalysisContents.create(analysisResult.analysis(), analysisResult.setup())
store.set(contents)
}
analysisResult.analysis
val map = sbt.nio.Keys.fileAttributeMap.value
val analysis = analysisResult.analysis
import scala.collection.JavaConverters._
analysis.readStamps.getAllProductStamps.asScala.foreach {
case (f, s) =>
map.put(f.toPath, sbt.nio.FileStamp.LastModified(s.getLastModified.orElse(-1L)))
}
analysis
}
def compileIncrementalTask = Def.task {
// TODO - Should readAnalysis + saveAnalysis be scoped by the compile task too?
@@ -1615,13 +1632,14 @@ object Defaults extends BuildCommon {
private val incCompiler = ZincUtil.defaultIncrementalCompiler
private[this] def compileIncrementalTaskImpl(s: TaskStreams, ci: Inputs): CompileResult = {
lazy val x = s.text(ExportStream)
def onArgs(cs: Compilers) =
def onArgs(cs: Compilers) = {
cs.withScalac(
cs.scalac match {
case ac: AnalyzingCompiler => ac.onArgs(exported(x, "scalac"))
case x => x
}
)
}
// .withJavac(
// cs.javac.onArgs(exported(x, "javac"))
//)
@@ -1689,13 +1707,7 @@ object Defaults extends BuildCommon {
Inputs.of(
compilers.value,
options,
externalHooks
.value(options)
.map { hooks =>
val newOptions = setup.incrementalCompilerOptions.withExternalHooks(hooks)
setup.withIncrementalCompilerOptions(newOptions)
}
.getOrElse(setup),
setup,
previousCompile.value
)
}
@@ -2085,9 +2097,7 @@ object Classpaths {
shellPrompt := shellPromptFromState,
dynamicDependency := { (): Unit },
transitiveClasspathDependency := { (): Unit },
transitiveGlobs := { (Nil: Seq[Glob], Nil: Seq[Glob]) },
transitiveInputs := Nil,
transitiveTriggers := Nil,
TransitiveDynamicInputs.transitiveDynamicInputs :== Nil,
)
)
@@ -3049,31 +3059,21 @@ object Classpaths {
): Initialize[Task[Seq[(File, CompileAnalysis)]]] =
Def.taskDyn {
val dirs = productDirectories.value
def containsClassFile(fs: List[File]): Boolean =
(fs exists { dir =>
(dir ** DirectoryFilter).get exists { d =>
(d * "*.class").get.nonEmpty
}
})
val view = fileTreeView.value
def containsClassFile(): Boolean = view.list(dirs.map(_ ** "*.class")).nonEmpty
TrackLevel.intersection(track, exportToInternal.value) match {
case TrackLevel.TrackAlways =>
Def.task {
products.value map { (_, compile.value) }
}
case TrackLevel.TrackIfMissing if !containsClassFile(dirs.toList) =>
case TrackLevel.TrackIfMissing if !containsClassFile() =>
Def.task {
products.value map { (_, compile.value) }
}
case _ =>
Def.task {
val analysisOpt = previousCompile.value.analysis.toOption
dirs map { x =>
(
x,
if (analysisOpt.isDefined) analysisOpt.get
else Analysis.empty
)
}
val analysis = previousCompile.value.analysis.toOption.getOrElse(Analysis.empty)
dirs.map(_ -> analysis)
}
}
}
@@ -3406,8 +3406,9 @@ object Classpaths {
base: File,
filter: FileFilter,
excl: FileFilter
): Classpath =
): Classpath = {
(base * (filter -- excl) +++ (base / config.name).descendantsExcept(filter, excl)).classpath
}
@deprecated(
"The method only works for Scala 2, use the overloaded version to support both Scala 2 and Scala 3",
"1.1.5"

@@ -16,7 +16,7 @@ import sbt.Project.richInitializeTask
import sbt.Scope.Global
import sbt.internal.Aggregation.KeyValue
import sbt.internal.TaskName._
import sbt.internal.TransitiveGlobs._
import sbt.internal.TransitiveDynamicInputs._
import sbt.internal.util._
import sbt.internal.{ BuildStructure, GCUtil, Load, TaskProgress, TaskTimings, TaskTraceEvent, _ }
import sbt.librarymanagement.{ Resolver, UpdateReport }
@@ -572,33 +572,31 @@ object EvaluateTask {
stream
}).value
})
} else if (scoped.key == transitiveInputs.key) {
} else if (scoped.key == transitiveDynamicInputs.key) {
scoped.scope.task.toOption.toSeq.map { key =>
val updatedKey = ScopedKey(scoped.scope.copy(task = Zero), key)
transitiveInputs in scoped.scope := InputGraph.inputsTask(updatedKey).value
}
} else if (scoped.key == transitiveTriggers.key) {
scoped.scope.task.toOption.toSeq.map { key =>
val updatedKey = ScopedKey(scoped.scope.copy(task = Zero), key)
transitiveTriggers in scoped.scope := InputGraph.triggersTask(updatedKey).value
}
} else if (scoped.key == transitiveGlobs.key) {
scoped.scope.task.toOption.toSeq.map { key =>
val updatedKey = ScopedKey(scoped.scope.copy(task = Zero), key)
transitiveGlobs in scoped.scope := InputGraph.task(updatedKey).value
transitiveDynamicInputs in scoped.scope := InputGraph.task(updatedKey).value
}
} else if (scoped.key == dynamicDependency.key) {
(dynamicDependency in scoped.scope := { () }) :: Nil
(dynamicDependency in scoped.scope := {
()
}) :: Nil
} else if (scoped.key == transitiveClasspathDependency.key) {
(transitiveClasspathDependency in scoped.scope := { () }) :: Nil
} else if (scoped.key == sbt.nio.Keys.fileInputs.key) {
(sbt.nio.Keys.fileHashes in scoped.scope) := {
import GlobLister._
val map = sbt.nio.FileStamp.fileHashMap.value
(sbt.nio.Keys.fileInputs in scoped.scope).value.all(fileTreeView.value).collect {
case (p, a) if a.isRegularFile => p -> map.get(p)
}
}
(transitiveClasspathDependency in scoped.scope := {
()
}) :: Nil
} else if (scoped.key == sbt.nio.Keys.allFiles.key) {
sbt.nio.Settings.allFiles(scoped) :: Nil
} else if (scoped.key == sbt.nio.Keys.allPaths.key) {
sbt.nio.Settings.allPaths(scoped) :: Nil
} else if (scoped.key == sbt.nio.Keys.changedFiles.key) {
sbt.nio.Settings.changedFiles(scoped)
} else if (scoped.key == sbt.nio.Keys.modifiedFiles.key) {
sbt.nio.Settings.modifiedFiles(scoped)
} else if (scoped.key == sbt.nio.Keys.removedFiles.key) {
sbt.nio.Settings.removedFiles(scoped) :: Nil
} else if (scoped.key == sbt.nio.Keys.stamper.key) {
sbt.nio.Settings.stamper(scoped) :: Nil
} else {
Nil
}

@@ -30,7 +30,7 @@ import sbt.librarymanagement.Configurations.CompilerPlugin
import sbt.librarymanagement.LibraryManagementCodec._
import sbt.librarymanagement._
import sbt.librarymanagement.ivy.{ Credentials, IvyConfiguration, IvyPaths, UpdateOptions }
import sbt.nio.file.{ FileAttributes, FileTreeView, Glob }
import sbt.nio.file.{ FileAttributes, Glob }
import sbt.testing.Framework
import sbt.util.{ Level, Logger }
import xsbti.compile._
@@ -96,32 +96,26 @@ object Keys {
val analysis = AttributeKey[CompileAnalysis]("analysis", "Analysis of compilation, including dependencies and generated outputs.", DSetting)
val suppressSbtShellNotification = settingKey[Boolean]("""True to suppress the "Executing in batch mode.." message.""").withRank(CSetting)
val fileTreeView = taskKey[FileTreeView[(NioPath, FileAttributes)]]("A view of the file system.").withRank(DSetting)
val pollInterval = settingKey[FiniteDuration]("Interval between checks for modified sources by the continuous execution command.").withRank(BMinusSetting)
val pollingGlobs = settingKey[Seq[Glob]]("Directories that cannot be cached and must always be rescanned. Typically these will be NFS mounted or something similar.").withRank(DSetting)
val watchAntiEntropy = settingKey[FiniteDuration]("Duration for which the watch EventMonitor will ignore events for a file after that file has triggered a build.").withRank(BMinusSetting)
val watchAntiEntropyRetentionPeriod = settingKey[FiniteDuration]("Wall clock Duration for which a FileEventMonitor will store anti-entropy events. This prevents spurious triggers when a task takes a long time to run. Higher values will consume more memory but make spurious triggers less likely.").withRank(BMinusSetting)
val watchDeletionQuarantinePeriod = settingKey[FiniteDuration]("Period for which deletion events will be quarantined. This is to prevent spurious builds when a file is updated with a rename which manifests as a file deletion followed by a file creation. The higher this value is set, the longer the delay will be between a file deletion and a build trigger but the less likely it is for a spurious trigger.").withRank(DSetting)
val watchLogLevel = settingKey[sbt.util.Level.Value]("Transform the default logger in continuous builds.").withRank(DSetting)
val watchInputHandler = settingKey[InputStream => Watch.Action]("Function that is periodically invoked to determine if the continuous build should be stopped or if a build should be triggered. It will usually read from stdin to respond to user commands. This is only invoked if watchInputStream is set.").withRank(DSetting)
val watchForceTriggerOnAnyChange = settingKey[Boolean]("Force the watch process to rerun the current task(s) if any relevant source change is detected regardless of whether or not the underlying file has actually changed.").withRank(DSetting)
val watchInputStream = taskKey[InputStream]("The input stream to read for user input events. This will usually be System.in").withRank(DSetting)
val watchInputParser = settingKey[Parser[Watch.Action]]("A parser of user input that can be used to trigger or exit a continuous build").withRank(DSetting)
val watchOnEnter = settingKey[() => Unit]("Function to run prior to beginning a continuous build. This will run before the continuous task(s) is(are) first evaluated.").withRank(DSetting)
val watchOnExit = settingKey[() => Unit]("Function to run upon exit of a continuous build. It can be used to cleanup resources used during the watch.").withRank(DSetting)
val watchOnInputEvent = settingKey[(Int, Watch.Event) => Watch.Action]("Callback to invoke if an event is triggered in a continuous build by one of the transitive inputs. This is only invoked if watchOnEvent is not explicitly set.").withRank(DSetting)
val watchOnEvent = settingKey[Continuous.Arguments => Watch.Event => Watch.Action]("Determines how to handle a file event. The Seq[Glob] contains all of the transitive inputs for the task(s) being run by the continuous build.").withRank(DSetting)
val watchOnMetaBuildEvent = settingKey[(Int, Watch.Event) => Watch.Action]("Callback to invoke if an event is triggered in a continuous build by one of the meta build triggers.").withRank(DSetting)
val watchOnFileInputEvent = settingKey[(Int, Watch.Event) => Watch.Action]("Callback to invoke if an event is triggered in a continuous build by one of the transitive inputs. This is only invoked if watchOnEvent is not explicitly set.").withRank(DSetting)
val watchOnTermination = settingKey[(Watch.Action, String, Int, State) => State]("Transforms the state upon completion of a watch. The String argument is the command that was run during the watch. The Int parameter specifies how many times the command was run during the watch.").withRank(DSetting)
val watchOnTrigger = settingKey[Continuous.Arguments => Watch.Event => Unit]("Callback to invoke when a continuous build triggers. The first parameter is the number of previous watch task invocations. The second parameter is the Event that triggered this build").withRank(DSetting)
val watchOnTriggerEvent = settingKey[(Int, Watch.Event) => Watch.Action]("Callback to invoke if an event is triggered in a continuous build by one of the transitive triggers. This is only invoked if watchOnEvent is not explicitly set.").withRank(DSetting)
val watchOnIteration = settingKey[Int => Watch.Action]("Function that is invoked before waiting for file system events or user input events. This is only invoked if watchOnStart is not explicitly set.").withRank(DSetting)
val watchOnStart = settingKey[Continuous.Arguments => () => Watch.Action]("Function is invoked before waiting for file system or input events. The returned Action is used to either trigger the build, terminate the watch or wait for events.").withRank(DSetting)
val watchOnIteration = settingKey[Int => Watch.Action]("Function that is invoked before waiting for file system events or user input events.").withRank(DSetting)
val watchService = settingKey[() => WatchService]("Service to use to monitor file system changes.").withRank(BMinusSetting).withRank(DSetting)
val watchStartMessage = settingKey[(Int, String, Seq[String]) => Option[String]]("The message to show when triggered execution waits for sources to change. The parameters are the current watch iteration count, the current project name and the tasks that are being run with each build.").withRank(DSetting)
// The watchTasks key should really be named watch, but that is already taken by the deprecated watch key. I'd be surprised if there are any plugins that use it so I think we should consider breaking binary compatibility to rename this task.
val watchTasks = InputKey[StateTransform]("watch", "Watch a task (or multiple tasks) and rebuild when its file inputs change or user input is received. The semantics are more or less the same as the `~` command except that it cannot transform the state on exit. This means that it cannot be used to reload the build.").withRank(DSetting)
val watchTrackMetaBuild = settingKey[Boolean]("Toggles whether or not changing the build files (e.g. **/*.sbt, project/**/(*.scala | *.java)) should automatically trigger a project reload").withRank(DSetting)
val watchTriggeredMessage = settingKey[(Int, Watch.Event, Seq[String]) => Option[String]]("The message to show before triggered execution executes an action after sources change. The parameters are the path that triggered the build and the current watch iteration count.").withRank(DSetting)
val watchTriggeredMessage = settingKey[(Int, NioPath, Seq[String]) => Option[String]]("The message to show before triggered execution executes an action after sources change. The parameters are the path that triggered the build and the current watch iteration count.").withRank(DSetting)
// Deprecated watch apis
@deprecated("This is no longer used for continuous execution", "1.3.0")
@@ -169,9 +163,7 @@ object Keys {
// Output paths
val classDirectory = settingKey[File]("Directory for compiled classes and copied resources.").withRank(AMinusSetting)
@deprecated("Clean is now implemented using globs.", "1.3.0")
val cleanFiles = taskKey[Seq[File]]("The files to recursively delete during a clean.").withRank(BSetting)
@deprecated("Clean is now implemented using globs. Prefer the cleanKeepGlobs task", "1.3.0")
val cleanKeepFiles = settingKey[Seq[File]]("Files or directories to keep during a clean. Must be direct children of target.").withRank(CSetting)
val cleanKeepGlobs = settingKey[Seq[Glob]]("Globs to keep during a clean. Must be direct children of target.").withRank(CSetting)
val crossPaths = settingKey[Boolean]("If true, enables cross paths, which distinguish input and output directories for cross-building.").withRank(ASetting)
@@ -239,7 +231,6 @@ object Keys {
val copyResources = taskKey[Seq[(File, File)]]("Copies resources to the output directory.").withRank(AMinusTask)
val aggregate = settingKey[Boolean]("Configures task aggregation.").withRank(BMinusSetting)
val sourcePositionMappers = taskKey[Seq[xsbti.Position => Option[xsbti.Position]]]("Maps positions in generated source files to the original source it was generated from").withRank(DTask)
val externalHooks = taskKey[CompileOptions => Option[ExternalHooks]]("External hooks for modifying the internal behavior of the incremental compiler.").withRank(BMinusSetting)
// package keys
val packageBin = taskKey[File]("Produces a main artifact, such as a binary jar.").withRank(ATask)

@@ -20,13 +20,11 @@ import sbt.internal.Aggregation.AnyKeys
import sbt.internal.CommandStrings.BootCommand
import sbt.internal._
import sbt.internal.inc.ScalaInstance
import sbt.internal.nio.FileTreeRepository
import sbt.internal.util.Types.{ const, idFun }
import sbt.internal.util._
import sbt.internal.util.complete.Parser
import sbt.io._
import sbt.io.syntax._
import sbt.nio.file.FileAttributes
import sbt.util.{ Level, Logger, Show }
import xsbti.compile.CompilerCache
import xsbti.{ AppMain, AppProvider, ComponentProvider, ScalaProvider }
@@ -894,28 +892,16 @@ object BuiltinCommands {
}
s.put(Keys.stateCompilerCache, cache)
}
private[sbt] val rawGlobalFileTreeRepository = AttributeKey[FileTreeRepository[FileAttributes]](
"raw-global-file-tree-repository",
"Provides a view into the file system that may or may not cache the tree in memory",
1000
)
private[sbt] def registerGlobalCaches(s: State): State =
try {
val cleanedUp = new AtomicBoolean(false)
def cleanup(): Unit = {
s.get(rawGlobalFileTreeRepository).foreach(_.close())
s.get(Keys.taskRepository).foreach(_.close())
()
}
cleanup()
val fileTreeRepository = FileTreeRepository.default
val fileCache = System.getProperty("sbt.io.filecache", "validate")
val newState = s
.addExitHook(if (cleanedUp.compareAndSet(false, true)) cleanup())
s.addExitHook(if (cleanedUp.compareAndSet(false, true)) cleanup())
.put(Keys.taskRepository, new TaskRepository.Repr)
.put(rawGlobalFileTreeRepository, fileTreeRepository)
if (fileCache == "false" || (fileCache != "true" && Util.isWindows)) newState
else newState.put(Keys.globalFileTreeRepository, FileManagement.copy(fileTreeRepository))
} catch {
case NonFatal(_) => s
}

@@ -6,16 +6,19 @@
*/
package sbt
import java.io.InputStream
import java.nio.file.Path
import java.time.format.{ DateTimeFormatter, TextStyle }
import java.time.{ Instant, ZoneId, ZonedDateTime }
import java.util.Locale
import java.util.concurrent.TimeUnit
import sbt.BasicCommandStrings.ContinuousExecutePrefix
import sbt.internal.LabeledFunctions._
import sbt.internal.nio.FileEvent
import sbt.internal.util.Util
import sbt.internal.util.complete.Parser
import sbt.internal.util.complete.Parser._
import sbt.internal.util.{ JLine, Util }
import sbt.nio.file.FileAttributes
import sbt.util.{ Level, Logger }
@@ -24,67 +27,94 @@ import scala.concurrent.duration._
import scala.util.control.NonFatal
object Watch {
/**
* Represents a file event that has been detected during a continuous build.
*/
sealed trait Event {
/**
* The path that triggered the event.
*
* @return the path that triggered the event.
*/
def path: Path
def previousAttributes: Option[FileAttributes]
def attributes: Option[FileAttributes]
/**
* The time specified in milliseconds from the epoch at which this event occurred.
*
* @return the time at which the event occurred.
*/
def occurredAt: FiniteDuration
}
private[this] val formatter = DateTimeFormatter.ofPattern("yyyy-MMM-dd HH:mm:ss.SSS")
private[this] val timeZone = ZoneId.systemDefault
private[this] val timeZoneName = timeZone.getDisplayName(TextStyle.SHORT, Locale.getDefault)
private[this] implicit class DurationOps(val d: Duration) extends AnyVal {
def finite: FiniteDuration = d match {
case f: FiniteDuration => f
case _ => new FiniteDuration(Long.MaxValue, TimeUnit.MILLISECONDS)
}
def toEpochString: String = {
val zdt = ZonedDateTime.ofInstant(Instant.ofEpochMilli(d.toMillis), timeZone)
s"${formatter.format(zdt)} $timeZoneName"
}
}
private[sbt] implicit class EventOps(val event: Event) extends AnyVal {
def toEpochString: String = event.occurredAt.toEpochString
}
private[sbt] object Event {
private implicit class DurationOps(val d: Duration) extends AnyVal {
def finite: FiniteDuration = d match {
case f: FiniteDuration => f
case _ => new FiniteDuration(Long.MaxValue, TimeUnit.MILLISECONDS)
trait Impl { self: Event =>
private val name = self.getClass.getSimpleName
override def equals(o: Any): Boolean = o match {
case that: Event => this.path == that.path
case _ => false
}
override def hashCode: Int = path.hashCode
override def toString: String = s"$name($path)"
}
def fromIO(fileEvent: FileEvent[FileAttributes]): Watch.Event = fileEvent match {
case c @ FileEvent.Creation(p, a) => new Watch.Creation(p, a, c.occurredAt.value.finite)
case d @ FileEvent.Deletion(p, a) => new Watch.Deletion(p, a, d.occurredAt.value.finite)
case u @ FileEvent.Update(p, prev, attrs) =>
new Watch.Update(p, prev, attrs, u.occurredAt.value.finite)
case c @ FileEvent.Creation(p, _) => new Watch.Creation(p, c.occurredAt.value.finite)
case d @ FileEvent.Deletion(p, _) => new Watch.Deletion(p, d.occurredAt.value.finite)
case u @ FileEvent.Update(p, _, _) =>
new Watch.Update(p, u.occurredAt.value.finite)
}
}
final class Deletion private[sbt] (
override val path: Path,
private[this] val attrs: FileAttributes,
override val occurredAt: FiniteDuration
) extends Event {
override def previousAttributes: Option[FileAttributes] = Some(attrs)
override def attributes: Option[FileAttributes] = None
}
object Deletion {
def unapply(deletion: Deletion): Option[(Path, FileAttributes)] =
deletion.previousAttributes.map(a => deletion.path -> a)
}
final class Creation private[sbt] (
override val path: Path,
private[this] val attrs: FileAttributes,
override val occurredAt: FiniteDuration
) extends Event {
override def attributes: Option[FileAttributes] = Some(attrs)
override def previousAttributes: Option[FileAttributes] = None
) extends Event
with Event.Impl {
override def toString: String = s"Creation($path, ${occurredAt.toEpochString})"
}
object Creation {
def unapply(creation: Creation): Option[(Path, FileAttributes)] =
creation.attributes.map(a => creation.path -> a)
def apply(event: FileEvent[FileAttributes]): Creation =
new Creation(event.path, event.occurredAt.value.finite)
def unapply(creation: Creation): Option[Path] = Some(creation.path)
}
final class Deletion private[sbt] (
override val path: Path,
override val occurredAt: FiniteDuration
) extends Event
with Event.Impl {
override def toString: String = s"Deletion($path, ${occurredAt.toEpochString})"
}
object Deletion {
def apply(event: FileEvent[FileAttributes]): Deletion =
new Deletion(event.path, event.occurredAt.value.finite)
def unapply(deletion: Deletion): Option[Path] = Some(deletion.path)
}
final class Update private[sbt] (
override val path: Path,
private[this] val prevAttrs: FileAttributes,
private[this] val attrs: FileAttributes,
override val occurredAt: FiniteDuration
) extends Event {
override def previousAttributes: Option[FileAttributes] = Some(prevAttrs)
override def attributes: Option[FileAttributes] = Some(attrs)
) extends Event
with Event.Impl {
override def toString: String = s"Update($path, ${occurredAt.toEpochString})"
}
object Update {
def unapply(update: Update): Option[(Path, FileAttributes, FileAttributes)] =
update.previousAttributes
.zip(update.attributes)
.map {
case (previous, current) => (update.path, previous, current)
}
.headOption
def apply(event: FileEvent[FileAttributes]): Update =
new Update(event.path, event.occurredAt.value.finite)
def unapply(update: Update): Option[Path] = Some(update.path)
}
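The `Event.Impl` mixin above makes event equality a function of the path alone, ignoring the event kind. A self-contained sketch of that semantics, using stand-in classes (not the sbt ones):

```scala
import java.nio.file.{ Path, Paths }

// Stand-ins mirroring Watch.Event / Event.Impl: equality and hashing use only the path.
sealed trait Event { def path: Path }
trait Impl { self: Event =>
  private val name = self.getClass.getSimpleName
  override def equals(o: Any): Boolean = o match {
    case that: Event => this.path == that.path
    case _           => false
  }
  override def hashCode: Int = path.hashCode
  override def toString: String = s"$name($path)"
}
final class Creation(val path: Path) extends Event with Impl
final class Deletion(val path: Path) extends Event with Impl

val p = Paths.get("src/A.scala")
assert(new Creation(p) == new Deletion(p)) // same path => equal, regardless of kind
assert(new Creation(p).toString == s"Creation($p)")
```

Path-only equality is what lets the watch loop de-duplicate events for the same file across monitors, at the cost of treating, say, a creation and a deletion of the same path as equal.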
/**
@@ -299,18 +329,6 @@ object Watch {
def toExec: Exec = Exec(s, None)
}
private[sbt] def withCharBufferedStdIn[R](f: InputStream => R): R =
if (!Util.isWindows) JLine.usingTerminal { terminal =>
terminal.init()
val in = terminal.wrapInIfNeeded(System.in)
try {
f(in)
} finally {
terminal.reset()
}
} else
f(System.in)
/**
* A constant function that returns [[Trigger]].
*/
@@ -318,14 +336,6 @@ object Watch {
Trigger
}.label("Watched.trigger")
def ifChanged(action: Action): (Int, Event) => Watch.Action =
(_: Int, event: Event) =>
event match {
case Update(_, previousAttributes, attributes) if previousAttributes != attributes => action
case _: Creation | _: Deletion => action
case _ => Ignore
}
/**
* The minimum delay between build triggers for the same file. If the file is detected
* to have changed within this period from the last build trigger, the event will be discarded.
@@ -432,14 +442,14 @@ object Watch {
* `Keys.watchTriggeredMessage := Watched.defaultOnTriggerMessage`, then nothing is logged when
* a build is triggered.
*/
final val defaultOnTriggerMessage: (Int, Event, Seq[String]) => Option[String] =
((_: Int, e: Event, commands: Seq[String]) => {
val msg = s"Build triggered by ${e.path}. " +
final val defaultOnTriggerMessage: (Int, Path, Seq[String]) => Option[String] =
((_: Int, path: Path, commands: Seq[String]) => {
val msg = s"Build triggered by $path. " +
s"Running ${commands.mkString("'", "; ", "'")}."
Some(msg)
}).label("Watched.defaultOnTriggerMessage")
final val noTriggerMessage: (Int, Event, Seq[String]) => Option[String] =
final val noTriggerMessage: (Int, Path, Seq[String]) => Option[String] =
(_, _, _) => None
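Because the trigger-message callbacks now receive a plain `Path` instead of an `Event`, they are easy to exercise in isolation. A stand-in with the same `(Int, Path, Seq[String])` shape as `defaultOnTriggerMessage` (not the sbt value itself):

```scala
import java.nio.file.{ Path, Paths }

// Stand-in trigger message: same shape and formatting as defaultOnTriggerMessage above.
val onTrigger: (Int, Path, Seq[String]) => Option[String] =
  (_, path, commands) =>
    Some(s"Build triggered by $path. Running ${commands.mkString("'", "; ", "'")}.")

assert(
  onTrigger(1, Paths.get("A.scala"), Seq("compile", "test"))
    .contains("Build triggered by A.scala. Running 'compile; test'.")
)
```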
/**


@@ -8,8 +8,7 @@
package sbt
package internal
import java.io.{ ByteArrayInputStream, InputStream, File => _, _ }
import java.nio.file.Path
import java.io.{ ByteArrayInputStream, InputStream, File => _ }
import java.util.concurrent.atomic.AtomicInteger
import sbt.BasicCommandStrings.{
@@ -21,6 +20,7 @@ import sbt.BasicCommandStrings.{
import sbt.BasicCommands.otherCommandParser
import sbt.Def._
import sbt.Scope.Global
import sbt.Watch.{ Creation, Deletion, Update }
import sbt.internal.LabeledFunctions._
import sbt.internal.io.WatchState
import sbt.internal.nio._
@@ -28,10 +28,12 @@ import sbt.internal.util.complete.Parser._
import sbt.internal.util.complete.{ Parser, Parsers }
import sbt.internal.util.{ AttributeKey, JLine, Util }
import sbt.nio.Keys.fileInputs
import sbt.nio.file.{ FileAttributes, Glob }
import sbt.nio.file.FileAttributes
import sbt.nio.{ FileStamp, FileStamper }
import sbt.util.{ Level, _ }
import scala.annotation.tailrec
import scala.collection.mutable
import scala.concurrent.duration.FiniteDuration.FiniteDurationIsOrdered
import scala.concurrent.duration._
import scala.util.Try
@@ -62,20 +64,19 @@ import scala.util.Try
* the deprecated apis should no longer be supported.
*
*/
object Continuous extends DeprecatedContinuous {
private[sbt] object Continuous extends DeprecatedContinuous {
private type Event = FileEvent[FileAttributes]
/**
* Provides the dynamic inputs to the continuous build callbacks that cannot be stored as
* settings. This wouldn't need to exist if there was a notion of a lazy setting in sbt.
*
* @param logger the Logger
* @param inputs the transitive task inputs
* @param triggers the transitive task triggers
*/
final class Arguments private[Continuous] (
private[sbt] final class Arguments private[Continuous] (
val logger: Logger,
val inputs: Seq[Glob],
val triggers: Seq[Glob]
val inputs: Seq[DynamicInput]
)
/**
@@ -96,10 +97,11 @@ object Continuous extends DeprecatedContinuous {
/**
* Create a function from InputStream => [[Watch.Action]] from a [[Parser]]. This is intended
* to be used to set the watchInputHandler setting for a task.
*
* @param parser the parser
* @return the function
*/
def defaultInputHandler(parser: Parser[Watch.Action]): InputStream => Watch.Action = {
private def defaultInputHandler(parser: Parser[Watch.Action]): InputStream => Watch.Action = {
val builder = new StringBuilder
val any = matched(Parsers.any.*)
val fullParser = any ~> parser ~ any
@@ -111,6 +113,7 @@ object Continuous extends DeprecatedContinuous {
* Implements continuous execution. It works by first parsing the command and generating a task to
* run with each build. It can run multiple commands that are separated by ";" in the command
* input. If any of these commands are invalid, the watch will immediately exit.
*
* @return a Command that can be used by sbt to implement continuous builds.
*/
private[sbt] def continuous: Command =
@@ -122,6 +125,7 @@ object Continuous extends DeprecatedContinuous {
/**
* The task implementation is quite similar to the command implementation. The tricky part is that
* we have to modify the Task.info to apply the state transformation after the task completes.
*
* @return the [[InputTask]]
*/
private[sbt] def continuousTask: Def.Initialize[InputTask[StateTransform]] =
@@ -138,14 +142,13 @@ object Continuous extends DeprecatedContinuous {
"Receives a copy of all of the bytes from System.in.",
10000
)
val dynamicInputs = taskKey[FileTree.DynamicInputs](
private[sbt] val dynamicInputs = taskKey[Option[mutable.Set[DynamicInput]]](
"The input globs found during task evaluation that are used in watch."
)
def dynamicInputsImpl: Def.Initialize[Task[FileTree.DynamicInputs]] = Def.task {
Keys.state.value.get(DynamicInputs).getOrElse(FileTree.DynamicInputs.none)
}
private[sbt] def dynamicInputsImpl: Def.Initialize[Task[Option[mutable.Set[DynamicInput]]]] =
Def.task(Keys.state.value.get(DynamicInputs))
private[sbt] val DynamicInputs =
AttributeKey[FileTree.DynamicInputs](
AttributeKey[mutable.Set[DynamicInput]](
"dynamic-inputs",
"Stores the inputs (dynamic and regular) for a task",
10000
@@ -182,24 +185,26 @@ object Continuous extends DeprecatedContinuous {
)(implicit extracted: Extracted, logger: Logger): Config = {
// Extract all of the globs that we will monitor during the continuous build.
val (inputs, triggers) = {
val inputs = {
val configs = scopedKey.get(Keys.internalDependencyConfigurations).getOrElse(Nil)
val args = new InputGraph.Arguments(scopedKey, extracted, compiledMap, logger, configs, state)
InputGraph.transitiveGlobs(args)
} match {
case (i: Seq[Glob], t: Seq[Glob]) => (i.distinct.sorted, t.distinct.sorted)
InputGraph.transitiveDynamicInputs(args)
}
val repository = getRepository(state)
val registeringSet = state.get(DynamicInputs).get
registeringSet.value.foreach(_ ++= inputs)
(inputs ++ triggers).foreach(repository.register(_: Glob))
val dynamicInputs = state
.get(DynamicInputs)
.getOrElse {
val msg = "Uninitialized dynamic inputs in continuous build (should be unreachable!)"
throw new IllegalStateException(msg)
}
dynamicInputs ++= inputs
logger.debug(s"[watch] [${scopedKey.show}] Found inputs: ${inputs.map(_.glob).mkString(",")}")
inputs.foreach(i => repository.register(i.glob))
val watchSettings = new WatchSettings(scopedKey)
new Config(
scopedKey,
repository,
() => registeringSet.value.fold(Nil: Seq[Glob])(_.toSeq).sorted,
triggers,
() => dynamicInputs.toSeq.sorted,
watchSettings
)
}
@@ -235,7 +240,7 @@ object Continuous extends DeprecatedContinuous {
* if they are not visible in the input graph due to the use of Def.taskDyn.
*/
def makeTask(cmd: String): (String, State, () => Boolean) = {
val newState = s.put(DynamicInputs, FileTree.DynamicInputs.empty)
val newState = s.put(DynamicInputs, mutable.Set.empty[DynamicInput])
val task = Parser
.parse(cmd, Command.combine(newState.definedCommands)(newState))
.getOrElse(
@@ -279,16 +284,13 @@ object Continuous extends DeprecatedContinuous {
f(commands, s, valid, invalid)
}
private[this] def withCharBufferedStdIn[R](f: InputStream => R): R = {
val unwrapped = new FileInputStream(FileDescriptor.in) {
override def close(): Unit = {
getChannel.close() // We don't want to close the System.in file descriptor
}
}
val in = if (Util.isWindows) unwrapped else JLine.terminal.wrapInIfNeeded(unwrapped)
try f(in)
finally in.close()
}
private[this] def withCharBufferedStdIn[R](f: InputStream => R): R =
if (!Util.isWindows) {
val terminal = JLine.terminal
terminal.init()
terminal.setEchoEnabled(true)
f(terminal.wrapInIfNeeded(System.in))
} else f(System.in)
private[sbt] def runToTermination(
state: State,
@@ -298,24 +300,19 @@ object Continuous extends DeprecatedContinuous {
): State = withCharBufferedStdIn { in =>
val duped = new DupedInputStream(in)
implicit val extracted: Extracted = Project.extract(state)
val (stateWithRepo, repo) = state.get(Keys.globalFileTreeRepository) match {
case Some(r) => (state, r)
case _ =>
val repo = if ("polling" == System.getProperty("sbt.watch.mode")) {
val service =
new PollingWatchService(extracted.getOpt(Keys.pollInterval).getOrElse(500.millis))
FileTreeRepository
.legacy((_: Any) => {}, service)
} else {
state
.get(BuiltinCommands.rawGlobalFileTreeRepository)
.map(FileManagement.copy)
.getOrElse(FileTreeRepository.default)
}
(state.put(Keys.globalFileTreeRepository, repo), repo)
val repo = if ("polling" == System.getProperty("sbt.watch.mode")) {
val service =
new PollingWatchService(extracted.getOpt(Keys.pollInterval).getOrElse(500.millis))
FileTreeRepository.legacy((_: Any) => {}, service)
} else {
FileTreeRepository.default
}
try {
setup(stateWithRepo.put(DupedSystemIn, duped), command) { (commands, s, valid, invalid) =>
val stateWithRepo = state
.put(Keys.globalFileTreeRepository, repo)
.put(sbt.nio.Keys.persistentFileAttributeMap, new sbt.nio.Keys.FileAttributeMap)
.put(DupedSystemIn, duped)
setup(stateWithRepo, command) { (commands, s, valid, invalid) =>
EvaluateTask.withStreams(extracted.structure, s)(_.use(Keys.streams in Global) { streams =>
implicit val logger: Logger = streams.log
if (invalid.isEmpty) {
@@ -337,7 +334,6 @@ object Continuous extends DeprecatedContinuous {
val terminationAction = Watch(task, callbacks.onStart, callbacks.nextEvent)
callbacks.onTermination(terminationAction, command, currentCount.get(), state)
} finally {
configs.foreach(_.repository.close())
callbacks.onExit()
}
} else {
@@ -394,7 +390,7 @@ object Continuous extends DeprecatedContinuous {
* the aggregated callback will select the minimum [[Watch.Action]] returned where the ordering
* is such that the highest priority [[Watch.Action]]s have the lowest values. Finally, to
* handle user input, we read from the provided input stream and buffer the result. Each
* task's input parser is then applied to the buffered result and, again, we return the mimimum
* task's input parser is then applied to the buffered result and, again, we return the minimum
* [[Watch.Action]] returned by the parsers (when the parsers fail, they just return
* [[Watch.Ignore]], which is the lowest priority [[Watch.Action]]).
*
@@ -455,9 +451,9 @@ object Continuous extends DeprecatedContinuous {
logger: Logger,
count: AtomicInteger
): () => Watch.Action = {
val f = configs.map { params =>
val ws = params.watchSettings
ws.onStart.map(_.apply(params.arguments(logger))).getOrElse { () =>
val f: () => Seq[Watch.Action] = () => {
configs.map { params =>
val ws = params.watchSettings
ws.onIteration.map(_(count.get)).getOrElse {
if (configs.size == 1) { // Only allow custom start messages for single tasks
ws.startMessage match {
@@ -472,18 +468,13 @@ object Continuous extends DeprecatedContinuous {
}
}
() => {
val res = f.view.map(_()).min
val res = f().min
// Print the default watch message if there are multiple tasks
if (configs.size > 1)
Watch
.defaultStartWatch(count.get(), project, commands)
.foreach(logger.info(_))
Watch.defaultStartWatch(count.get(), project, commands).foreach(logger.info(_))
res
}
}
private implicit class TraversableGlobOps(val t: Traversable[Glob]) extends AnyVal {
def toFilter: Path => Boolean = p => t.exists(_.matches(p))
}
private def getFileEvents(
configs: Seq[Config],
logger: Logger,
@@ -491,64 +482,61 @@ object Continuous extends DeprecatedContinuous {
count: AtomicInteger,
commands: Seq[String]
)(implicit extracted: Extracted): (() => Option[(Watch.Event, Watch.Action)], () => Unit) = {
val attributeMap = state.get(sbt.nio.Keys.persistentFileAttributeMap).get
val trackMetaBuild = configs.forall(_.watchSettings.trackMetaBuild)
val buildGlobs =
if (trackMetaBuild) extracted.getOpt(fileInputs in Keys.settingsData).getOrElse(Nil)
else Nil
val buildFilter: Path => Boolean = buildGlobs.toFilter
val defaultTrigger = if (Util.isWindows) Watch.ifChanged(Watch.Trigger) else Watch.trigger
val retentionPeriod = configs.map(_.watchSettings.antiEntropyRetentionPeriod).max
val quarantinePeriod = configs.map(_.watchSettings.deletionQuarantinePeriod).max
val onEvent: Event => (Watch.Event, Watch.Action) = {
val f = configs.map { params =>
val ws = params.watchSettings
val oe = ws.onEvent
.map(_.apply(params.arguments(logger)))
.getOrElse {
val onInputEvent = ws.onInputEvent.getOrElse(defaultTrigger)
val onTriggerEvent = ws.onTriggerEvent.getOrElse(defaultTrigger)
val onMetaBuildEvent = ws.onMetaBuildEvent.getOrElse(Watch.ifChanged(Watch.Reload))
val triggerFilter = params.triggers.toFilter
val excludedBuildFilter = buildFilter
event: Watch.Event =>
val inputFilter = params.inputs().toFilter
val c = count.get()
Seq[Watch.Action](
if (inputFilter(event.path)) onInputEvent(c, event) else Watch.Ignore,
if (triggerFilter(event.path)) onTriggerEvent(c, event) else Watch.Ignore,
if (excludedBuildFilter(event.path)) onMetaBuildEvent(c, event)
else Watch.Ignore
).min
val onEvent: Event => Seq[(Watch.Event, Watch.Action)] = event => {
val path = event.path
def watchEvent(stamper: FileStamper, forceTrigger: Boolean): Option[Watch.Event] = {
val stamp = FileStamp(path, stamper)
if (!event.exists) {
attributeMap.remove(event.path) match {
case null => None
case _ => Some(Deletion(event))
}
event: Event =>
val watchEvent = Watch.Event.fromIO(event)
watchEvent -> oe(watchEvent)
} else {
import sbt.internal.inc.Stamp.equivStamp
attributeMap.put(path, stamp) match {
case null => Some(Creation(event))
case s =>
if (forceTrigger || !equivStamp.equiv(s.stamp, stamp.stamp))
Some(Update(event))
else None
}
}
}
if (buildGlobs.exists(_.matches(path))) {
watchEvent(FileStamper.Hash, forceTrigger = false).map(e => e -> Watch.Reload).toSeq
} else {
configs
.flatMap { config =>
config
.inputs()
.collectFirst {
case d if d.glob.matches(path) => (d.forceTrigger, true, d.fileStamper)
}
.flatMap {
case (forceTrigger, accepted, stamper) =>
if (accepted) {
watchEvent(stamper, forceTrigger).flatMap { e =>
val action = config.watchSettings.onFileInputEvent(count.get(), e)
if (action != Watch.Ignore) Some(e -> action) else None
}
} else None
}
} match {
case events if events.isEmpty => Nil
case events => events.minBy(_._2) :: Nil
}
}
event: Event => f.view.map(_.apply(event)).minBy(_._2)
}
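The classification in `onEvent` above hinges on `java.util.HashMap.put` and `remove` returning the previous mapping (or `null` when absent). A minimal sketch of that idiom, with `String` standing in for the stamp type:

```scala
import java.nio.file.Paths

// null from put      => previously untracked path: a Creation
// non-null from put  => previous stamp returned; if it differs: an Update
// non-null remove    => tracked path deleted: a Deletion
// null from remove   => untracked path deleted: no event
val stamps = new java.util.HashMap[java.nio.file.Path, String]
val p = Paths.get("A.scala")
assert(stamps.put(p, "hash1") == null)
assert(stamps.put(p, "hash2") == "hash1")
assert(stamps.remove(p) == "hash2")
assert(stamps.remove(p) == null)
```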
val monitor: FileEventMonitor[Event] = new FileEventMonitor[Event] {
/**
* Create a filtered monitor that only accepts globs that have been registered for the
* task at hand.
* @param monitor the file event monitor to filter
* @param globs the globs to accept. This must be a function because we want to be able
* to accept globs that are added dynamically as part of task evaluation.
* @return the filtered FileEventMonitor.
*/
private def filter(
monitor: FileEventMonitor[Event],
globs: () => Seq[Glob]
): FileEventMonitor[Event] = {
new FileEventMonitor[Event] {
override def poll(
duration: Duration,
filter: Event => Boolean
): Seq[Event] = monitor.poll(duration, filter).filter(e => globs().toFilter(e.path))
override def close(): Unit = monitor.close()
}
}
private implicit class WatchLogger(val l: Logger) extends sbt.internal.nio.WatchLogger {
override def debug(msg: Any): Unit = l.debug(msg.toString)
}
@@ -557,31 +545,24 @@ object Continuous extends DeprecatedContinuous {
configs.map { config =>
// Create a logger with a scoped key prefix so that we can tell from which
// monitor events occurred.
val l = logger.withPrefix(config.key.show)
val monitor: FileEventMonitor[Event] =
FileEventMonitor.antiEntropy(
config.repository,
config.watchSettings.antiEntropy,
l,
config.watchSettings.deletionQuarantinePeriod,
config.watchSettings.antiEntropyRetentionPeriod
)
val allGlobs: () => Seq[Glob] =
() => (config.inputs() ++ config.triggers).distinct.sorted
filter(monitor, allGlobs)
FileEventMonitor.antiEntropy(
getRepository(state),
config.watchSettings.antiEntropy,
logger.withPrefix(config.key.show),
config.watchSettings.deletionQuarantinePeriod,
config.watchSettings.antiEntropyRetentionPeriod
)
} ++ (if (trackMetaBuild) {
val l = logger.withPrefix("meta-build")
val antiEntropy = configs.map(_.watchSettings.antiEntropy).max
val repo = getRepository(state)
buildGlobs.foreach(repo.register)
val monitor = FileEventMonitor.antiEntropy(
FileEventMonitor.antiEntropy(
repo,
antiEntropy,
l,
logger.withPrefix("meta-build"),
quarantinePeriod,
retentionPeriod
)
filter(monitor, () => buildGlobs) :: Nil
) :: Nil
} else Nil)
override def poll(duration: Duration, filter: Event => Boolean): Seq[Event] = {
val res = monitors.flatMap(_.poll(0.millis, filter)).toSet.toVector
@@ -605,22 +586,19 @@ object Continuous extends DeprecatedContinuous {
* they can clear the screen when the build triggers.
*/
val onTrigger: Watch.Event => Unit = { event: Watch.Event =>
configs.foreach { params =>
params.watchSettings.onTrigger.foreach(ot => ot(params.arguments(logger))(event))
}
if (configs.size == 1) {
val config = configs.head
config.watchSettings.triggerMessage match {
case Left(tm) => logger.info(tm(config.watchState(count.get())))
case Right(tm) => tm(count.get(), event, commands).foreach(logger.info(_))
case Right(tm) => tm(count.get(), event.path, commands).foreach(logger.info(_))
}
} else {
Watch.defaultOnTriggerMessage(count.get(), event, commands).foreach(logger.info(_))
Watch.defaultOnTriggerMessage(count.get(), event.path, commands).foreach(logger.info(_))
}
}
(() => {
val actions = antiEntropyMonitor.poll(2.milliseconds).map(onEvent)
val actions = antiEntropyMonitor.poll(2.milliseconds).flatMap(onEvent)
if (actions.exists(_._2 != Watch.Ignore)) {
val builder = new StringBuilder
val min = actions.minBy {
@@ -728,7 +706,7 @@ object Continuous extends DeprecatedContinuous {
fileEvent
.collect {
case (event, action) if action != Watch.Ignore =>
s"Received file event $action for ${event.path}." +
s"Received file event $action for $event." +
(if (action != min) s" Dropping in favor of input event: $min" else "")
}
.foreach(logger.debug(_))
@@ -755,6 +733,7 @@ object Continuous extends DeprecatedContinuous {
/**
* Generates a custom logger for the watch process that is able to log at a different level
* from the provided logger.
*
* @param logger the delegate logger.
* @param logLevel the log level for watch events
* @return the wrapped logger.
@@ -821,19 +800,15 @@ object Continuous extends DeprecatedContinuous {
key.get(Keys.watchInputParser).getOrElse(Watch.defaultInputParser)
val logLevel: Level.Value = key.get(Keys.watchLogLevel).getOrElse(Level.Info)
val onEnter: () => Unit = key.get(Keys.watchOnEnter).getOrElse(() => {})
val onEvent: Option[Arguments => Watch.Event => Watch.Action] = key.get(Keys.watchOnEvent)
val onExit: () => Unit = key.get(Keys.watchOnExit).getOrElse(() => {})
val onInputEvent: Option[WatchOnEvent] = key.get(Keys.watchOnInputEvent)
val onFileInputEvent: WatchOnEvent =
key.get(Keys.watchOnFileInputEvent).getOrElse(Watch.trigger)
val onIteration: Option[Int => Watch.Action] = key.get(Keys.watchOnIteration)
val onMetaBuildEvent: Option[WatchOnEvent] = key.get(Keys.watchOnMetaBuildEvent)
val onStart: Option[Arguments => () => Watch.Action] = key.get(Keys.watchOnStart)
val onTermination: Option[(Watch.Action, String, Int, State) => State] =
key.get(Keys.watchOnTermination)
val onTrigger: Option[Arguments => Watch.Event => Unit] = key.get(Keys.watchOnTrigger)
val onTriggerEvent: Option[WatchOnEvent] = key.get(Keys.watchOnTriggerEvent)
val startMessage: StartMessage = getStartMessage(key)
val trackMetaBuild: Boolean = key.get(Keys.watchTrackMetaBuild).getOrElse(true)
val triggerMessage: TriggerMessage[Watch.Event] = getTriggerMessage(key)
val triggerMessage: TriggerMessage = getTriggerMessage(key)
// Unlike the rest of the settings, InputStream is a TaskKey which means that if it is set,
// we have to use Extracted.runTask to get the value. The reason for this is because it is
@@ -846,22 +821,19 @@ object Continuous extends DeprecatedContinuous {
/**
* Container class for all of the components we need to setup a watch for a particular task or
* input task.
*
* @param key the [[ScopedKey]] instance for the task we will watch
* @param repository the task [[FileTreeRepository]] instance
* @param inputs the transitive task inputs (see [[InputGraph]])
* @param triggers the transitive triggers (see [[InputGraph]])
* @param watchSettings the [[WatchSettings]] instance for the task
*/
private final class Config private[internal] (
val key: ScopedKey[_],
val repository: FileTreeRepository[FileAttributes],
val inputs: () => Seq[Glob],
val triggers: Seq[Glob],
val inputs: () => Seq[DynamicInput],
val watchSettings: WatchSettings
) {
private[sbt] def watchState(count: Int): DeprecatedWatchState =
WatchState.empty(inputs() ++ triggers).withCount(count)
def arguments(logger: Logger): Arguments = new Arguments(logger, inputs(), triggers)
WatchState.empty(inputs().map(_.glob)).withCount(count)
def arguments(logger: Logger): Arguments = new Arguments(logger, inputs())
}
private def getStartMessage(key: ScopedKey[_])(implicit e: Extracted): StartMessage = Some {
lazy val default = key.get(Keys.watchStartMessage).getOrElse(Watch.defaultStartWatch)
@@ -869,7 +841,7 @@ object Continuous extends DeprecatedContinuous {
}
private def getTriggerMessage(
key: ScopedKey[_]
)(implicit e: Extracted): TriggerMessage[Watch.Event] = {
)(implicit e: Extracted): TriggerMessage = {
lazy val default =
key.get(Keys.watchTriggeredMessage).getOrElse(Watch.defaultOnTriggerMessage)
key.get(deprecatedWatchingMessage).map(Left(_)).getOrElse(Right(default))
@@ -883,6 +855,7 @@ object Continuous extends DeprecatedContinuous {
* foo/Compile/compile will pretty print as "foo / Compile / compile", not
* "ProjectRef($URI, foo) / compile / compile", where the ProjectRef part is just noise that
* is rarely relevant for debugging.
*
* @return the pretty printed output.
*/
def show: String = {
@@ -957,6 +930,7 @@ object Continuous extends DeprecatedContinuous {
* foo/Compile/compile will pretty print as "foo / Compile / compile", not
* "ProjectRef($URI, foo) / compile / compile", where the ProjectRef part is just noise that
* is rarely relevant for debugging.
*
* @return the pretty printed output.
*/
def show: String = s"${scopedKey.scope.show} / ${scopedKey.key}"
@@ -967,6 +941,7 @@ object Continuous extends DeprecatedContinuous {
/**
* Creates a logger that adds a prefix to the messages that it logs. The motivation is so that
* we can tell from which FileEventMonitor an event originated.
*
* @param prefix the string to prefix the message with
* @return the wrapped Logger.
*/


@@ -7,13 +7,14 @@
package sbt.internal
import java.nio.file.Path
import sbt.internal.io.{ WatchState => WS }
private[internal] trait DeprecatedContinuous {
protected type StartMessage =
Option[Either[WS => String, (Int, String, Seq[String]) => Option[String]]]
protected type TriggerMessage[Event] =
Either[WS => String, (Int, Event, Seq[String]) => Option[String]]
protected type TriggerMessage = Either[WS => String, (Int, Path, Seq[String]) => Option[String]]
protected type DeprecatedWatchState = WS
protected val deprecatedWatchingMessage = sbt.Keys.watchingMessage
protected val deprecatedTriggeredMessage = sbt.Keys.triggeredMessage


@@ -0,0 +1,45 @@
/*
* sbt
* Copyright 2011 - 2018, Lightbend, Inc.
* Copyright 2008 - 2010, Mark Harrah
* Licensed under Apache License 2.0 (see LICENSE)
*/
package sbt
package internal
import java.nio.file.{ WatchService => _ }
import sbt.nio.FileStamper
import sbt.nio.file.Glob
private[sbt] final case class DynamicInput(
glob: Glob,
fileStamper: FileStamper,
forceTrigger: Boolean
)
private[sbt] object DynamicInput {
implicit object ordering extends Ordering[DynamicInput] {
private implicit val globOrdering: Ordering[Glob] = Glob.ordering
private implicit object fileStamperOrdering extends Ordering[FileStamper] {
override def compare(left: FileStamper, right: FileStamper): Int = left match {
case FileStamper.Hash =>
right match {
case FileStamper.Hash => 0
case _ => -1
}
case FileStamper.LastModified =>
right match {
case FileStamper.LastModified => 0
case _ => 1
}
}
}
override def compare(left: DynamicInput, right: DynamicInput): Int = {
globOrdering.compare(left.glob, right.glob) match {
case 0 => fileStamperOrdering.compare(left.fileStamper, right.fileStamper)
case i => i
}
}
}
}
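The `DynamicInput` ordering above sorts primarily by glob and breaks ties on the stamper, with `Hash` ordered before `LastModified`. A self-contained sketch of just the stamper ordering, using stand-in types (not the sbt ones):

```scala
// Stand-in for sbt.nio.FileStamper and the fileStamperOrdering above.
sealed trait FileStamper
case object Hash extends FileStamper
case object LastModified extends FileStamper

// Mirrors fileStamperOrdering: Hash < LastModified.
implicit val stamperOrdering: Ordering[FileStamper] = new Ordering[FileStamper] {
  def compare(left: FileStamper, right: FileStamper): Int = (left, right) match {
    case (Hash, Hash) | (LastModified, LastModified) => 0
    case (Hash, _)                                   => -1
    case _                                           => 1
  }
}

assert(List[FileStamper](LastModified, Hash).sorted == List(Hash, LastModified))
```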


@@ -10,43 +10,53 @@ package sbt.internal
import java.nio.file.Paths
import java.util.Optional
import sbt.internal.inc.ExternalLookup
import sbt.io.AllPassFilter
import sbt.Def
import sbt.Keys._
import sbt.internal.inc.{ EmptyStamp, ExternalLookup, Stamper }
import sbt.io.syntax._
import sbt.nio.FileStamp
import sbt.nio.FileStamp.StampedFile
import sbt.nio.Keys._
import sbt.nio.file.RecursiveGlob
import sbt.nio.file.syntax._
import sbt.nio.file.{ FileAttributes, FileTreeView }
import xsbti.compile._
import xsbti.compile.analysis.Stamp
import scala.collection.JavaConverters._
import scala.collection.mutable
private[sbt] object ExternalHooks {
private val javaHome = Option(System.getProperty("java.home")).map(Paths.get(_))
def apply(
private[this] implicit class StampOps(val s: Stamp) extends AnyVal {
def hash: String = s.getHash.orElse("")
def lastModified: Long = s.getLastModified.orElse(-1L)
}
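`StampOps` above uses `java.util.Optional.orElse` to substitute sentinel defaults (`""` for a missing hash, `-1L` for a missing timestamp), for example:

```scala
import java.util.Optional

// orElse returns the wrapped value when present, the supplied default otherwise.
assert(Optional.of("abc").orElse("") == "abc")
assert(Optional.empty[String]().orElse("") == "")
assert(Optional.empty[java.lang.Long]().orElse(java.lang.Long.valueOf(-1L)).longValue == -1L)
```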
def default: Def.Initialize[sbt.Task[ExternalHooks]] = Def.task {
val attributeMap = fileAttributeMap.value
val cp = dependencyClasspath.value.map(_.data)
cp.foreach { file =>
val path = file.toPath
attributeMap.get(path) match {
case null => attributeMap.put(path, sbt.nio.FileStamp.lastModified(path))
case _ =>
}
}
val classGlob = classDirectory.value.toGlob / RecursiveGlob / "*.class"
fileTreeView.value.list(classGlob).foreach {
case (path, _) => attributeMap.put(path, sbt.nio.FileStamp.lastModified(path))
}
apply(
(compileOptions in compile).value,
(file: File) => {
attributeMap.get(file.toPath) match {
case null => EmptyStamp
case s => s.stamp
}
}
)
}
private def apply(
options: CompileOptions,
view: FileTreeView.Nio[FileAttributes]
attributeMap: File => Stamp
): DefaultExternalHooks = {
import scala.collection.JavaConverters._
val sources = options.sources()
val cachedSources = new java.util.HashMap[File, Stamp]
sources.foreach {
case sf: StampedFile => cachedSources.put(sf, sf.stamp)
case f: File => cachedSources.put(f, FileStamp.stamped(f))
}
val allBinaries = new java.util.HashMap[File, Stamp]
options.classpath.foreach {
case f if f.getName.endsWith(".jar") =>
view.list(f.toGlob) foreach {
case (p, a) => allBinaries.put(p.toFile, FileStamp(p, a).stamp)
}
case f =>
view.list(f ** AllPassFilter) foreach {
case (p, a) => allBinaries.put(p.toFile, FileStamp(p, a).stamp)
}
}
val lookup = new ExternalLookup {
override def changedSources(previousAnalysis: CompileAnalysis): Option[Changes[File]] = Some {
new Changes[File] {
@@ -60,19 +70,19 @@ private[sbt] object ExternalHooks {
previousAnalysis.readStamps().getAllSourceStamps.asScala
prevSources.foreach {
case (file: File, s: Stamp) =>
cachedSources.get(file) match {
attributeMap(file) match {
case null =>
getRemoved.add(file)
case stamp =>
if ((stamp.getHash.orElse("") == s.getHash.orElse("")) && (stamp.getLastModified
.orElse(-1L) == s.getLastModified.orElse(-1L))) {
val hash = (if (stamp.getHash.isPresent) stamp else Stamper.forHash(file)).hash
if (hash == s.hash) {
getUnmodified.add(file)
} else {
getChanged.add(file)
}
}
}
sources.foreach(file => if (!prevSources.contains(file)) getAdded.add(file))
options.sources.foreach(file => if (!prevSources.contains(file)) getAdded.add(file))
}
}
@@ -88,26 +98,23 @@ private[sbt] object ExternalHooks {
override def changedBinaries(previousAnalysis: CompileAnalysis): Option[Set[File]] = {
Some(previousAnalysis.readStamps.getAllBinaryStamps.asScala.flatMap {
case (file, stamp) =>
allBinaries.get(file) match {
case null =>
attributeMap(file) match {
case cachedStamp if stamp.getLastModified == cachedStamp.getLastModified => None
case _ =>
javaHome match {
case Some(h) if file.toPath.startsWith(h) => None
case _ => Some(file)
}
case cachedStamp if stamp == cachedStamp => None
case _ => Some(file)
}
}.toSet)
}
override def removedProducts(previousAnalysis: CompileAnalysis): Option[Set[File]] = {
Some(previousAnalysis.readStamps.getAllProductStamps.asScala.flatMap {
case (file, s) =>
allBinaries get file match {
case null => Some(file)
case stamp if stamp.getLastModified.orElse(0L) != s.getLastModified.orElse(0L) =>
Some(file)
case _ => None
case (file, stamp) =>
attributeMap(file) match {
case s if s.getLastModified == stamp.getLastModified => None
case _ => Some(file)
}
}.toSet)
}


@@ -1,29 +0,0 @@
/*
* sbt
* Copyright 2011 - 2018, Lightbend, Inc.
* Copyright 2008 - 2010, Mark Harrah
* Licensed under Apache License 2.0 (see LICENSE)
*/
package sbt
package internal
import java.nio.file.{ WatchService => _ }
import sbt.internal.util.appmacro.MacroDefaults
import sbt.nio.file.Glob
import scala.collection.mutable
import scala.language.experimental.macros
object FileTree {
private[sbt] trait DynamicInputs {
def value: Option[mutable.Set[Glob]]
}
private[sbt] object DynamicInputs {
def empty: DynamicInputs = new impl(Some(mutable.Set.empty[Glob]))
final val none: DynamicInputs = new impl(None)
private final class impl(override val value: Option[mutable.Set[Glob]]) extends DynamicInputs
implicit def default: DynamicInputs = macro MacroDefaults.dynamicInputs
}
}

View File

@ -1,88 +0,0 @@
/*
* sbt
* Copyright 2011 - 2018, Lightbend, Inc.
* Copyright 2008 - 2010, Mark Harrah
* Licensed under Apache License 2.0 (see LICENSE)
*/
package sbt
package internal
import java.nio.file.Path
import sbt.nio.file.{ FileAttributes, FileTreeView, Glob }
/**
 * Retrieves files described by globs. This is usually provided as an extension class for
 * sbt.nio.file.Glob (or a Traversable collection of Glob instances) that allows us to
 * actually retrieve the files corresponding to those globs.
*/
private[sbt] sealed trait GlobLister extends Any {
final def all(view: FileTreeView.Nio[FileAttributes]): Seq[(Path, FileAttributes)] = {
all(view, FileTree.DynamicInputs.empty)
}
/**
 * Get the sources described by this `GlobLister`. The result should not contain any
 * duplicate entries for a given path.
*
* @param view the file tree view
* @param dynamicInputs the task dynamic inputs to track for watch.
* @return the files described by this `GlobLister`.
*/
def all(
implicit view: FileTreeView.Nio[FileAttributes],
dynamicInputs: FileTree.DynamicInputs
): Seq[(Path, FileAttributes)]
}
/**
* Provides implicit definitions to provide a `GlobLister` given a Glob or
* Traversable[Glob].
*/
private[sbt] object GlobLister extends GlobListers
/**
* Provides implicit definitions to provide a `GlobLister` given a Glob or
* Traversable[Glob].
*/
private[sbt] trait GlobListers {
import GlobListers._
/**
 * Generate a GlobLister given a single `Glob`.
*
* @param source the input Glob
*/
implicit def fromGlob(source: Glob): GlobLister = new impl(source :: Nil)
/**
* Generate a GlobLister given a collection of Globs.
*
* @param sources the collection of sources
* @tparam T the source collection type
*/
implicit def fromTraversableGlob[T <: Traversable[Glob]](sources: T): GlobLister =
new impl(sources)
}
private[internal] object GlobListers {
/**
* Implements `GlobLister` given a collection of Globs. If the input collection type
* preserves uniqueness, e.g. `Set[Glob]`, then the output will be the unique source list.
* Otherwise duplicates are possible.
*
* @param globs the input globs
* @tparam T the collection type
*/
private class impl[T <: Traversable[Glob]](val globs: T) extends AnyVal with GlobLister {
override def all(
implicit view: FileTreeView.Nio[FileAttributes],
dynamicInputs: FileTree.DynamicInputs
): Seq[(Path, FileAttributes)] = {
dynamicInputs.value.foreach(_ ++= globs)
view.list(globs)
}
}
}
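The note above that uniqueness depends on the input collection type can be illustrated without sbt's `Glob` at all. A hedged sketch, with `Iterable[String]` standing in for `Traversable[Glob]` and `DedupSketch` a hypothetical name:

```scala
// Sketch: `all` delegates to the collection's own iteration, so whether
// duplicates survive depends entirely on the input collection type.
object DedupSketch {
  def globs(sources: Iterable[String]): Seq[String] = sources.toSeq
}
```

`DedupSketch.globs(List("a", "a"))` keeps both entries, while `DedupSketch.globs(Set("a", "a"))` collapses them, mirroring the `Set[Glob]` remark in the comment.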

View File

@ -15,16 +15,15 @@ import sbt.internal.io.Source
import sbt.internal.util.AttributeMap
import sbt.internal.util.complete.Parser
import sbt.io.syntax._
import sbt.nio.Keys._
import sbt.nio.file.Glob
import sbt.nio.FileStamper
import sbt.nio.Keys._
import scala.annotation.tailrec
object TransitiveGlobs {
val transitiveTriggers = Def.taskKey[Seq[Glob]]("The transitive triggers for a key")
val transitiveInputs = Def.taskKey[Seq[Glob]]("The transitive inputs for a key")
val transitiveGlobs =
Def.taskKey[(Seq[Glob], Seq[Glob])]("The transitive inputs and triggers for a key")
private[sbt] object TransitiveDynamicInputs {
val transitiveDynamicInputs =
Def.taskKey[Seq[DynamicInput]]("The transitive inputs and triggers for a key")
}
private[sbt] object InputGraph {
private implicit class SourceOps(val source: Source) {
@ -33,18 +32,12 @@ private[sbt] object InputGraph {
if (source.recursive) source.base ** filter else source.base * filter
}
}
private[sbt] def inputsTask: Def.Initialize[Task[Seq[Glob]]] =
Def.task(transitiveGlobs(arguments.value)._1.sorted)
private[sbt] def inputsTask(key: ScopedKey[_]): Def.Initialize[Task[Seq[Glob]]] =
withParams((e, cm) => Def.task(transitiveGlobs(argumentsImpl(key, e, cm).value)._1.sorted))
private[sbt] def triggersTask: Def.Initialize[Task[Seq[Glob]]] =
Def.task(transitiveGlobs(arguments.value)._2.sorted)
private[sbt] def triggersTask(key: ScopedKey[_]): Def.Initialize[Task[Seq[Glob]]] =
withParams((e, cm) => Def.task(transitiveGlobs(argumentsImpl(key, e, cm).value)._2.sorted))
private[sbt] def task: Def.Initialize[Task[(Seq[Glob], Seq[Glob])]] =
Def.task(transitiveGlobs(arguments.value))
private[sbt] def task(key: ScopedKey[_]): Def.Initialize[Task[(Seq[Glob], Seq[Glob])]] =
withParams((e, cm) => Def.task(transitiveGlobs(argumentsImpl(key, e, cm).value)))
private[sbt] def task: Def.Initialize[Task[Seq[DynamicInput]]] =
Def.task(transitiveDynamicInputs(arguments.value))
private[sbt] def task(
key: ScopedKey[_]
): Def.Initialize[Task[Seq[DynamicInput]]] =
withParams((e, cm) => Def.task(transitiveDynamicInputs(argumentsImpl(key, e, cm).value)))
private def withParams[R](
f: (Extracted, CompiledMap) => Def.Initialize[Task[R]]
): Def.Initialize[Task[R]] = Def.taskDyn {
@ -100,7 +93,7 @@ private[sbt] object InputGraph {
}
}.value
}
private[sbt] def transitiveGlobs(args: Arguments): (Seq[Glob], Seq[Glob]) = {
private[sbt] def transitiveDynamicInputs(args: Arguments): Seq[DynamicInput] = {
import args._
val taskScope = Project.fillTaskAxis(scopedKey).scope
def delegates(sk: ScopedKey[_]): Seq[ScopedKey[_]] =
@ -111,15 +104,35 @@ private[sbt] object InputGraph {
val allKeys: Seq[ScopedKey[_]] =
(delegates(scopedKey).toSet ++ delegates(ScopedKey(taskScope, watchTriggers.key))).toSeq
val keys = collectKeys(args, allKeys, Set.empty, Set.empty)
def getGlobs(scopedKey: ScopedKey[Seq[Glob]]): Seq[Glob] =
data.get(scopedKey.scope).flatMap(_.get(scopedKey.key)).getOrElse(Nil)
val (inputGlobs, triggerGlobs) = keys.partition(_.key == fileInputs.key) match {
case (i, t) => (i.flatMap(getGlobs), t.flatMap(getGlobs))
def getDynamicInputs(scopedKey: ScopedKey[Seq[Glob]], trigger: Boolean): Seq[DynamicInput] = {
data
.get(scopedKey.scope)
.map { am =>
am.get(scopedKey.key) match {
case Some(globs: Seq[Glob]) =>
if (trigger) {
val stamper = am.get(fileStamper.key).getOrElse(FileStamper.Hash)
val forceTrigger = am.get(watchForceTriggerOnAnyChange.key).getOrElse(false)
globs.map(g => DynamicInput(g, stamper, forceTrigger))
} else {
globs.map(g => DynamicInput(g, FileStamper.LastModified, forceTrigger = true))
}
case None => Nil: Seq[DynamicInput]
}
}
.getOrElse(Nil)
}
(inputGlobs.distinct, (triggerGlobs ++ legacy(keys :+ scopedKey, args)).distinct)
val (inputGlobs, triggerGlobs) = keys.partition(_.key == fileInputs.key) match {
case (inputs, triggers) =>
(
inputs.flatMap(getDynamicInputs(_, trigger = false)),
triggers.flatMap(getDynamicInputs(_, trigger = true))
)
}
(inputGlobs ++ triggerGlobs ++ legacy(keys :+ scopedKey, args)).distinct.sorted
}
private def legacy(keys: Seq[ScopedKey[_]], args: Arguments): Seq[Glob] = {
private def legacy(keys: Seq[ScopedKey[_]], args: Arguments): Seq[DynamicInput] = {
import args._
val projectScopes =
keys.view
@ -143,10 +156,12 @@ private[sbt] object InputGraph {
None
}
}.toSeq
def toDynamicInput(glob: Glob): DynamicInput =
DynamicInput(glob, FileStamper.LastModified, forceTrigger = true)
scopes.flatMap {
case Left(scope) =>
extracted.runTask(Keys.watchSources in scope, state)._2.map(_.toGlob)
case Right(globs) => globs
extracted.runTask(Keys.watchSources in scope, state)._2.map(s => toDynamicInput(s.toGlob))
case Right(globs) => globs.map(toDynamicInput)
}
}
@tailrec

View File

@ -8,47 +8,32 @@
package sbt
package internal
import BuildPaths._
import BuildStreams._
import collection.mutable
import compiler.Eval
import Def.{ isDummy, ScopedKey, ScopeLocal, Setting }
import java.io.File
import java.net.URI
import Keys.{
appConfiguration,
baseDirectory,
configuration,
exportedProducts,
fullClasspath,
fullResolvers,
isMetaBuild,
loadedBuild,
onLoadMessage,
pluginData,
resolvedScoped,
sbtPlugin,
scalacOptions,
streams,
thisProject,
thisProjectRef,
update
}
import Project.inScope
import sbt.BuildPaths._
import sbt.Def.{ ScopeLocal, ScopedKey, Setting, isDummy }
import sbt.Keys._
import sbt.Project.inScope
import sbt.Scope.GlobalScope
import sbt.compiler.Eval
import sbt.internal.BuildStreams._
import sbt.internal.inc.classpath.ClasspathUtilities
import sbt.librarymanagement.ivy.{ InlineIvyConfiguration, IvyDependencyResolution, IvyPaths }
import sbt.internal.inc.{ ZincLmUtil, ZincUtil, ScalaInstance }
import sbt.internal.inc.{ ScalaInstance, ZincLmUtil, ZincUtil }
import sbt.internal.util.Attributed.data
import sbt.internal.util.Types.const
import sbt.internal.util.{ Attributed, Settings, ~> }
import sbt.io.{ GlobFilter, IO, Path }
import sbt.librarymanagement.ivy.{ InlineIvyConfiguration, IvyDependencyResolution, IvyPaths }
import sbt.librarymanagement.{ Configuration, Configurations, Resolver }
import sbt.util.{ Show, Logger }
import scala.annotation.tailrec
import scala.tools.nsc.reporters.ConsoleReporter
import Scope.GlobalScope
import sbt.nio.Settings
import sbt.util.{ Logger, Show }
import xsbti.compile.{ ClasspathOptionsUtil, Compilers }
import scala.annotation.tailrec
import scala.collection.mutable
import scala.tools.nsc.reporters.ConsoleReporter
private[sbt] object Load {
// note that there is State passed in but not pulled out
def defaultLoad(
@ -415,8 +400,14 @@ private[sbt] object Load {
uri: URI,
rootProject: URI => String,
settings: Seq[Setting[_]]
): Seq[Setting[_]] =
Project.transform(Scope.resolveScope(thisScope, uri, rootProject), settings)
): Seq[Setting[_]] = {
val transformed = Project.transform(Scope.resolveScope(thisScope, uri, rootProject), settings)
transformed.flatMap {
case s if s.key.key == sbt.nio.Keys.fileInputs.key =>
Seq[Setting[_]](s, Settings.allPathsAndAttributes(s.key), Settings.fileStamps(s.key))
case s => s :: Nil
}
}
def projectScope(project: Reference): Scope = Scope(Select(project), Zero, Zero, Zero)
@ -836,7 +827,6 @@ private[sbt] object Load {
* @param makeOrDiscoverRoot True if we should autogenerate a root project.
* @param buildUri The URI of the build this is loading
* @param context The plugin management context for autogenerated IDs.
*
* @return The completely resolved/updated sequence of projects defined, with all settings expanded.
*
* TODO - We want to attach the known (at this time) vals/lazy vals defined in each project's
@ -1030,7 +1020,6 @@ private[sbt] object Load {
*
* Ordering all Setting[_]s for the project
*
*
* @param p The project with manipulation.
* @param projectPlugins The deduced list of plugins for the given project.
* @param loadedPlugins The project definition (and classloader) of the build.
@ -1152,7 +1141,7 @@ private[sbt] object Load {
merge(fs.sortBy(_.getName).map(memoLoadSettingsFile))
// Finds all the build files associated with this project
import AddSettings.{ SbtFiles, DefaultSbtFiles, Sequence }
import AddSettings.{ DefaultSbtFiles, SbtFiles, Sequence }
def associatedFiles(auto: AddSettings): Seq[File] = auto match {
case sf: SbtFiles => sf.files.map(f => IO.resolve(projectBase, f)).filterNot(_.isHidden)
case sf: DefaultSbtFiles => defaultSbtFiles.filter(sf.include).filterNot(_.isHidden)

View File

@ -7,60 +7,40 @@
package sbt.nio
import java.io.{ File, IOException }
import java.nio.file.Path
import java.util
import java.io.IOException
import java.nio.file.{ Path, Paths }
import sbt.internal.Repository
import sbt.internal.inc.{ EmptyStamp, Stamper, LastModified => IncLastModified }
import sbt.internal.util.AttributeKey
import sbt.io.IO
import sbt.nio.file.FileAttributes
import sbt.{ Def, Task }
import xsbti.compile.analysis.Stamp
import sjsonnew.{ Builder, JsonFormat, Unbuilder, deserializationError }
import xsbti.compile.analysis.{ Stamp => XStamp }
import scala.util.Try
sealed trait FileStamp
object FileStamp {
private[nio] type Id[T] = T
private[nio] val attributeMapKey =
AttributeKey[util.HashMap[Path, (Option[Hash], Option[LastModified])]]("task-attribute-map")
private[sbt] def fileHashMap: Def.Initialize[Task[Repository[Id, Path, Hash]]] = Def.task {
val attributeMap = Keys.fileAttributeMap.value
path: Path =>
attributeMap.get(path) match {
case null =>
val h = hash(path)
attributeMap.put(path, (Some(h), None))
h
case (Some(h), _) => h
case (None, lm) =>
val h = hash(path)
attributeMap.put(path, (Some(h), lm))
h
}
}
private[sbt] final class StampedFile(path: Path, val stamp: Stamp)
extends java.io.File(path.toString)
private[sbt] val stampedFile: ((Path, FileAttributes)) => File = {
case (p: Path, a: FileAttributes) => new StampedFile(p, apply(p, a).stamp)
}
private[sbt] val stamped: File => Stamp = file => {
val path = file.toPath
FileAttributes(path).map(apply(path, _).stamp).getOrElse(EmptyStamp)
}
sealed trait FileStamper
object FileStamper {
case object Hash extends FileStamper
case object LastModified extends FileStamper
}
private[sbt] sealed trait FileStamp
private[sbt] object FileStamp {
private[sbt] type Id[T] = T
private[sbt] implicit class Ops(val fileStamp: FileStamp) {
private[sbt] def stamp: Stamp = fileStamp match {
private[sbt] def stamp: XStamp = fileStamp match {
case f: FileHashImpl => f.xstamp
case LastModified(time) => new IncLastModified(time)
case _ => EmptyStamp
}
}
private[sbt] val extractor: Try[FileStamp] => FileStamp = (_: Try[FileStamp]).getOrElse(Empty)
private[sbt] val converter: (Path, FileAttributes) => Try[FileStamp] = (p, a) => Try(apply(p, a))
def apply(path: Path, fileStamper: FileStamper): FileStamp = fileStamper match {
case FileStamper.Hash => hash(path)
case FileStamper.LastModified => lastModified(path)
}
def apply(path: Path, fileAttributes: FileAttributes): FileStamp =
try {
if (fileAttributes.isDirectory) lastModified(path)
@ -73,11 +53,125 @@ object FileStamp {
} catch {
case e: IOException => Error(e)
}
def hash(string: String): Hash = new FileHashImpl(sbt.internal.inc.Hash.unsafeFromString(string))
def hash(path: Path): Hash = new FileHashImpl(Stamper.forHash(path.toFile))
def lastModified(path: Path): LastModified = LastModified(IO.getModifiedTimeOrZero(path.toFile))
private[this] class FileHashImpl(val xstamp: Stamp) extends Hash(xstamp.getHash.orElse(""))
private[this] class FileHashImpl(val xstamp: XStamp) extends Hash(xstamp.getHash.orElse(""))
sealed abstract case class Hash private[sbt] (hex: String) extends FileStamp
case class LastModified private[sbt] (time: Long) extends FileStamp
case class Error(exception: IOException) extends FileStamp
case object Empty extends FileStamp
implicit val pathJsonFormatter: JsonFormat[Seq[Path]] = new JsonFormat[Seq[Path]] {
override def write[J](obj: Seq[Path], builder: Builder[J]): Unit = {
builder.beginArray()
obj.foreach { path =>
builder.writeString(path.toString)
}
builder.endArray()
}
override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): Seq[Path] =
jsOpt match {
case Some(js) =>
val size = unbuilder.beginArray(js)
val res = (1 to size) map { _ =>
Paths.get(unbuilder.readString(unbuilder.nextElement))
}
unbuilder.endArray()
res
case None =>
deserializationError("Expected JsArray but found None")
}
}
implicit val fileStampJsonFormatter: JsonFormat[Seq[(Path, FileStamp)]] =
new JsonFormat[Seq[(Path, FileStamp)]] {
override def write[J](obj: Seq[(Path, FileStamp)], builder: Builder[J]): Unit = {
val (hashes, lastModifiedTimes) = obj.partition(_._2.isInstanceOf[Hash])
builder.beginObject()
builder.addField("hashes", hashes.asInstanceOf[Seq[(Path, Hash)]])(fileHashJsonFormatter)
builder.addField(
"lastModifiedTimes",
lastModifiedTimes.asInstanceOf[Seq[(Path, LastModified)]]
)(
fileLastModifiedJsonFormatter
)
builder.endObject()
}
override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): Seq[(Path, FileStamp)] =
jsOpt match {
case Some(js) =>
unbuilder.beginObject(js)
val hashes = unbuilder.readField("hashes")(fileHashJsonFormatter)
val lastModifieds =
unbuilder.readField("lastModifiedTimes")(fileLastModifiedJsonFormatter)
unbuilder.endObject()
hashes ++ lastModifieds
case None =>
deserializationError("Expected JsObject but found None")
}
}
val fileHashJsonFormatter: JsonFormat[Seq[(Path, Hash)]] =
new JsonFormat[Seq[(Path, Hash)]] {
override def write[J](obj: Seq[(Path, Hash)], builder: Builder[J]): Unit = {
builder.beginArray()
obj.foreach {
case (p, h) =>
builder.beginArray()
builder.writeString(p.toString)
builder.writeString(h.hex)
builder.endArray()
}
builder.endArray()
}
override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): Seq[(Path, Hash)] =
jsOpt match {
case Some(js) =>
val size = unbuilder.beginArray(js)
val res = (1 to size) map { _ =>
unbuilder.beginArray(unbuilder.nextElement)
val path = Paths.get(unbuilder.readString(unbuilder.nextElement))
val hash = FileStamp.hash(unbuilder.readString(unbuilder.nextElement))
unbuilder.endArray()
path -> hash
}
unbuilder.endArray()
res
case None =>
deserializationError("Expected JsArray but found None")
}
}
val fileLastModifiedJsonFormatter: JsonFormat[Seq[(Path, LastModified)]] =
new JsonFormat[Seq[(Path, LastModified)]] {
override def write[J](obj: Seq[(Path, LastModified)], builder: Builder[J]): Unit = {
builder.beginArray()
obj.foreach {
case (p, lm) =>
builder.beginArray()
builder.writeString(p.toString)
builder.writeLong(lm.time)
builder.endArray()
}
builder.endArray()
}
override def read[J](jsOpt: Option[J], unbuilder: Unbuilder[J]): Seq[(Path, LastModified)] =
jsOpt match {
case Some(js) =>
val size = unbuilder.beginArray(js)
val res = (1 to size) map { _ =>
unbuilder.beginArray(unbuilder.nextElement)
val path = Paths.get(unbuilder.readString(unbuilder.nextElement))
val hash = FileStamp.LastModified(unbuilder.readLong(unbuilder.nextElement))
unbuilder.endArray()
path -> hash
}
unbuilder.endArray()
res
case None =>
deserializationError("Expected JsArray but found None")
}
}
final case class LastModified private[sbt] (time: Long) extends FileStamp
final case class Error(exception: IOException) extends FileStamp
}

View File

@ -10,19 +10,46 @@ package sbt.nio
import java.nio.file.Path
import sbt.BuildSyntax.{ settingKey, taskKey }
import sbt.nio.file.Glob
import sbt.internal.util.AttributeKey
import sbt.nio.file.{ FileAttributes, FileTreeView, Glob }
object Keys {
val allPaths = taskKey[Seq[Path]](
"All of the file inputs for a task with no filters applied. Regular files and directories are included."
)
val changedFiles =
taskKey[Seq[Path]](
"All of the file inputs for a task that have changed since the last run. Includes new and modified files but excludes deleted files."
)
val modifiedFiles =
taskKey[Seq[Path]](
"All of the file inputs for a task that have been modified since the last run. New and deleted files are excluded. Files are considered modified based on either the last modified time or the file stamp for the file."
)
val removedFiles =
taskKey[Seq[Path]]("All of the file inputs for a task that have been removed since the last run.")
val allFiles =
taskKey[Seq[Path]]("All of the file inputs for a task excluding directories and hidden files.")
val fileInputs = settingKey[Seq[Glob]](
"The file globs that are used by a task. This setting will generally be scoped per task. It will also be used to determine the sources to watch during continuous execution."
)
val fileOutputs = taskKey[Seq[Glob]]("Describes the output files of a task")
val fileHashes = taskKey[Seq[(Path, FileStamp.Hash)]]("Retrieves the hashes for a set of files")
val fileLastModifiedTimes = taskKey[Seq[(Path, FileStamp.LastModified)]](
"Retrieves the last modified times for a set of files"
val fileOutputs = taskKey[Seq[Glob]]("Describes the output files of a task.")
val fileStamper = settingKey[FileStamper](
"Toggles the file stamping implementation used to determine whether or not a file has been modified."
)
val fileTreeView =
taskKey[FileTreeView.Nio[FileAttributes]]("A view of the local file system tree")
private[sbt] val fileStamps =
taskKey[Seq[(Path, FileStamp)]]("Retrieves the file stamps for a set of files.")
private[sbt] type FileAttributeMap =
java.util.HashMap[Path, FileStamp]
private[sbt] val persistentFileAttributeMap =
AttributeKey[FileAttributeMap]("persistent-file-attribute-map", Int.MaxValue)
private[sbt] val allPathsAndAttributes =
taskKey[Seq[(Path, FileAttributes)]]("Get all of the file inputs for a task")
private[sbt] val fileAttributeMap = taskKey[FileAttributeMap](
"Map of file stamps that may be cleared between task evaluation runs."
)
private[sbt] val stamper = taskKey[Path => FileStamp](
"A function that computes a file stamp for a path. It may have the side effect of updating a cache."
)
private[sbt] val fileAttributeMap =
taskKey[java.util.HashMap[Path, (Option[FileStamp.Hash], Option[FileStamp.LastModified])]](
"Map of file stamps that may be cleared between task evaluation runs."
)
}

View File

@ -0,0 +1,186 @@
/*
* sbt
* Copyright 2011 - 2018, Lightbend, Inc.
* Copyright 2008 - 2010, Mark Harrah
* Licensed under Apache License 2.0 (see LICENSE)
*/
package sbt
package nio
import java.nio.file.{ Files, Path }
import sbt.Keys._
import sbt.internal.{ Continuous, DynamicInput }
import sbt.nio.FileStamp.{ fileStampJsonFormatter, pathJsonFormatter }
import sbt.nio.FileStamper.{ Hash, LastModified }
import sbt.nio.Keys._
private[sbt] object Settings {
/**
 * This adds the [[sbt.Keys.taskDefinitionKey]] to the info of each [[Task]]. Without
* this, the previous macro doesn't work correctly because [[Previous]] is unable to
* reference the task.
*
 * @param setting the [[Def.Setting]] for which we add the task definition
* @tparam T the generic type of the task (needed for type checking because [[Task]] is invariant)
* @return the setting with the task definition
*/
private[this] def addTaskDefinition[T](setting: Def.Setting[Task[T]]): Def.Setting[Task[T]] =
setting.mapInit((sk, task) => Task(task.info.set(sbt.Keys.taskDefinitionKey, sk), task.work))
/**
* Returns all of the paths described by a glob along with their basic file attributes.
* No additional filtering is performed.
*
* @param scopedKey the key whose fileInputs we are seeking
* @return a task definition that retrieves the file input files and their attributes scoped to a particular task.
*/
private[sbt] def allPathsAndAttributes(scopedKey: Def.ScopedKey[_]): Def.Setting[_] =
Keys.allPathsAndAttributes in scopedKey.scope := {
val view = (fileTreeView in scopedKey.scope).value
val inputs = (fileInputs in scopedKey.scope).value
val stamper = (fileStamper in scopedKey.scope).value
val forceTrigger = (watchForceTriggerOnAnyChange in scopedKey.scope).value
val dynamicInputs = Continuous.dynamicInputs.value
sbt.Keys.state.value.get(globalFileTreeRepository).foreach { repo =>
inputs.foreach(repo.register)
}
dynamicInputs.foreach(_ ++= inputs.map(g => DynamicInput(g, stamper, forceTrigger)))
view.list(inputs)
}
/**
 * Returns all of the paths described by a glob. No additional filtering is performed.
*
* @param scopedKey the key whose file inputs we are seeking
* @return a task definition that retrieves the input files and their attributes scoped to a particular task.
*/
private[sbt] def allPaths(scopedKey: Def.ScopedKey[_]): Def.Setting[_] =
addTaskDefinition(Keys.allPaths in scopedKey.scope := {
(Keys.allPathsAndAttributes in scopedKey.scope).value.map(_._1)
})
/**
* Returns all of the paths for the regular files described by a glob. Directories and hidden
* files are excluded.
*
* @param scopedKey the key whose file inputs we are seeking
* @return a task definition that retrieves all of the input paths scoped to the input key.
*/
private[sbt] def allFiles(scopedKey: Def.ScopedKey[_]): Def.Setting[_] =
addTaskDefinition(Keys.allFiles in scopedKey.scope := {
(Keys.allPathsAndAttributes in scopedKey.scope).value.collect {
case (p, a) if a.isRegularFile && !Files.isHidden(p) => p
}
})
/**
* Returns all of the regular files whose stamp has changed since the last time the
* task was evaluated. The result includes new and modified files but not deleted
* files or files whose stamp has not changed since the previous run. Directories and hidden
 * files are excluded.
*
* @param scopedKey the key whose fileInputs we are seeking
* @return a task definition that retrieves the changed input files scoped to the key.
*/
private[sbt] def changedFiles(scopedKey: Def.ScopedKey[_]): Seq[Def.Setting[_]] =
addTaskDefinition(Keys.changedFiles in scopedKey.scope := {
val current = (Keys.fileStamps in scopedKey.scope).value
(Keys.fileStamps in scopedKey.scope).previous match {
case Some(previous) => (current diff previous).map(_._1)
case None => current.map(_._1)
}
}) :: (watchForceTriggerOnAnyChange in scopedKey.scope := {
(watchForceTriggerOnAnyChange in scopedKey.scope).?.value match {
case Some(t) => t
case None => false
}
}) :: Nil
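The diff against the previous stamps can be sketched with plain `(path, stamp)` pairs. This is a simplified model of the logic above, not the sbt API (`ChangedFilesSketch` is a hypothetical name):

```scala
// Sketch of changedFiles: any (path, stamp) pair absent from the previous
// snapshot counts as changed; on the first run everything does.
object ChangedFilesSketch {
  type Stamped = (String, String) // (path, stamp)

  def changed(current: Seq[Stamped], previous: Option[Seq[Stamped]]): Seq[String] =
    previous match {
      case Some(prev) => (current diff prev).map(_._1) // new and modified files
      case None       => current.map(_._1)             // first run: all files
    }
}
```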
/**
* Returns all of the regular files and the corresponding file stamps for the file inputs
* scoped to the input key. Directories and hidden files are excluded.
*
* @param scopedKey the key whose fileInputs we are seeking
* @return a task definition that retrieves the input files and their file stamps scoped to the
* input key.
*/
private[sbt] def fileStamps(scopedKey: Def.ScopedKey[_]): Def.Setting[_] =
addTaskDefinition(Keys.fileStamps in scopedKey.scope := {
val stamper = (Keys.stamper in scopedKey.scope).value
(Keys.allPathsAndAttributes in scopedKey.scope).value.collect {
case (p, a) if a.isRegularFile && !Files.isHidden(p) => p -> stamper(p)
}
})
/**
* Returns all of the regular files whose stamp has changed since the last time the
 * task was evaluated. The result includes modified files but excludes new files, deleted
 * files, and files whose stamp has not changed since the previous run. Directories and
* hidden files are excluded.
*
* @param scopedKey the key whose modified files we are seeking
* @return a task definition that retrieves the changed input files scoped to the key.
*/
private[sbt] def modifiedFiles(scopedKey: Def.ScopedKey[_]): Seq[Def.Setting[_]] =
    addTaskDefinition(Keys.modifiedFiles in scopedKey.scope := {
      val current = (Keys.fileStamps in scopedKey.scope).value
      (Keys.fileStamps in scopedKey.scope).previous match {
        case Some(previous) =>
          val previousPathSet = previous.view.map(_._1).toSet
          (current diff previous).collect { case (p, a) if previousPathSet(p) => p }
        case None => current.map(_._1)
      }
    }) ::
(watchForceTriggerOnAnyChange in scopedKey.scope := {
(watchForceTriggerOnAnyChange in scopedKey.scope).?.value match {
case Some(t) => t
case None => false
}
}) :: Nil
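The extra filtering that distinguishes modifiedFiles from changedFiles, keeping only paths that already existed in the previous snapshot, can be sketched the same way. Again a simplified model, not the sbt API (`ModifiedFilesSketch` is a hypothetical name):

```scala
// Sketch of modifiedFiles: take the stamp diff, then keep only paths that
// were present in the previous snapshot (i.e. drop newly added files).
object ModifiedFilesSketch {
  type Stamped = (String, String) // (path, stamp)

  def modified(current: Seq[Stamped], previous: Seq[Stamped]): Seq[String] = {
    val previousPaths = previous.map(_._1).toSet
    (current diff previous).collect { case (p, _) if previousPaths(p) => p }
  }
}
```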
/**
 * Returns all of the files that have been removed since the last time the task was
 * evaluated. Files that still exist are excluded whether or not they have been modified.
 * Directories and hidden files are excluded.
*
* @param scopedKey the key whose removed files we are seeking
* @return a task definition that retrieves the changed input files scoped to the key.
*/
private[sbt] def removedFiles(scopedKey: Def.ScopedKey[_]): Def.Setting[_] =
addTaskDefinition(Keys.removedFiles in scopedKey.scope := {
val current = (Keys.allFiles in scopedKey.scope).value
(Keys.allFiles in scopedKey.scope).previous match {
case Some(previous) => previous diff current
case None => Nil
}
    })
/**
* Returns a function from `Path` to [[FileStamp]] that can be used by tasks to retrieve
* the stamp for a file. It has the side effect of stamping the file if it has not already
* been stamped during the task evaluation.
*
* @return a task definition for a function from `Path` to [[FileStamp]].
*/
private[sbt] def stamper(scopedKey: Def.ScopedKey[_]): Def.Setting[_] =
addTaskDefinition((Keys.stamper in scopedKey.scope) := {
val attributeMap = Keys.fileAttributeMap.value
val stamper = (Keys.fileStamper in scopedKey.scope).value
path: Path =>
attributeMap.get(path) match {
case null =>
val stamp = stamper match {
case Hash => FileStamp.hash(path)
case LastModified => FileStamp.lastModified(path)
}
attributeMap.put(path, stamp)
stamp
case s => s
}
})
}
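The stamp cache behind the stamper task, which computes a stamp at most once per evaluation run, can be sketched with a plain `java.util.HashMap`. This is a hypothetical helper illustrating the caching pattern, not the actual task wiring:

```scala
import java.util.HashMap

// Sketch of the stamper caching: the first lookup for a path computes and
// stores the stamp; subsequent lookups return the cached value.
object StamperSketch {
  def cachedStamper(compute: String => String): String => String = {
    val cache = new HashMap[String, String]
    path =>
      cache.get(path) match {
        case null =>
          val stamp = compute(path)
          cache.put(path, stamp)
          stamp
        case stamp => stamp
      }
  }
}
```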

View File

@ -0,0 +1,43 @@
package sbt.internal
import java.nio.file.{ Path, Paths }
import org.scalatest.FlatSpec
import sbt.nio.FileStamp
import sbt.nio.FileStamp._
import sjsonnew.support.scalajson.unsafe.Converter
class FileStampJsonSpec extends FlatSpec {
"file hashes" should "be serializable" in {
val hashes = Seq(
Paths.get("foo") -> FileStamp.hash("bar"),
Paths.get("bar") -> FileStamp.hash("buzz")
)
val json = Converter.toJsonUnsafe(hashes)(fileHashJsonFormatter)
val deserialized = Converter.fromJsonUnsafe(json)(fileHashJsonFormatter)
assert(hashes == deserialized)
}
"file last modified times" should "be serializable" in {
val lastModifiedTimes = Seq(
Paths.get("foo") -> FileStamp.LastModified(1234),
Paths.get("bar") -> FileStamp.LastModified(5678)
)
val json = Converter.toJsonUnsafe(lastModifiedTimes)(fileLastModifiedJsonFormatter)
val deserialized = Converter.fromJsonUnsafe(json)(fileLastModifiedJsonFormatter)
assert(lastModifiedTimes == deserialized)
}
"both" should "be serializable" in {
val hashes = Seq(
Paths.get("foo") -> FileStamp.hash("bar"),
Paths.get("bar") -> FileStamp.hash("buzz")
)
val lastModifiedTimes = Seq(
Paths.get("foo") -> FileStamp.LastModified(1234),
Paths.get("bar") -> FileStamp.LastModified(5678)
)
val both: Seq[(Path, FileStamp)] = hashes ++ lastModifiedTimes
val json = Converter.toJsonUnsafe(both)(fileStampJsonFormatter)
val deserialized = Converter.fromJsonUnsafe(json)(fileStampJsonFormatter)
assert(both.sameElements(deserialized))
}
}
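The "both" round trip above works because the formatter partitions hash stamps from last-modified stamps before writing, preserving membership and per-group order; an interleaved input would come back regrouped. A minimal sketch of that regrouping, with `Either` standing in for `FileStamp` and `PartitionSketch` a hypothetical name:

```scala
// Sketch: serialization groups hash stamps (Left) before last-modified
// stamps (Right), so an interleaved sequence round-trips in regrouped order.
object PartitionSketch {
  def regroup(
      stamps: Seq[(String, Either[String, Long])]
  ): Seq[(String, Either[String, Long])] = {
    val (hashes, lastModified) = stamps.partition(_._2.isLeft)
    hashes ++ lastModified
  }
}
```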

View File

@ -5,6 +5,10 @@
* Licensed under Apache License 2.0 (see LICENSE)
*/
import sbt.nio.FileStamp
import sjsonnew.JsonFormat
import java.nio.file.{ Path => NioPath }
import scala.language.experimental.macros
package object sbt
@ -21,8 +25,7 @@ package object sbt
with sbt.BuildSyntax
with sbt.OptionSyntax
with sbt.SlashSyntax
with sbt.Import
with sbt.internal.GlobListers {
with sbt.Import {
// IO
def uri(s: String): URI = new URI(s)
def file(s: String): File = new File(s)
@ -30,7 +33,9 @@ package object sbt
implicit def fileToRichFile(file: File): sbt.io.RichFile = new sbt.io.RichFile(file)
implicit def filesToFinder(cc: Traversable[File]): sbt.io.PathFinder =
sbt.io.PathFinder.strict(cc)
implicit val fileStampJsonFormatter: JsonFormat[Seq[(NioPath, FileStamp)]] =
FileStamp.fileStampJsonFormatter
implicit val pathJsonFormatter: JsonFormat[Seq[NioPath]] = FileStamp.pathJsonFormatter
// others
object CompileOrder {

View File

@ -12,7 +12,7 @@ private[sbt] trait IOSyntax0 extends IOSyntax1 {
override def |(g: A => Option[B]): A => Option[B] = (a: A) => f(a) orElse g(a)
}
}
private[sbt] trait IOSyntax1 extends sbt.io.IOSyntax
private[sbt] trait Alternative[A, B] {
private[sbt] sealed trait IOSyntax1 extends sbt.io.IOSyntax with sbt.nio.file.syntax0
private[sbt] sealed trait Alternative[A, B] {
def |(g: A => Option[B]): A => Option[B]
}

View File

@ -1,4 +1,4 @@
import sbt.nio.file.syntax._
import sbt.nio.file.Glob
Compile / sourceGenerators += Def.task {
val files = Seq(sourceManaged.value / "foo.txt", sourceManaged.value / "bar.txt")
@ -6,4 +6,4 @@ Compile / sourceGenerators += Def.task {
files
}
cleanKeepGlobs += (sourceManaged.value / "bar.txt").toGlob
cleanKeepGlobs += Glob(sourceManaged.value, "bar.txt")

View File

@ -1,4 +1,4 @@
import sbt.nio.file.syntax._
import sbt.nio.file.Glob
cleanKeepGlobs in Compile +=
((classDirectory in Compile in compile).value / "X.class").toGlob
Glob((classDirectory in Compile in compile).value, "X.class")

View File

@ -1 +0,0 @@
val root = Build.root

View File

@ -1,33 +0,0 @@
import java.nio.file.{ Path, Paths }
import sbt._
import sbt.io.Glob
import sbt.Keys._
object Build {
val simpleTest = taskKey[Unit]("Check that glob file selectors work")
val relativeSubdir = Paths.get("subdir")
val relativeFiles =
Seq(Paths.get("foo.txt"), Paths.get("bar.json"), relativeSubdir.resolve("baz.yml"))
val files = taskKey[Path]("The files subdirectory")
val subdir = taskKey[Path]("The subdir path in the files subdirectory")
val allFiles = taskKey[Seq[Path]]("Returns all of the regular files in the files subdirectory")
private def check(actual: Any, expected: Any): Unit =
if (actual != expected) throw new IllegalStateException(s"$actual did not equal $expected")
val root = (project in file("."))
.settings(
files := (baseDirectory.value / "files").toPath,
subdir := files.value.resolve("subdir"),
allFiles := {
val f = files.value
relativeFiles.map(f.resolve(_))
},
simpleTest := {
val allPaths: Glob = files.value.allPaths
val af = allFiles.value.toSet
val sub = subdir.value
check(allPaths.all.map(_._1).toSet, af + sub)
check(allPaths.all.filter(_._2.isRegularFile).map(_._1).toSet, af)
check(allPaths.all.filter(_._2.isDirectory).map(_._1).toSet, Set(sub))
}
)
}

View File

@ -1 +0,0 @@
> simpleTest

View File

@ -0,0 +1 @@
### Bar

View File

@ -0,0 +1,11 @@
import sbt.nio.Keys._
import sbt.nio.file._
val fileInputTask = taskKey[Unit]("task with file inputs")
fileInputTask / fileInputs += Glob(baseDirectory.value / "base", "*.md")
fileInputTask := Def.taskDyn {
if ((fileInputTask / changedFiles).value.nonEmpty) Def.task(assert(true))
else Def.task(assert(false))
}.value
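This scripted build exercises the injected `changedFiles` key directly. For contrast, a minimal sketch of the caching pattern described in the commit message: `fooImpl` is a hypothetical stand-in for the task's real work, and `foo.previous` relies on the `Seq[Path]` JSON format that this change adds to the `sbt` package object.

```scala
// build.sbt sketch (assumed imports; `fooImpl` is hypothetical).
import java.nio.file.Path
import sbt.nio.Keys._

def fooImpl(inputs: Seq[Path]): Seq[Path] = inputs // stand-in for real work

val foo = taskKey[Seq[Path]]("task cached on its *.scala inputs")
foo / fileInputs += baseDirectory.value.toGlob / ** / "*.scala"
foo := {
  foo.previous match {
    // No new, modified, or removed inputs: reuse the previous result.
    case Some(p) if (foo / changedFiles).value.isEmpty => p
    // Something changed: re-run over all current regular files.
    case _ => fooImpl((foo / allFiles).value)
  }
}
```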

View File

@ -0,0 +1 @@
### new bar

View File

@ -0,0 +1,7 @@
> fileInputTask
-> fileInputTask
$ copy-file changes/Bar.md base/Bar.md
> fileInputTask

View File

@ -0,0 +1 @@
### Bar

View File

@ -0,0 +1 @@
foo

View File

@ -0,0 +1,40 @@
import sbt.nio.Keys._
Global / fileInputs := Seq(
(baseDirectory.value / "base").toGlob / "*.md",
(baseDirectory.value / "base").toGlob / "*.txt",
)
val checkModified = taskKey[Unit]("check that modified files are returned")
checkModified := Def.taskDyn {
val changed = (Global / changedFiles).value
val modified = (Global / modifiedFiles).value
if (modified.sameElements(changed)) Def.task(assert(true))
else Def.task {
assert(modified != changed)
assert(modified == Seq((baseDirectory.value / "base" / "Bar.md").toPath))
}
}.value
val checkRemoved = taskKey[Unit]("check that removed files are returned")
checkRemoved := Def.taskDyn {
val files = (Global / allFiles).value
val removed = (Global / removedFiles).value
if (removed.isEmpty) Def.task(assert(true))
else Def.task {
assert(files == Seq((baseDirectory.value / "base" / "Foo.txt").toPath))
assert(removed == Seq((baseDirectory.value / "base" / "Bar.md").toPath))
}
}.value
val checkAdded = taskKey[Unit]("check that added files are returned")
checkAdded := Def.taskDyn {
val files = (Global / allFiles).value
val added = (Global / modifiedFiles).value
if (added.isEmpty || files.sameElements(added)) Def.task(assert(true))
else Def.task {
val base = baseDirectory.value / "base"
assert(files.sameElements(Seq("Bar.md", "Foo.txt").map(p => (base / p).toPath)))
assert(added == Seq((baseDirectory.value / "base" / "Bar.md").toPath))
}
}.value
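The three checks above lean on the relationship between the injected keys: per the commit message, `changedFiles` covers new, modified, and deleted files, while `modifiedFiles` and `removedFiles` each report a single category. A sketch of a task that simply logs all three, under that assumption:

```scala
// build.sbt sketch: log each category of input change since the last run.
import sbt.nio.Keys._

val reportChanges = taskKey[Unit]("log new/modified/removed inputs since the last run")
reportChanges / fileInputs += (baseDirectory.value / "base").toGlob / "*.md"
reportChanges := {
  val log = streams.value.log
  log.info(s"changed:  ${(reportChanges / changedFiles).value}")   // union of all three
  log.info(s"modified: ${(reportChanges / modifiedFiles).value}")  // stamp changed
  log.info(s"removed:  ${(reportChanges / removedFiles).value}")   // deleted since last run
}
```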

View File

@ -0,0 +1 @@
### Bar updated

View File

@ -0,0 +1 @@
### Bar

View File

@ -0,0 +1 @@
foo

View File

@ -0,0 +1,17 @@
> checkModified
$ copy-file changes/Bar.md base/Bar.md
> checkModified
> checkRemoved
$ delete base/Bar.md
> checkRemoved
> checkAdded
$ copy-file original/Bar.md base/Bar.md
> checkAdded

View File

@ -2,6 +2,7 @@ import java.nio.file._
import sbt.nio.Keys._
import sbt.nio.file._
import sbt.Keys._
// The project contains two files: { Foo.txt, Bar.md } in the subdirectory base/subdir/nested-subdir
@ -10,7 +11,7 @@ val foo = taskKey[Seq[File]]("Retrieve Foo.txt")
foo / fileInputs += baseDirectory.value ** "*.txt"
foo := (foo / fileInputs).value.all(fileTreeView.value).map(_._1.toFile)
foo := (foo / allPaths).value.map(_.toFile)
val checkFoo = taskKey[Unit]("Check that the Foo.txt file is retrieved")
@ -21,7 +22,7 @@ val bar = taskKey[Seq[File]]("Retrieve Bar.md")
bar / fileInputs += baseDirectory.value / "base/subdir/nested-subdir" * "*.md"
bar := (bar / fileInputs).value.all(fileTreeView.value).map(_._1.toFile)
bar := (bar / allPaths).value.map(_.toFile)
val checkBar = taskKey[Unit]("Check that the Bar.md file is retrieved")
@ -37,7 +38,7 @@ val checkAll = taskKey[Unit]("Check that the Bar.md file is retrieved")
checkAll := {
import sbt.dsl.LinterLevel.Ignore
val expected = Set("Foo.txt", "Bar.md").map(baseDirectory.value / "base/subdir/nested-subdir" / _)
val actual = (all / fileInputs).value.all(fileTreeView.value).filter(_._2.isRegularFile).map(_._1.toFile).toSet
val actual = (all / allFiles).value.map(_.toFile).toSet
assert(actual == expected)
}
@ -48,17 +49,6 @@ set / fileInputs ++= Seq(
baseDirectory.value / "base" / "subdir" / "nested-subdir" * -DirectoryFilter
)
val checkSet = taskKey[Unit]("Verify that redundant sources are handled")
checkSet := {
val redundant = (set / fileInputs).value.all(fileTreeView.value).map(_._1.toFile)
assert(redundant.size == 2)
val deduped = (set / fileInputs).value.toSet[Glob].all(fileTreeView.value).map(_._1.toFile)
val expected = Seq("Bar.md", "Foo.txt").map(baseDirectory.value / "base/subdir/nested-subdir" / _)
assert(deduped.sorted == expected)
}
val depth = taskKey[Seq[File]]("Specify redundant sources with limited depth")
val checkDepth = taskKey[Unit]("Check that the Bar.md file is retrieved")
@ -71,6 +61,6 @@ depth / fileInputs ++= {
checkDepth := {
val expected = Seq("Bar.md").map(baseDirectory.value / "base/subdir/nested-subdir" / _)
val actual = (depth / fileInputs).value.all(fileTreeView.value).map(_._1.toFile)
val actual = (depth / allFiles).value.map(_.toFile)
assert(actual == expected)
}

View File

@ -4,6 +4,4 @@
> checkAll
> checkSet
> checkDepth
> checkDepth

View File

@ -10,7 +10,7 @@ val allInputsExplicit = taskKey[Seq[File]]("")
val checkInputs = inputKey[Unit]("")
val checkInputsExplicit = inputKey[Unit]("")
allInputs := (Compile / unmanagedSources / fileInputs).value.all(fileTreeView.value).map(_._1.toFile)
allInputs := (Compile / unmanagedSources / allFiles).value.map(_.toFile)
checkInputs := {
val res = allInputs.value
@ -23,7 +23,7 @@ checkInputs := {
allInputsExplicit := {
val files = scala.collection.mutable.Set.empty[File]
val underlying = fileTreeView.value
val view = new FileTreeView[(Path, FileAttributes)] {
val view: FileTreeView[(Path, FileAttributes)] = new FileTreeView[(Path, FileAttributes)] {
override def list(path: Path): Seq[(Path, FileAttributes)] = {
val res = underlying.list(path)
files ++= res.map(_._1.toFile)
@ -31,7 +31,7 @@ allInputsExplicit := {
}
}
val include = (Compile / unmanagedSources / includeFilter).value
val _ = (Compile / unmanagedSources / fileInputs).value.all(view).map(_._1.toFile).toSet
view.list((Compile / unmanagedSources / fileInputs).value)
files.filter(include.accept).toSeq
}

View File

@ -0,0 +1 @@
val root = sbt.interproject.inputs.Build.root

View File

@ -0,0 +1,71 @@
package sbt
package interproject.inputs
import sbt.Keys._
import sbt.nio.Keys._
/**
* This test is for internal logic so it must be in the sbt package because it uses package
* private APIs.
*/
object Build {
import sbt.internal.TransitiveDynamicInputs._
val cached = settingKey[Unit]("")
val newInputs = settingKey[Unit]("")
val checkCompile = taskKey[Unit]("check compile inputs")
val checkRun = taskKey[Unit]("check runtime inputs")
val checkTest = taskKey[Unit]("check test inputs")
val root = (project in file(".")).settings(
Compile / cached / fileInputs := (Compile / unmanagedSources / fileInputs).value ++
(Compile / unmanagedResources / fileInputs).value,
Test / cached / fileInputs := (Test / unmanagedSources / fileInputs).value ++
(Test / unmanagedResources / fileInputs).value,
Compile / newInputs / fileInputs += baseDirectory.value * "*.sc",
Compile / unmanagedSources / fileInputs ++= (Compile / newInputs / fileInputs).value,
checkCompile := {
val actual = (Compile / compile / transitiveDynamicInputs).value.map(_.glob).toSet
val expected = ((Compile / cached / fileInputs).value ++
(Compile / newInputs / fileInputs).value).toSet
streams.value.log.debug(s"actual: $actual\nexpected:$expected")
if (actual != expected) {
val actualExtra = actual diff expected
val expectedExtra = expected diff actual
throw new IllegalStateException(
s"$actual did not equal $expected\n" +
s"${if (actualExtra.nonEmpty) s"Actual result had extra fields $actualExtra" else ""}" +
s"${if (expectedExtra.nonEmpty) s"Actual result was missing: $expectedExtra" else ""}")
}
},
checkRun := {
val actual = (Runtime / run / transitiveDynamicInputs).value.map(_.glob).toSet
// Runtime doesn't add any new inputs, but it should correctly find the Compile inputs via
// delegation.
val expected = ((Compile / cached / fileInputs).value ++
(Compile / newInputs / fileInputs).value).toSet
streams.value.log.debug(s"actual: $actual\nexpected:$expected")
if (actual != expected) {
val actualExtra = actual diff expected
val expectedExtra = expected diff actual
throw new IllegalStateException(
s"${if (actualExtra.nonEmpty) s"Actual result had extra fields: $actualExtra" else ""}" +
s"${if (expectedExtra.nonEmpty) s"Actual result was missing: $expectedExtra" else ""}")
}
},
checkTest := {
val actual = (Test / compile / transitiveDynamicInputs).value.map(_.glob).toSet
val expected = ((Test / cached / fileInputs).value ++
(Compile / newInputs / fileInputs).value ++ (Compile / cached / fileInputs).value).toSet
streams.value.log.debug(s"actual: $actual\nexpected:$expected")
if (actual != expected) {
val actualExtra = actual diff expected
val expectedExtra = expected diff actual
throw new IllegalStateException(
s"$actual did not equal $expected\n" +
s"${if (actualExtra.nonEmpty) s"Actual result had extra fields $actualExtra" else ""}" +
s"${if (expectedExtra.nonEmpty) s"Actual result was missing: $expectedExtra" else ""}")
}
}
)
}
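`checkCompile`, `checkRun`, and `checkTest` repeat the same symmetric-difference reporting; if this pattern grows, it could be factored into a helper along these lines (a sketch, not part of the commit):

```scala
// Sketch: shared assertion helper for the three check tasks above.
def checkSame[A](actual: Set[A], expected: Set[A]): Unit =
  if (actual != expected) {
    val extra = actual diff expected   // present but not expected
    val missing = expected diff actual // expected but not present
    throw new IllegalStateException(
      s"$actual did not equal $expected\n" +
        (if (extra.nonEmpty) s"Actual result had extra fields: $extra\n" else "") +
        (if (missing.nonEmpty) s"Actual result was missing: $missing" else "")
    )
  }
```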

View File

@ -0,0 +1 @@
### Bar

View File

@ -0,0 +1,18 @@
import sbt.nio.Keys._
val fileInputTask = taskKey[Unit]("task with file inputs")
fileInputTask / fileInputs += (baseDirectory.value / "base").toGlob / "*.md"
fileInputTask / fileStamper := sbt.nio.FileStamper.LastModified
fileInputTask := Def.taskDyn {
if ((fileInputTask / changedFiles).value.nonEmpty) Def.task(assert(true))
else Def.task(assert(false))
}.value
val setLastModified = taskKey[Unit]("Reset the last modified time")
setLastModified := {
val file = baseDirectory.value / "base" / "Bar.md"
IO.setModifiedTimeOrFalse(file, 1234567890L)
}
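This variant overrides the stamper so inputs are compared by last-modified time rather than by content hash (the implied default, given the scripted comment "this should succeed even though the contents didn't change"). The two strategies, as a sketch:

```scala
// build.sbt sketch: choosing a stamping strategy for a task's file inputs.
// Hash detects only content changes; LastModified also reacts to
// metadata-only updates such as `touch` (assumed default: Hash).
fileInputTask / fileStamper := sbt.nio.FileStamper.Hash
// fileInputTask / fileStamper := sbt.nio.FileStamper.LastModified
```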

View File

@ -0,0 +1 @@
### new bar

View File

@ -0,0 +1 @@
### new bar 2

View File

@ -0,0 +1,28 @@
> fileInputTask
-> fileInputTask
$ touch base/Bar.md
# this should succeed even though the contents didn't change
> fileInputTask
$ copy-file changes/Bar.md base/Bar.md
# the last modified should change due to the copy
> fileInputTask
> setLastModified
> fileInputTask
$ copy-file changes/Bar2.md base/Bar.md
> setLastModified
# this should fail even though we changed the file with a copy
-> fileInputTask
$ touch base/Bar.md
> fileInputTask

View File

@ -1,61 +0,0 @@
import sbt.internal.TransitiveGlobs._
import sbt.nio.Keys._
val cached = settingKey[Unit]("")
val newInputs = settingKey[Unit]("")
Compile / cached / fileInputs := (Compile / unmanagedSources / fileInputs).value ++
(Compile / unmanagedResources / fileInputs).value
Test / cached / fileInputs := (Test / unmanagedSources / fileInputs).value ++
(Test / unmanagedResources / fileInputs).value
Compile / newInputs / fileInputs += baseDirectory.value * "*.sc"
Compile / unmanagedSources / fileInputs ++= (Compile / newInputs / fileInputs).value
val checkCompile = taskKey[Unit]("check compile inputs")
checkCompile := {
val actual = (Compile / compile / transitiveInputs).value.toSet
val expected = ((Compile / cached / fileInputs).value ++ (Compile / newInputs / fileInputs).value).toSet
streams.value.log.debug(s"actual: $actual\nexpected:$expected")
if (actual != expected) {
val actualExtra = actual diff expected
val expectedExtra = expected diff actual
throw new IllegalStateException(
s"$actual did not equal $expected\n" +
s"${if (actualExtra.nonEmpty) s"Actual result had extra fields $actualExtra" else ""}" +
s"${if (expectedExtra.nonEmpty) s"Actual result was missing: $expectedExtra" else ""}")
}
}
val checkRun = taskKey[Unit]("check runtime inputs")
checkRun := {
val actual = (Runtime / run / transitiveInputs).value.toSet
// Runtime doesn't add any new inputs, but it should correctly find the Compile inputs via
// delegation.
val expected = ((Compile / cached / fileInputs).value ++ (Compile / newInputs / fileInputs).value).toSet
streams.value.log.debug(s"actual: $actual\nexpected:$expected")
if (actual != expected) {
val actualExtra = actual diff expected
val expectedExtra = expected diff actual
throw new IllegalStateException(
s"${if (actualExtra.nonEmpty) s"Actual result had extra fields: $actualExtra" else ""}" +
s"${if (expectedExtra.nonEmpty) s"Actual result was missing: $expectedExtra" else ""}")
}
}
val checkTest = taskKey[Unit]("check test inputs")
checkTest := {
val actual = (Test / compile / transitiveInputs).value.toSet
val expected = ((Test / cached / fileInputs).value ++ (Compile / newInputs / fileInputs).value ++
(Compile / cached / fileInputs).value).toSet
streams.value.log.debug(s"actual: $actual\nexpected:$expected")
if (actual != expected) {
val actualExtra = actual diff expected
val expectedExtra = expected diff actual
throw new IllegalStateException(
s"$actual did not equal $expected\n" +
s"${if (actualExtra.nonEmpty) s"Actual result had extra fields $actualExtra" else ""}" +
s"${if (expectedExtra.nonEmpty) s"Actual result was missing: $expectedExtra" else ""}")
}
}

View File

@ -10,4 +10,4 @@ checkStringValue := checkStringValueImpl.evaluated
setStringValue / watchTriggers := baseDirectory.value * "string.txt" :: Nil
watchOnEvent := { _ => _ => Watch.CancelWatch }
watchOnFileInputEvent := { (_, _) => Watch.CancelWatch }

View File

@ -32,10 +32,10 @@ object Build {
setStringValueImpl.evaluated
},
checkStringValue := checkStringValueImpl.evaluated,
watchOnEvent := { _ => _ => Watch.CancelWatch }
watchOnFileInputEvent := { (_, _) => Watch.CancelWatch }
)
lazy val bar = project.settings(fileInputs in setStringValue += baseDirectory.value * "foo.txt")
lazy val root = (project in file(".")).aggregate(foo, bar).settings(
watchOnEvent := { _ => _ => Watch.CancelWatch }
watchOnFileInputEvent := { (_, _) => Watch.CancelWatch }
)
}

View File

@ -1,5 +1,6 @@
package sbt.watch.task
import java.nio.file.Path
import sbt._
import Keys._
import sbt.nio.Keys._
@ -8,7 +9,7 @@ object Build {
val reloadFile = settingKey[File]("file to toggle whether or not to reload")
val setStringValue = taskKey[Unit]("set a global string to a value")
val checkStringValue = inputKey[Unit]("check the value of a global")
val foo = taskKey[Unit]("foo")
val foo = taskKey[Seq[Path]]("foo")
def setStringValueImpl: Def.Initialize[Task[Unit]] = Def.task {
val i = (setStringValue / fileInputs).value
val (stringFile, string) = ("foo.txt", "bar")
@ -22,23 +23,19 @@ object Build {
lazy val root = (project in file(".")).settings(
reloadFile := baseDirectory.value / "reload",
foo / fileInputs += baseDirectory.value * "foo.txt",
foo := (foo / allFiles).value,
setStringValue := Def.taskDyn {
// This hides foo / fileInputs from the input graph
Def.taskDyn {
val _ = (foo / fileInputs).value
.all(fileTreeView.value, sbt.internal.Continuous.dynamicInputs.value)
val inputs = foo.value
// By putting setStringValueImpl.value inside a Def.task, we ensure that
// (foo / fileInputs).value is registered with the file repository before modifying the file.
Def.task(setStringValueImpl.value)
if (inputs.isEmpty) Def.task(setStringValueImpl.value)
else Def.task(assert(false))
}
}.value,
checkStringValue := checkStringValueImpl.evaluated,
watchOnInputEvent := { (_, _) =>
Watch.CancelWatch
},
watchOnTriggerEvent := { (_, _) =>
Watch.CancelWatch
},
watchOnFileInputEvent := { (_, _) => Watch.CancelWatch },
watchTasks := Def.inputTask {
val prev = watchTasks.evaluated
new StateTransform(prev.state.fail)

View File

@ -1,11 +1,17 @@
package sbt.input.aggregation
package sbt
package input.aggregation
import sbt.Keys._
import sbt._
import sbt.internal.TransitiveGlobs._
import sbt.internal.DynamicInput
import sbt.internal.TransitiveDynamicInputs._
import sbt.nio.Keys._
import sbt.nio.file._
import sbt.nio.file.Glob
import java.nio.file.Paths
/**
* This test is for internal logic so it must be in the sbt package because it uses package
* private APIs.
*/
object Build {
val setStringValue = inputKey[Unit]("set a global string to a value")
val checkStringValue = inputKey[Unit]("check the value of a global")
@ -19,24 +25,21 @@ object Build {
val Seq(stringFile, string) = Def.spaceDelimited().parsed
assert(IO.read(file(stringFile)) == string)
}
def checkGlobsImpl: Def.Initialize[Task[Unit]] = Def.task {
val (globInputs, globTriggers) = (Compile / compile / transitiveGlobs).value
val inputs = (Compile / compile / transitiveInputs).value.toSet
val triggers = (Compile / compile / transitiveTriggers).value.toSet
assert(globInputs.toSet == inputs)
assert(globTriggers.toSet == triggers)
// This is a hack to exclude the default compile file inputs
def triggers(t: Seq[DynamicInput]): Seq[Glob] = t.collect {
case i if !i.glob.toString.contains("*") => i.glob
}
lazy val foo = project.settings(
setStringValue := {
val _ = (fileInputs in (bar, setStringValue)).value
setStringValueImpl.evaluated
},
checkStringValue := checkStringValueImpl.evaluated,
watchOnTriggerEvent := { (_, _) => Watch.CancelWatch },
watchOnInputEvent := { (_, _) => Watch.CancelWatch },
Compile / compile / watchOnStart := { _ => () => Watch.CancelWatch },
watchOnFileInputEvent := { (_, _) => Watch.CancelWatch },
Compile / compile / watchOnIteration := { _ => Watch.CancelWatch },
checkTriggers := {
val actual = (Compile / compile / transitiveTriggers).value.toSet
val actual = triggers((Compile / compile / transitiveDynamicInputs).value).toSet
val base = baseDirectory.value.getParentFile
// This checks that since foo depends on bar there is a transitive trigger generated
// for the "bar.txt" trigger added to bar / Compile / unmanagedResources (which is a
@ -46,20 +49,20 @@ object Build {
},
Test / test / watchTriggers += baseDirectory.value * "test.txt",
Test / checkTriggers := {
val testTriggers = (Test / test / transitiveTriggers).value.toSet
val testTriggers = triggers((Test / test / transitiveDynamicInputs).value).toSet
// This validates that since the "test.txt" trigger is only added to the Test / test task,
// that the Test / compile does not pick it up. Both of them pick up the triggers that
// are found in the test above for the compile configuration because of the transitive
// classpath dependency that is added in Defaults.internalDependencies.
val compileTriggers = (Test / compile / transitiveTriggers).value.toSet
val compileTriggers = triggers((Test / compile / transitiveDynamicInputs).value).toSet
val base = baseDirectory.value.getParentFile
val expected: Set[Glob] = Set(
base * "baz.txt", (base / "bar") * "bar.txt", (base / "foo") * "test.txt")
assert(testTriggers == expected)
assert((testTriggers - ((base / "foo") * "test.txt")) == compileTriggers)
},
checkGlobs := checkGlobsImpl.value
).dependsOn(bar)
lazy val bar = project.settings(
fileInputs in setStringValue += baseDirectory.value * "foo.txt",
setStringValue / watchTriggers += baseDirectory.value * "bar.txt",
@ -67,30 +70,28 @@ object Build {
Compile / unmanagedResources / watchTriggers += baseDirectory.value * "bar.txt",
checkTriggers := {
val base = baseDirectory.value.getParentFile
val actual = (Compile / compile / transitiveTriggers).value
val actual = triggers((Compile / compile / transitiveDynamicInputs).value).toSet
val expected: Set[Glob] = Set((base / "bar") * "bar.txt", base * "baz.txt")
assert(actual.toSet == expected)
assert(actual == expected)
},
// This trigger should not transitively propagate to any foo task
Test / unmanagedResources / watchTriggers += baseDirectory.value * "bar-test.txt",
Test / checkTriggers := {
val testTriggers = (Test / test / transitiveTriggers).value.toSet
val compileTriggers = (Test / compile / transitiveTriggers).value.toSet
val testTriggers = triggers((Test / test / transitiveDynamicInputs).value).toSet
val compileTriggers = triggers((Test / compile / transitiveDynamicInputs).value).toSet
val base = baseDirectory.value.getParentFile
val expected: Set[Glob] = Set(
base * "baz.txt", (base / "bar") * "bar.txt", (base / "bar") * "bar-test.txt")
assert(testTriggers == expected)
assert(testTriggers == compileTriggers)
},
checkGlobs := checkGlobsImpl.value
)
lazy val root = (project in file(".")).aggregate(foo, bar).settings(
watchOnEvent := { _ => _ => Watch.CancelWatch },
watchOnFileInputEvent := { (_, _) => Watch.CancelWatch },
checkTriggers := {
val actual = (Compile / compile / transitiveTriggers).value
val actual = triggers((Compile / compile / transitiveDynamicInputs).value)
val expected: Seq[Glob] = baseDirectory.value * "baz.txt" :: Nil
assert(actual == expected)
},
checkGlobs := checkGlobsImpl.value
)
}

View File

@ -2,8 +2,6 @@
> Test / checkTriggers
> checkGlobs
# do not set the project here to ensure the bar/bar.txt trigger is captured by aggregation
# also add random spaces and multiple commands to ensure the parser is sane.
> ~ setStringValue bar/bar.txt bar; root / setStringValue bar/bar.txt baz

View File

@ -8,6 +8,4 @@ setStringValue := setStringValueImpl.evaluated
checkStringValue := checkStringValueImpl.evaluated
watchOnTriggerEvent := { (_, _) => Watch.CancelWatch }
watchOnInputEvent := { (_, _) => Watch.CancelWatch }
watchOnMetaBuildEvent := { (_, _) => Watch.CancelWatch }
watchOnFileInputEvent := { (_, _) => Watch.CancelWatch }

View File

@ -1 +1 @@
watchOnStart := { _ => () => Watch.Reload }
watchOnIteration := { _ => Watch.Reload }

View File

@ -20,7 +20,7 @@ object Build {
setStringValue / watchTriggers += baseDirectory.value * "foo.txt",
setStringValue := setStringValueImpl.evaluated,
checkStringValue := checkStringValueImpl.evaluated,
watchOnTriggerEvent := { (_, _) => Watch.CancelWatch },
watchOnFileInputEvent := { (_, _) => Watch.CancelWatch },
watchTasks := Def.inputTask {
val prev = watchTasks.evaluated
new StateTransform(prev.state.fail)

View File

@ -22,6 +22,6 @@ object Build {
IO.touch(baseDirectory.value / "foo.txt", true)
Some("watching")
},
watchOnStart := { _ => () => Watch.CancelWatch }
watchOnIteration := { _ => Watch.CancelWatch }
)
}

View File

@ -24,7 +24,7 @@ object Build {
IO.touch(baseDirectory.value / "foo.txt", true)
Some("watching")
},
watchOnTriggerEvent := { (f, e) =>
watchOnFileInputEvent := { (_, _) =>
if (reloadFile.value.exists) Watch.CancelWatch else {
IO.touch(reloadFile.value, true)
Watch.Reload