Merge branch '1.x' of https://github.com/BennyHill/sbt into fix/3042

Conflicts:
	main/src/main/scala/sbt/internal/BuildDef.scala
	main/src/main/scala/sbt/internal/EvaluateConfigurations.scala
This commit is contained in:
Alistair Johnson 2018-04-04 18:30:56 +02:00
commit 77b536b25f
171 changed files with 3412 additions and 2469 deletions

.gitattributes vendored

@ -5,3 +5,7 @@
# to native line endings on checkout.
*.scala text
*.java text
# Exclude contraband generated files from diff (by default - you can see it if you want)
**/contraband-scala/**/* -diff merge=ours
**/contraband-scala/**/* linguist-generated=true


@ -16,8 +16,10 @@ matrix:
fast_finish: true
env:
global:
- secure: d3bu2KNwsVHwfhbGgO+gmRfDKBJhfICdCJFGWKf2w3Gv86AJZX9nuTYRxz0KtdvEHO5Xw8WTBZLPb2thSJqhw9OCm4J8TBAVqCP0ruUj4+aqBUFy4bVexQ6WKE6nWHs4JPzPk8c6uC1LG3hMuzlC8RGETXtL/n81Ef1u7NjyXjs=
matrix:
- SBT_CMD=";mimaReportBinaryIssues ;scalafmt::test ;test:scalafmt::test ;sbt:scalafmt::test ;headerCheck ;test:headerCheck ;test:compile ;mainSettingsProj/test ;safeUnitTests ;otherUnitTests"
- SBT_CMD=";mimaReportBinaryIssues ;scalafmt::test ;test:scalafmt::test ;sbt:scalafmt::test ;headerCheck ;test:headerCheck ;whitesourceCheckPolicies ;test:compile ;mainSettingsProj/test ;safeUnitTests ;otherUnitTests"
- SBT_CMD="scripted actions/*"
- SBT_CMD="scripted apiinfo/* compiler-project/* ivy-deps-management/*"
- SBT_CMD="scripted dependency-management/*1of4"
@ -46,5 +48,5 @@ script:
- sbt -J-XX:ReservedCodeCacheSize=128m -J-Xmx800M -J-Xms800M -J-server "$SBT_CMD"
before_cache:
- find $HOME/.ivy2 -name "ivydata-*.properties" -print -delete
- find $HOME/.sbt -name "*.lock" -print -delete
- find $HOME/.ivy2 -name "ivydata-*.properties" -delete
- find $HOME/.sbt -name "*.lock" -delete


@ -231,6 +231,160 @@ command. To run a single test, such as the test in
sbt "scripted project/global-plugin"
Profiling sbt
-------------
There are several ways to profile sbt. The new hotness in profiling is FlameGraph.
You first collect stack trace samples, then process them into an SVG graph.
See:
- [Using FlameGraphs To Illuminate The JVM by Nitsan Wakart](https://www.youtube.com/watch?v=ugRrFdda_JQ)
- [USENIX ATC '17: Visualizing Performance with Flame Graphs](https://www.youtube.com/watch?v=D53T1Ejig1Q)
### jvm-profiling-tools/async-profiler
The first one I recommend is async-profiler. This is available for macOS and Linux,
and works fairly well.
1. Download the installer from https://github.com/jvm-profiling-tools/async-profiler/releases/tag/v1.2
2. Make symbolic links to `build/` and `profiler.sh` in `$HOME/bin`, assuming `$HOME/bin` is on your `PATH`:
`ln -s ~/Applications/async-profiler/profiler.sh $HOME/bin/profiler.sh`
`ln -s ~/Applications/async-profiler/build $HOME/bin/build`
Next, close all Java applications and anything else that may affect the profiling, and run sbt in one terminal:
```
$ sbt exit
```
In another terminal, run:
```
$ jps
92746 sbt-launch.jar
92780 Jps
```
This tells you the process ID of sbt. In this case, it's 92746. While it's running, run
```
$ profiler.sh -d 60 <process id>
Started [cpu] profiling
--- Execution profile ---
Total samples: 31602
Non-Java: 3239 (10.25%)
GC active: 46 (0.15%)
Unknown (native): 14667 (46.41%)
Not walkable (native): 3 (0.01%)
Unknown (Java): 433 (1.37%)
Not walkable (Java): 8 (0.03%)
Thread exit: 1 (0.00%)
Deopt: 9 (0.03%)
Frame buffer usage: 55.658%
Total: 1932000000 (6.11%) samples: 1932
[ 0] java.lang.ClassLoader$NativeLibrary.load
[ 1] java.lang.ClassLoader.loadLibrary0
[ 2] java.lang.ClassLoader.loadLibrary
[ 3] java.lang.Runtime.loadLibrary0
[ 4] java.lang.System.loadLibrary
....
```
This should show a bunch of useful stack traces.
To visualize this as a flamegraph, run:
```
$ profiler.sh -d 60 -f /tmp/flamegraph.svg <process id>
```
This should produce `/tmp/flamegraph.svg` at the end.
![flamegraph](project/flamegraph_svg.png)
See https://gist.github.com/eed3si9n/82d43acc95a002876d357bd8ad5f40d5
### Running sbt with standby
One of the tricky things about profiling is figuring out the process ID
when you want to profile the very beginning of the application.
For this purpose, we've added `sbt.launcher.standby` JVM flag.
In the next version of sbt, you should be able to run:
```
$ sbt -J-Dsbt.launcher.standby=20s exit
```
This will count down for 20s before doing anything else.
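The text above only names the `sbt.launcher.standby` property; as an illustration, a countdown honoring such a flag could look like the following sketch (the parsing and output here are assumptions, not sbt's actual implementation):

```scala
// Hypothetical sketch of a standby countdown keyed off a JVM system
// property. The property name comes from the text above; the "Ns"
// parsing and the printed messages are illustrative assumptions.
object Standby {
  // Parse values like "20s" into a number of seconds.
  def parseSeconds(value: String): Option[Int] =
    if (value.endsWith("s")) scala.util.Try(value.dropRight(1).toInt).toOption
    else None

  // Count down before doing anything else, if the property is set.
  def countdown(): Unit =
    sys.props.get("sbt.launcher.standby").flatMap(parseSeconds).foreach { secs =>
      (secs to 1 by -1).foreach { n =>
        println(s"standing by: $n")
        Thread.sleep(1000L)
      }
    }
}
```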
### jvm-profiling-tools/perf-map-agent
If you want a mixed Java/native flamegraph, you can try perf-map-agent.
This uses `dtrace` on macOS and `perf` on Linux.
You first have to compile https://github.com/jvm-profiling-tools/perf-map-agent.
For macOS, here is how to export `JAVA_HOME` before running `cmake .`:
```
$ export JAVA_HOME=$(/usr/libexec/java_home)
$ cmake .
-- The C compiler identification is AppleClang 9.0.0.9000039
-- The CXX compiler identification is AppleClang 9.0.0.9000039
...
$ make
```
In addition, you have to `git clone` https://github.com/brendangregg/FlameGraph.
In a fresh terminal, run sbt with the `-XX:+PreserveFramePointer` flag:
```
$ sbt -J-Dsbt.launcher.standby=20s -J-XX:+PreserveFramePointer exit
```
In the terminal where you will run perf-map-agent:
```
$ cd quicktest/
$ export JAVA_HOME=$(/usr/libexec/java_home)
$ export FLAMEGRAPH_DIR=$HOME/work/FlameGraph
$ jps
94592 Jps
94549 sbt-launch.jar
$ $HOME/work/perf-map-agent/bin/dtrace-java-flames 94549
dtrace: system integrity protection is on, some features will not be available
dtrace: description 'profile-99 ' matched 2 probes
Flame graph SVG written to DTRACE_FLAME_OUTPUT='/Users/xxx/work/quicktest/flamegraph-94549.svg'.
```
In theory this should produce a better flamegraph, but the output looks too messy for the `sbt exit` case.
See https://gist.github.com/eed3si9n/b5856ff3d987655513380d1a551aa0df
This might be because it assumes that the operations are already JITed.
### ktoso/sbt-jmh
https://github.com/ktoso/sbt-jmh
Due to JIT warmup etc., benchmarking is difficult. JMH runs the same tests multiple times to
remove these effects and comes closer to measuring the performance of your code.
There's also an integration with jvm-profiling-tools/async-profiler, apparently.
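To see why JMH's repeated runs matter, here is a crude hand-rolled timing loop that makes JIT warmup visible (early iterations are typically slower). This is only an illustration; JMH does this properly, adding forking, dead-code elimination defenses, and statistics:

```scala
// Crude illustration of JIT warmup: time the same workload repeatedly
// and watch early runs come out slower. Not a substitute for JMH.
object WarmupDemo {
  // A small, deterministic workload: sum the integers 0 until 1000000.
  def workload(): Long = {
    var acc = 0L
    var i = 0
    while (i < 1000000) { acc += i; i += 1 }
    acc
  }

  def timeOnce(): Long = {
    val start = System.nanoTime()
    workload()
    System.nanoTime() - start
  }

  def main(args: Array[String]): Unit =
    (1 to 10).foreach(n => println(s"run $n: ${timeOnce()} ns"))
}
```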
### VisualVM
I'd also mention a traditional JVM profiling tool. Since VisualVM is open source,
I'll mention this one: https://visualvm.github.io/
1. First, start VisualVM.
2. Start sbt from a terminal.
3. You should see `xsbt.boot.Boot` under Local.
4. Open it, select either the sampler or the profiler, and hit the CPU button at the point where you want to start.
If you are familiar with YourKit, it also works similarly.
Other notes for maintainers
---------------------------
@ -248,3 +402,12 @@ cd vscode-sbt-scala/client
$ vsce package
$ vsce publish
```
## Signing the CLA
Contributing to sbt requires you or your employer to sign the
[Lightbend Contributor License Agreement](https://www.lightbend.com/contribute/cla).
To make it easier to respect our license agreements, we have added an sbt task
that takes care of adding the LICENSE headers to new files. Run `headerCreate`
and sbt will put a copyright notice into each new file.


@ -21,10 +21,10 @@ sbt is a build tool for Scala, Java, and more.
For general documentation, see http://www.scala-sbt.org/.
sbt 1.0.x
sbt 1.x
---------
This is the 1.0.x series of sbt. The source code of sbt is split across
This is the 1.x series of sbt. The source code of sbt is split across
several Github repositories, including this one.
- [sbt/io][sbt/io] hosts `sbt.io` module.

build.sbt

@ -9,7 +9,7 @@ def buildLevelSettings: Seq[Setting[_]] =
inThisBuild(
Seq(
organization := "org.scala-sbt",
version := "1.1.3-SNAPSHOT",
version := "1.2.0-SNAPSHOT",
description := "sbt is an interactive build tool",
bintrayOrganization := Some("sbt"),
bintrayRepository := {
@ -24,10 +24,12 @@ def buildLevelSettings: Seq[Setting[_]] =
Developer("eed3si9n", "Eugene Yokota", "@eed3si9n", url("https://github.com/eed3si9n")),
Developer("jsuereth", "Josh Suereth", "@jsuereth", url("https://github.com/jsuereth")),
Developer("dwijnand", "Dale Wijnand", "@dwijnand", url("https://github.com/dwijnand")),
Developer("gkossakowski",
"Grzegorz Kossakowski",
"@gkossakowski",
url("https://github.com/gkossakowski")),
Developer(
"gkossakowski",
"Grzegorz Kossakowski",
"@gkossakowski",
url("https://github.com/gkossakowski")
),
Developer("Duhemm", "Martin Duhem", "@Duhemm", url("https://github.com/Duhemm"))
),
homepage := Some(url("https://github.com/sbt/sbt")),
@ -38,32 +40,31 @@ def buildLevelSettings: Seq[Setting[_]] =
scalafmtVersion := "1.3.0",
))
def commonSettings: Seq[Setting[_]] =
Seq[SettingsDefinition](
headerLicense := Some(HeaderLicense.Custom(
"""|sbt
|Copyright 2011 - 2017, Lightbend, Inc.
|Copyright 2008 - 2010, Mark Harrah
|Licensed under BSD-3-Clause license (see LICENSE)
|""".stripMargin
)),
scalaVersion := baseScalaVersion,
componentID := None,
resolvers += Resolver.typesafeIvyRepo("releases"),
resolvers += Resolver.sonatypeRepo("snapshots"),
resolvers += "bintray-sbt-maven-releases" at "https://dl.bintray.com/sbt/maven-releases/",
addCompilerPlugin("org.spire-math" % "kind-projector" % "0.9.4" cross CrossVersion.binary),
concurrentRestrictions in Global += Util.testExclusiveRestriction,
testOptions in Test += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"),
testOptions in Test += Tests.Argument(TestFrameworks.ScalaCheck, "-verbosity", "2"),
javacOptions in compile ++= Seq("-Xlint", "-Xlint:-serial"),
crossScalaVersions := Seq(baseScalaVersion),
bintrayPackage := (bintrayPackage in ThisBuild).value,
bintrayRepository := (bintrayRepository in ThisBuild).value,
publishArtifact in Test := false,
fork in compile := true,
fork in run := true
) flatMap (_.settings)
def commonSettings: Seq[Setting[_]] = Def.settings(
headerLicense := Some(HeaderLicense.Custom(
"""|sbt
|Copyright 2011 - 2017, Lightbend, Inc.
|Copyright 2008 - 2010, Mark Harrah
|Licensed under BSD-3-Clause license (see LICENSE)
|""".stripMargin
)),
scalaVersion := baseScalaVersion,
componentID := None,
resolvers += Resolver.typesafeIvyRepo("releases"),
resolvers += Resolver.sonatypeRepo("snapshots"),
resolvers += "bintray-sbt-maven-releases" at "https://dl.bintray.com/sbt/maven-releases/",
addCompilerPlugin("org.spire-math" % "kind-projector" % "0.9.4" cross CrossVersion.binary),
concurrentRestrictions in Global += Util.testExclusiveRestriction,
testOptions in Test += Tests.Argument(TestFrameworks.ScalaCheck, "-w", "1"),
testOptions in Test += Tests.Argument(TestFrameworks.ScalaCheck, "-verbosity", "2"),
javacOptions in compile ++= Seq("-Xlint", "-Xlint:-serial"),
crossScalaVersions := Seq(baseScalaVersion),
bintrayPackage := (bintrayPackage in ThisBuild).value,
bintrayRepository := (bintrayRepository in ThisBuild).value,
publishArtifact in Test := false,
fork in compile := true,
fork in run := true
)
def minimalSettings: Seq[Setting[_]] =
commonSettings ++ customCommands ++
@ -83,7 +84,14 @@ val mimaSettings = Def settings (
).map { v =>
organization.value % moduleName.value % v cross (if (crossPaths.value) CrossVersion.binary else CrossVersion.disabled)
}.toSet
}
},
mimaBinaryIssueFilters ++= Seq(
// Changes in the internal package
exclude[DirectMissingMethodProblem]("sbt.internal.*"),
exclude[FinalClassProblem]("sbt.internal.*"),
exclude[FinalMethodProblem]("sbt.internal.*"),
exclude[IncompatibleResultTypeProblem]("sbt.internal.*"),
),
)
lazy val sbtRoot: Project = (project in file("."))
@ -163,6 +171,11 @@ val collectionProj = (project in file("internal") / "util-collection")
exclude[MissingClassProblem]("sbt.internal.util.Fn1"),
exclude[DirectMissingMethodProblem]("sbt.internal.util.TypeFunctions.toFn1"),
exclude[DirectMissingMethodProblem]("sbt.internal.util.Types.toFn1"),
// Instead of defining foldr in KList & overriding in KCons,
// it's now abstract in KList and defined in both KCons & KNil.
exclude[FinalMethodProblem]("sbt.internal.util.KNil.foldr"),
exclude[DirectAbstractMethodProblem]("sbt.internal.util.KList.foldr"),
),
)
.configure(addSbtUtilPosition)
@ -175,6 +188,8 @@ val completeProj = (project in file("internal") / "util-complete")
name := "Completion",
libraryDependencies += jline,
mimaSettings,
mimaBinaryIssueFilters ++= Seq(
),
)
.configure(addSbtIO, addSbtUtilControl)
@ -204,6 +219,10 @@ lazy val testingProj = (project in file("testing"))
contrabandFormatsForType in generateContrabands in Compile := ContrabandConfig.getFormats,
mimaSettings,
mimaBinaryIssueFilters ++= Seq(
// private[sbt]
exclude[IncompatibleMethTypeProblem]("sbt.TestStatus.write"),
exclude[IncompatibleResultTypeProblem]("sbt.TestStatus.read"),
// copy method was never meant to be public
exclude[DirectMissingMethodProblem]("sbt.protocol.testing.EndTestGroupErrorEvent.copy"),
exclude[DirectMissingMethodProblem]("sbt.protocol.testing.EndTestGroupErrorEvent.copy$default$*"),
@ -283,22 +302,45 @@ lazy val runProj = (project in file("run"))
)
.configure(addSbtIO, addSbtUtilLogging, addSbtCompilerClasspath)
val sbtProjDepsCompileScopeFilter =
ScopeFilter(inDependencies(LocalProject("sbtProj"), includeRoot = false), inConfigurations(Compile))
lazy val scriptedSbtProj = (project in scriptedPath / "sbt")
.dependsOn(commandProj)
.settings(
baseSettings,
name := "Scripted sbt",
libraryDependencies ++= Seq(launcherInterface % "provided"),
resourceGenerators in Compile += Def task {
val mainClassDir = (classDirectory in Compile in LocalProject("sbtProj")).value
val testClassDir = (classDirectory in Test in LocalProject("sbtProj")).value
val classDirs = (classDirectory all sbtProjDepsCompileScopeFilter).value
val extDepsCp = (externalDependencyClasspath in Compile in LocalProject("sbtProj")).value
val cpStrings = (mainClassDir +: testClassDir +: classDirs) ++ extDepsCp.files map (_.toString)
val file = (resourceManaged in Compile).value / "RunFromSource.classpath"
IO.writeLines(file, cpStrings)
List(file)
},
mimaSettings,
mimaBinaryIssueFilters ++= Seq(
// sbt.test package is renamed to sbt.scriptedtest.
exclude[MissingClassProblem]("sbt.test.*"),
),
)
.configure(addSbtIO, addSbtUtilLogging, addSbtCompilerInterface, addSbtUtilScripted, addSbtLmCore)
lazy val scriptedPluginProj = (project in scriptedPath / "plugin")
.dependsOn(sbtProj)
.dependsOn(mainProj)
.settings(
baseSettings,
name := "Scripted Plugin",
mimaSettings,
mimaBinaryIssueFilters ++= Seq(
// scripted plugin has moved into sbt mothership.
exclude[MissingClassProblem]("sbt.ScriptedPlugin*")
),
)
.configure(addSbtCompilerClasspath)
@ -310,6 +352,12 @@ lazy val actionsProj = (project in file("main-actions"))
name := "Actions",
libraryDependencies += sjsonNewScalaJson.value,
mimaSettings,
mimaBinaryIssueFilters ++= Seq(
// Removed unused private[sbt] nested class
exclude[MissingClassProblem]("sbt.Doc$Scaladoc"),
// Removed no longer used private[sbt] method
exclude[DirectMissingMethodProblem]("sbt.Doc.generate"),
),
)
.configure(
addSbtIO,
@ -329,6 +377,8 @@ lazy val protocolProj = (project in file("protocol"))
.dependsOn(collectionProj)
.settings(
testedBaseSettings,
scalacOptions -= "-Ywarn-unused",
scalacOptions += "-Xlint:-unused",
name := "Protocol",
libraryDependencies ++= Seq(sjsonNewScalaJson.value, ipcSocket),
managedSourceDirectories in Compile +=
@ -391,6 +441,9 @@ lazy val commandProj = (project in file("main-command"))
exclude[MissingClassProblem]("sbt.internal.NG*"),
exclude[MissingClassProblem]("sbt.internal.ReferenceCountedFileDescriptor"),
// made private[sbt] method private[this]
exclude[DirectMissingMethodProblem]("sbt.State.handleException"),
// copy method was never meant to be public
exclude[DirectMissingMethodProblem]("sbt.CommandSource.copy"),
exclude[DirectMissingMethodProblem]("sbt.CommandSource.copy$default$*"),
@ -415,7 +468,7 @@ lazy val commandProj = (project in file("main-command"))
lazy val coreMacrosProj = (project in file("core-macros"))
.dependsOn(collectionProj)
.settings(
commonSettings,
baseSettings,
name := "Core Macros",
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value,
mimaSettings,
@ -454,7 +507,7 @@ lazy val mainSettingsProj = (project in file("main-settings"))
// The main integration project for sbt. It brings all of the projects together, configures them, and provides for overriding conventions.
lazy val mainProj = (project in file("main"))
.enablePlugins(ContrabandPlugin)
.dependsOn(logicProj, actionsProj, mainSettingsProj, runProj, commandProj, collectionProj)
.dependsOn(logicProj, actionsProj, mainSettingsProj, runProj, commandProj, collectionProj, scriptedSbtProj)
.settings(
testedBaseSettings,
name := "Main",
@ -464,17 +517,14 @@ lazy val mainProj = (project in file("main"))
sourceManaged in (Compile, generateContrabands) := baseDirectory.value / "src" / "main" / "contraband-scala",
mimaSettings,
mimaBinaryIssueFilters ++= Vector(
// Changed the signature of NetworkChannel ctor. internal.
exclude[DirectMissingMethodProblem]("sbt.internal.server.NetworkChannel.*"),
// ctor for ConfigIndex. internal.
exclude[DirectMissingMethodProblem]("sbt.internal.ConfigIndex.*"),
// New and changed methods on KeyIndex. internal.
exclude[ReversedMissingMethodProblem]("sbt.internal.KeyIndex.*"),
exclude[DirectMissingMethodProblem]("sbt.internal.KeyIndex.*"),
// Removed unused val. internal.
exclude[DirectMissingMethodProblem]("sbt.internal.RelayAppender.jsonFormat"),
// Removed unused def. internal.
exclude[DirectMissingMethodProblem]("sbt.internal.Load.isProjectThis"),
// Changed signature or removed private[sbt] methods
exclude[DirectMissingMethodProblem]("sbt.Classpaths.unmanagedLibs0"),
exclude[DirectMissingMethodProblem]("sbt.Defaults.allTestGroupsTask"),
exclude[DirectMissingMethodProblem]("sbt.Plugins.topologicalSort"),
exclude[IncompatibleMethTypeProblem]("sbt.Defaults.allTestGroupsTask"),
)
)
.configure(
@ -502,6 +552,7 @@ lazy val sbtProj = (project in file("sbt"))
mimaBinaryIssueFilters ++= sbtIgnoredProblems,
BuildInfoPlugin.buildInfoDefaultSettings,
addBuildInfoToConfig(Test),
BuildInfoPlugin.buildInfoDefaultSettings,
buildInfoObject in Test := "TestBuildInfo",
buildInfoKeys in Test := Seq[BuildInfoKey](
// WORKAROUND https://github.com/sbt/sbt-buildinfo/issues/117
@ -510,6 +561,7 @@ lazy val sbtProj = (project in file("sbt"))
connectInput in run in Test := true,
outputStrategy in run in Test := Some(StdoutOutput),
fork in Test := true,
parallelExecution in Test := false,
)
.configure(addSbtCompilerBridge)
@ -575,35 +627,29 @@ lazy val vscodePlugin = (project in file("vscode-sbt-scala"))
)
def scriptedTask: Def.Initialize[InputTask[Unit]] = Def.inputTask {
val result = scriptedSource(dir => (s: State) => Scripted.scriptedParser(dir)).parsed
// publishLocalBinAll.value // TODO: Restore scripted needing only binary jars.
publishAll.value
// These two projects need to be visible in a repo even if the default
// local repository is hidden, so we publish them to an alternate location and add
// that alternate repo to the running scripted test (in Scripted.scriptedpreScripted).
// (altLocalPublish in interfaceProj).value
// (altLocalPublish in compileInterfaceProj).value
(sbtProj / Test / compile).value // make sure sbt.RunFromSourceMain is compiled
Scripted.doScripted(
(sbtLaunchJar in bundledLauncherProj).value,
(fullClasspath in scriptedSbtProj in Test).value,
(scalaInstance in scriptedSbtProj).value,
scriptedSource.value,
scriptedBufferLog.value,
result,
Def.setting(Scripted.scriptedParser(scriptedSource.value)).parsed,
scriptedPrescripted.value,
scriptedLaunchOpts.value
)
}
def scriptedUnpublishedTask: Def.Initialize[InputTask[Unit]] = Def.inputTask {
val result = scriptedSource(dir => (s: State) => Scripted.scriptedParser(dir)).parsed
Scripted.doScripted(
(sbtLaunchJar in bundledLauncherProj).value,
(fullClasspath in scriptedSbtProj in Test).value,
(scalaInstance in scriptedSbtProj).value,
scriptedSource.value,
scriptedBufferLog.value,
result,
Def.setting(Scripted.scriptedParser(scriptedSource.value)).parsed,
scriptedPrescripted.value,
scriptedLaunchOpts.value
)
@ -640,14 +686,12 @@ def otherRootSettings =
scripted := scriptedTask.evaluated,
scriptedUnpublished := scriptedUnpublishedTask.evaluated,
scriptedSource := (sourceDirectory in sbtProj).value / "sbt-test",
// scriptedPrescripted := { addSbtAlternateResolver _ },
scriptedLaunchOpts := List("-Xmx1500M", "-Xms512M", "-server"),
publishAll := { val _ = (publishLocal).all(ScopeFilter(inAnyProject)).value },
publishLocalBinAll := { val _ = (publishLocalBin).all(ScopeFilter(inAnyProject)).value },
aggregate in bintrayRelease := false
) ++ inConfig(Scripted.RepoOverrideTest)(
Seq(
scriptedPrescripted := (_ => ()),
scriptedLaunchOpts := List(
"-Xmx1500M",
"-Xms512M",
@ -660,23 +704,6 @@ def otherRootSettings =
scriptedSource := (sourceDirectory in sbtProj).value / "repo-override-test"
))
// def addSbtAlternateResolver(scriptedRoot: File) = {
// val resolver = scriptedRoot / "project" / "AddResolverPlugin.scala"
// if (!resolver.exists) {
// IO.write(resolver, s"""import sbt._
// |import Keys._
// |
// |object AddResolverPlugin extends AutoPlugin {
// | override def requires = sbt.plugins.JvmPlugin
// | override def trigger = allRequirements
// |
// | override lazy val projectSettings = Seq(resolvers += alternativeLocalResolver)
// | lazy val alternativeLocalResolver = Resolver.file("$altLocalRepoName", file("$altLocalRepoPath"))(Resolver.ivyStylePatterns)
// |}
// |""".stripMargin)
// }
// }
lazy val docProjects: ScopeFilter = ScopeFilter(
inAnyProject -- inProjects(sbtRoot, sbtProj, scriptedSbtProj, scriptedPluginProj),
inConfigurations(Compile)
@ -772,3 +799,12 @@ def customCommands: Seq[Setting[_]] = Seq(
state
}
)
inThisBuild(Seq(
whitesourceProduct := "Lightbend Reactive Platform",
whitesourceAggregateProjectName := "sbt-master",
whitesourceAggregateProjectToken := "e7a1e55518c0489a98e9c7430c8b2ccd53d9f97c12ed46148b592ebe4c8bf128",
whitesourceIgnoredScopes ++= Seq("plugin", "scalafmt", "sxr"),
whitesourceFailOnError := sys.env.contains("WHITESOURCE_PASSWORD"), // fail if pwd is present
whitesourceForceCheckAllDependencies := true,
))


@ -33,9 +33,9 @@ object ContextUtil {
f: (c.Expr[Any], c.Position) => c.Expr[T]): c.Expr[T] = {
import c.universe._
c.macroApplication match {
case s @ Select(Apply(_, t :: Nil), tp) => f(c.Expr[Any](t), s.pos)
case a @ Apply(_, t :: Nil) => f(c.Expr[Any](t), a.pos)
case x => unexpectedTree(x)
case s @ Select(Apply(_, t :: Nil), _) => f(c.Expr[Any](t), s.pos)
case a @ Apply(_, t :: Nil) => f(c.Expr[Any](t), a.pos)
case x => unexpectedTree(x)
}
}


@ -31,7 +31,8 @@ sealed trait AttributeKey[T] {
def description: Option[String]
/**
* In environments that support delegation, looking up this key when it has no associated value will delegate to the values associated with these keys.
* In environments that support delegation, looking up this key when it has no associated value
* will delegate to the values associated with these keys.
* The delegation proceeds in order the keys are returned here.
*/
def extend: Seq[AttributeKey[_]]


@ -7,6 +7,8 @@
package sbt.internal.util
import scala.collection.JavaConverters._
/** A mutable set interface that uses object identity to test for set membership.*/
trait IDSet[T] {
def apply(t: T): Boolean
@ -41,7 +43,7 @@ object IDSet {
def +=(t: T) = { backing.put(t, Dummy); () }
def ++=(t: Iterable[T]) = t foreach +=
def -=(t: T) = if (backing.remove(t) eq null) false else true
def all = collection.JavaConverters.collectionAsScalaIterable(backing.keySet)
def all = backing.keySet.asScala
def toList = all.toList
def isEmpty = backing.isEmpty
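The comment above says IDSet "uses object identity to test for set membership"; here is a small sketch of what that means in practice, using the `IdentityHashMap` that the diff shows as the backing store:

```scala
// What identity-based membership means: an IdentityHashMap distinguishes
// two equal-but-distinct objects, unlike a regular HashSet.
import java.util.IdentityHashMap

val backing = new IdentityHashMap[String, AnyRef]
val dummy = new AnyRef
val a = new String("x")
val b = new String("x") // equal to `a`, but a different object
backing.put(a, dummy)
assert(backing.containsKey(a))  // found: same object
assert(!backing.containsKey(b)) // not found: equal but not identical
```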


@ -10,7 +10,7 @@ package sbt.internal.util
import Types._
import Classes.Applicative
/** Heterogeneous list with each element having type M[T] for some type T.*/
/** A higher-kinded heterogeneous list of elements that share the same type constructor `M[_]`. */
sealed trait KList[+M[_]] {
type Transform[N[_]] <: KList[N]
@ -18,7 +18,7 @@ sealed trait KList[+M[_]] {
def transform[N[_]](f: M ~> N): Transform[N]
/** Folds this list using a function that operates on the homogeneous type of the elements of this list. */
def foldr[B](f: (M[_], B) => B, init: B): B = init // had trouble defining it in KNil
def foldr[B](f: (M[_], B) => B, init: B): B
/** Applies `f` to the elements of this list in the applicative functor defined by `ap`. */
def apply[N[x] >: M[x], Z](f: Transform[Id] => Z)(implicit ap: Applicative[N]): N[Z]
@ -54,13 +54,14 @@ final case class KCons[H, +T <: KList[M], +M[_]](head: M[H], tail: T) extends KL
override def foldr[B](f: (M[_], B) => B, init: B): B = f(head, tail.foldr(f, init))
}
sealed abstract class KNil extends KList[Nothing] {
sealed abstract class KNil extends KList[NothingK] {
final type Transform[N[_]] = KNil
final def transform[N[_]](f: Nothing ~> N): Transform[N] = KNil
final def transform[N[_]](f: NothingK ~> N): Transform[N] = KNil
final def foldr[B](f: (NothingK[_], B) => B, init: B): B = init
final def toList = Nil
final def apply[N[x], Z](f: KNil => Z)(implicit ap: Applicative[N]): N[Z] = ap.pure(f(KNil))
final def traverse[N[_], P[_]](f: Nothing ~> (N P)#l)(implicit np: Applicative[N]): N[KNil] =
final def traverse[N[_], P[_]](f: NothingK ~> (N P)#l)(implicit np: Applicative[N]): N[KNil] =
np.pure(KNil)
}
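The `KNil` change above matches the build.sbt comment about `foldr`: it becomes abstract in the base trait and is defined in both cases, instead of being defaulted in the trait and overridden in the cons. A simplified, non-higher-kinded sketch of the same shape (the names `KL`, `KCons0`, `KNil0` are invented for illustration):

```scala
// foldr is abstract in the sealed trait and defined in both cases,
// instead of defaulted in the trait and overridden in the cons cell.
sealed trait KL[+A] {
  def foldr[B](f: (A, B) => B, init: B): B
}
final case class KCons0[+A](head: A, tail: KL[A]) extends KL[A] {
  def foldr[B](f: (A, B) => B, init: B): B = f(head, tail.foldr(f, init))
}
case object KNil0 extends KL[Nothing] {
  def foldr[B](f: (Nothing, B) => B, init: B): B = init
}
```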


@ -65,7 +65,7 @@ object Signals {
}
// Must only be referenced using a
// try { } catch { case e: LinkageError => ... }
// try { } catch { case _: LinkageError => ... }
// block to
private final class Signals0 {
def supported(signal: String): Boolean = {


@ -9,6 +9,7 @@ package sbt.internal.util
trait TypeFunctions {
type Id[X] = X
type NothingK[X] = Nothing
sealed trait Const[A] { type Apply[B] = A }
sealed trait ConstK[A] { type l[L[x]] = A }
sealed trait Compose[A[_], B[_]] { type Apply[T] = A[B[T]] }


@ -7,8 +7,7 @@
package sbt.internal.util
import org.scalacheck._
import Prop._
import org.scalacheck._, Prop._
object SettingsTest extends Properties("settings") {
val settingsExample: SettingsExample = SettingsExample()
@ -160,7 +159,7 @@ object SettingsTest extends Properties("settings") {
final def checkCircularReferences(intermediate: Int): Prop = {
val ccr = new CCR(intermediate)
try { evaluate(setting(chk, ccr.top) :: Nil); false } catch {
case e: java.lang.Exception => true
case _: java.lang.Exception => true
}
}
@ -197,18 +196,18 @@ object SettingsTest extends Properties("settings") {
def evaluate(settings: Seq[Setting[_]]): Settings[Scope] =
try { make(settings)(delegates, scopeLocal, showFullKey) } catch {
case e: Throwable => e.printStackTrace; throw e
case e: Throwable => e.printStackTrace(); throw e
}
}
// This setup is a workaround for module synchronization issues
final class CCR(intermediate: Int) {
import SettingsTest.settingsExample._
lazy val top = iterate(value(intermediate), intermediate)
def iterate(init: Initialize[Int], i: Int): Initialize[Int] =
lazy val top = iterate(value(intermediate))
def iterate(init: Initialize[Int]): Initialize[Int] =
bind(init) { t =>
if (t <= 0)
top
else
iterate(value(t - 1), t - 1)
iterate(value(t - 1))
}
}


@ -132,7 +132,7 @@ private[sbt] object JLine {
def createReader(): ConsoleReader = createReader(None, JLine.makeInputStream(true))
def createReader(historyPath: Option[File], in: InputStream): ConsoleReader =
usingTerminal { t =>
usingTerminal { _ =>
val cr = new ConsoleReader(in, System.out)
cr.setExpandEvents(false) // https://issues.scala-lang.org/browse/SI-7650
cr.setBellEnabled(false)


@ -10,7 +10,7 @@ package complete
import java.lang.Character.{ toLowerCase => lower }
/** @author Paul Phillips*/
/** @author Paul Phillips */
object EditDistance {
/**
@ -24,7 +24,6 @@ object EditDistance {
insertCost: Int = 1,
deleteCost: Int = 1,
subCost: Int = 1,
transposeCost: Int = 1,
matchCost: Int = 0,
caseCost: Int = 1,
transpositions: Boolean = false


@ -11,11 +11,7 @@ package complete
import History.number
import java.io.File
final class History private (
val lines: IndexedSeq[String],
val path: Option[File],
error: String => Unit
) {
final class History private (val lines: IndexedSeq[String], val path: Option[File]) {
private def reversed = lines.reverse
def all: Seq[String] = lines
@ -52,8 +48,8 @@ final class History private (
}
object History {
def apply(lines: Seq[String], path: Option[File], error: String => Unit): History =
new History(lines.toIndexedSeq, path, sys.error)
def apply(lines: Seq[String], path: Option[File]): History =
new History(lines.toIndexedSeq, path)
def number(s: String): Option[Int] =
try { Some(s.toInt) } catch { case _: NumberFormatException => None }


@ -11,7 +11,7 @@ package complete
import jline.console.ConsoleReader
import jline.console.completer.{ Completer, CompletionHandler }
import scala.annotation.tailrec
import scala.collection.JavaConverters
import scala.collection.JavaConverters._
object JLineCompletion {
def installCustomCompletor(reader: ConsoleReader, parser: Parser[_]): Unit =
@ -154,7 +154,7 @@ object JLineCompletion {
if (line.charAt(line.length - 1) != '\n')
reader.println()
}
reader.printColumns(JavaConverters.seqAsJavaList(columns.map(_.trim)))
reader.printColumns(columns.map(_.trim).asJava)
}
def hasNewline(s: String): Boolean = s.indexOf('\n') >= 0
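The change above swaps the explicit `JavaConverters.seqAsJavaList` call for the `asJava` extension method brought in by `import scala.collection.JavaConverters._`. A tiny round-trip shows the two directions:

```scala
// asJava/asScala extension methods replace explicit converter calls
// like seqAsJavaList and collectionAsScalaIterable.
import scala.collection.JavaConverters._

val javaList: java.util.List[String] = List("a", "b").asJava
val scalaBack: List[String] = javaList.asScala.toList
assert(scalaBack == List("a", "b"))
```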


@ -12,15 +12,17 @@ import Parser._
import java.io.File
import java.net.URI
import java.lang.Character.{
getType,
MATH_SYMBOL,
OTHER_SYMBOL,
CURRENCY_SYMBOL,
DASH_PUNCTUATION,
OTHER_PUNCTUATION,
MATH_SYMBOL,
MODIFIER_SYMBOL,
CURRENCY_SYMBOL
OTHER_PUNCTUATION,
OTHER_SYMBOL,
getType
}
import scala.annotation.tailrec
/** Provides standard implementations of commonly useful [[Parser]]s. */
trait Parsers {
@ -313,6 +315,16 @@ object DefaultParsers extends Parsers with ParserMain {
apply(p)(s).resultEmpty.isValid
/** Returns `true` if `s` parses successfully according to [[ID]].*/
def validID(s: String): Boolean = matches(ID, s)
def validID(s: String): Boolean = {
// Handwritten version of `matches(ID, s)` because validID turned up in profiling.
def isIdChar(c: Char): Boolean = Character.isLetterOrDigit(c) || (c == '-') || (c == '_')
@tailrec def isRestIdChar(cur: Int, s: String, length: Int): Boolean =
if (cur < length)
isIdChar(s.charAt(cur)) && isRestIdChar(cur + 1, s, length)
else
true
!s.isEmpty && Character.isLetter(s.charAt(0)) && isRestIdChar(1, s, s.length)
}
}
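The hand-written `validID` above avoids stdlib combinators for speed; its behavior can be restated more directly for illustration (`validIDSpec` is an invented name, not part of sbt):

```scala
// Specification-style restatement of the hand-written validID above:
// a letter, followed by letters, digits, '-' or '_'.
def isIdChar(c: Char): Boolean = Character.isLetterOrDigit(c) || c == '-' || c == '_'
def validIDSpec(s: String): Boolean =
  s.nonEmpty && Character.isLetter(s.charAt(0)) && s.drop(1).forall(isIdChar)
```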


@ -0,0 +1,28 @@
/*
* sbt
* Copyright 2011 - 2017, Lightbend, Inc.
* Copyright 2008 - 2010, Mark Harrah
* Licensed under BSD-3-Clause license (see LICENSE)
*/
package sbt.internal.util
package complete
import org.scalacheck._, Gen._, Prop._
object DefaultParsersSpec extends Properties("DefaultParsers") {
import DefaultParsers.{ ID, isIDChar, matches, validID }
property("∀ s ∈ String: validID(s) == matches(ID, s)") = forAll(
(s: String) => validID(s) == matches(ID, s))
property("∀ s ∈ genID: matches(ID, s)") = forAll(genID)(s => matches(ID, s))
property("∀ s ∈ genID: validID(s)") = forAll(genID)(s => validID(s))
private val chars: Seq[Char] = Char.MinValue to Char.MaxValue
private val genID: Gen[String] =
for {
c <- oneOf(chars filter (_.isLetter))
cs <- listOf(oneOf(chars filter isIDChar))
} yield (c :: cs).mkString
}


@ -9,60 +9,66 @@ package sbt.internal.util
package complete
import java.io.File
import sbt.io.IO._
import org.scalatest.Assertion
import sbt.io.IO
class FileExamplesTest extends UnitSpec {
"listing all files in an absolute base directory" should
"produce the entire base directory's contents" in {
val _ = new DirectoryStructure {
fileExamples().toList should contain theSameElementsAs (allRelativizedPaths)
withDirectoryStructure() { ds =>
ds.fileExamples().toList should contain theSameElementsAs (ds.allRelativizedPaths)
}
}
"listing files with a prefix that matches none" should
"produce an empty list" in {
val _ = new DirectoryStructure(withCompletionPrefix = "z") {
fileExamples().toList shouldBe empty
"listing files with a prefix that matches none" should "produce an empty list" in {
withDirectoryStructure(withCompletionPrefix = "z") { ds =>
ds.fileExamples().toList shouldBe empty
}
}
"listing single-character prefixed files" should
"produce matching paths only" in {
val _ = new DirectoryStructure(withCompletionPrefix = "f") {
fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly)
"listing single-character prefixed files" should "produce matching paths only" in {
withDirectoryStructure(withCompletionPrefix = "f") { ds =>
ds.fileExamples().toList should contain theSameElementsAs (ds.prefixedPathsOnly)
}
}
"listing directory-prefixed files" should
"produce matching paths only" in {
val _ = new DirectoryStructure(withCompletionPrefix = "far") {
fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly)
"listing directory-prefixed files" should "produce matching paths only" in {
withDirectoryStructure(withCompletionPrefix = "far") { ds =>
ds.fileExamples().toList should contain theSameElementsAs (ds.prefixedPathsOnly)
}
}
it should "produce sub-dir contents only when appending a file separator to the directory" in {
val _ = new DirectoryStructure(withCompletionPrefix = "far" + File.separator) {
fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly)
withDirectoryStructure(withCompletionPrefix = "far" + File.separator) { ds =>
ds.fileExamples().toList should contain theSameElementsAs (ds.prefixedPathsOnly)
}
}
"listing files with a sub-path prefix" should
"produce matching paths only" in {
val _ = new DirectoryStructure(withCompletionPrefix = "far" + File.separator + "ba") {
fileExamples().toList should contain theSameElementsAs (prefixedPathsOnly)
"listing files with a sub-path prefix" should "produce matching paths only" in {
withDirectoryStructure(withCompletionPrefix = "far" + File.separator + "ba") { ds =>
ds.fileExamples().toList should contain theSameElementsAs (ds.prefixedPathsOnly)
}
}
"completing a full path" should
"produce a list with an empty string" in {
val _ = new DirectoryStructure(withCompletionPrefix = "bazaar") {
fileExamples().toList shouldEqual List("")
"completing a full path" should "produce a list with an empty string" in {
withDirectoryStructure(withCompletionPrefix = "bazaar") { ds =>
ds.fileExamples().toList shouldEqual List("")
}
}
// TODO: Remove DelayedInit - https://github.com/scala/scala/releases/tag/v2.11.0-RC1
class DirectoryStructure(withCompletionPrefix: String = "") extends DelayedInit {
def withDirectoryStructure[A](withCompletionPrefix: String = "")(
thunk: DirectoryStructure => Assertion
): Assertion = {
IO.withTemporaryDirectory { tempDir =>
val ds = new DirectoryStructure(withCompletionPrefix)
ds.createSampleDirStructure(tempDir)
ds.fileExamples = new FileExamples(ds.baseDir, withCompletionPrefix)
thunk(ds)
}
}
final class DirectoryStructure(withCompletionPrefix: String) {
var fileExamples: FileExamples = _
var baseDir: File = _
var childFiles: List[File] = _
@ -72,22 +78,14 @@ class FileExamplesTest extends UnitSpec {
def allRelativizedPaths: List[String] =
(childFiles ++ childDirectories ++ nestedFiles ++ nestedDirectories)
.map(relativize(baseDir, _).get)
.map(IO.relativize(baseDir, _).get)
def prefixedPathsOnly: List[String] =
allRelativizedPaths
.filter(_ startsWith withCompletionPrefix)
.map(_ substring withCompletionPrefix.length)
override def delayedInit(testBody: => Unit): Unit = {
withTemporaryDirectory { tempDir =>
createSampleDirStructure(tempDir)
fileExamples = new FileExamples(baseDir, withCompletionPrefix)
testBody
}
}
private def createSampleDirStructure(tempDir: File): Unit = {
def createSampleDirStructure(tempDir: File): Unit = {
childFiles = toChildFiles(tempDir, List("foo", "bar", "bazaar"))
childDirectories = toChildFiles(tempDir, List("moo", "far"))
nestedFiles = toChildFiles(childDirectories(1), List("farfile1", "barfile2"))
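The refactor above replaces `DelayedInit` with a loan-style `withDirectoryStructure` helper. The general shape of that pattern, stripped of sbt specifics (the `Fixture` class and all names here are illustrative, not the diff's types):

```scala
import java.io.File
import java.nio.file.Files

object LoanFixture {
  // Illustrative stand-in for the test's directory structure.
  final class Fixture(val baseDir: File) {
    def files: List[String] = Option(baseDir.list()).fold(List.empty[String])(_.toList)
  }

  // Loan pattern: build the resource, lend it to the body, always clean up.
  def withFixture[A](body: Fixture => A): A = {
    val dir = Files.createTempDirectory("fixture").toFile
    try body(new Fixture(dir))
    finally {
      Option(dir.listFiles()).foreach(_.foreach(_.delete()))
      dir.delete()
      ()
    }
  }
}
```

Compared with `DelayedInit`, the body runs explicitly inside the resource's lifetime, so the deprecation warning and initialization-order surprises both go away.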

View File

@ -24,14 +24,14 @@ object LogicTest extends Properties("Logic") {
property("Properly orders results.") = secure(expect(ordering, Set(B, A, C, E, F)))
property("Detects cyclic negation") = secure(
Logic.reduceAll(badClauses, Set()) match {
case Right(res) => false
case Left(err: Logic.CyclicNegation) => true
case Left(err) => sys.error(s"Expected cyclic error, got: $err")
case Right(_) => false
case Left(_: Logic.CyclicNegation) => true
case Left(err) => sys.error(s"Expected cyclic error, got: $err")
}
)
def expect(result: Either[LogicException, Matched], expected: Set[Atom]) = result match {
case Left(err) => false
case Left(_) => false
case Right(res) =>
val actual = res.provenSet
if (actual != expected)

View File

@ -10,10 +10,6 @@ package sbt
import java.io.File
import sbt.internal.inc.AnalyzingCompiler
import Predef.{ conforms => _, _ }
import sbt.io.syntax._
import sbt.io.IO
import sbt.util.CacheStoreFactory
import xsbti.Reporter
import xsbti.compile.JavaTools
@ -23,10 +19,12 @@ import sbt.internal.util.ManagedLogger
object Doc {
import RawCompileLike._
def scaladoc(label: String,
cacheStoreFactory: CacheStoreFactory,
compiler: AnalyzingCompiler): Gen =
scaladoc(label, cacheStoreFactory, compiler, Seq())
def scaladoc(label: String,
cacheStoreFactory: CacheStoreFactory,
compiler: AnalyzingCompiler,
@ -34,82 +32,32 @@ object Doc {
cached(cacheStoreFactory,
fileInputOptions,
prepare(label + " Scala API documentation", compiler.doc))
def javadoc(label: String,
cacheStoreFactory: CacheStoreFactory,
doc: JavaTools,
log: Logger,
reporter: Reporter): Gen =
javadoc(label, cacheStoreFactory, doc, log, reporter, Seq())
def javadoc(label: String,
cacheStoreFactory: CacheStoreFactory,
doc: JavaTools,
log: Logger,
reporter: Reporter,
fileInputOptions: Seq[String]): Gen =
cached(
cacheStoreFactory,
fileInputOptions,
prepare(
label + " Java API documentation",
filterSources(
javaSourcesOnly,
(sources: Seq[File],
classpath: Seq[File],
outputDirectory: File,
options: Seq[String],
maxErrors: Int,
log: Logger) => {
// doc.doc
???
}
)
)
)
@deprecated("Going away", "1.1.1")
def javadoc(
label: String,
cacheStoreFactory: CacheStoreFactory,
doc: JavaTools,
log: Logger,
reporter: Reporter,
): Gen = ???
@deprecated("Going away", "1.1.1")
def javadoc(
label: String,
cacheStoreFactory: CacheStoreFactory,
doc: JavaTools,
log: Logger,
reporter: Reporter,
fileInputOptions: Seq[String],
): Gen = ???
@deprecated("Going away", "1.1.1")
val javaSourcesOnly: File => Boolean = _.getName.endsWith(".java")
private[sbt] final class Scaladoc(maximumErrors: Int, compiler: AnalyzingCompiler) extends Doc {
def apply(label: String,
sources: Seq[File],
classpath: Seq[File],
outputDirectory: File,
options: Seq[String],
log: ManagedLogger): Unit = {
generate("Scala",
label,
compiler.doc,
sources,
classpath,
outputDirectory,
options,
maximumErrors,
log)
}
}
}
@deprecated("Going away", "1.1.1")
sealed trait Doc {
@deprecated("Going away", "1.1.1")
type Gen = (Seq[File], Seq[File], File, Seq[String], Int, ManagedLogger) => Unit
private[sbt] final def generate(variant: String,
label: String,
docf: Gen,
sources: Seq[File],
classpath: Seq[File],
outputDirectory: File,
options: Seq[String],
maxErrors: Int,
log: ManagedLogger): Unit = {
val logSnip = variant + " API documentation"
if (sources.isEmpty)
log.info("No sources available, skipping " + logSnip + "...")
else {
log.info(
"Generating " + logSnip + " for " + label + " sources to " + outputDirectory.absolutePath + "...")
IO.delete(outputDirectory)
IO.createDirectory(outputDirectory)
docf(sources, classpath, outputDirectory, options, maxErrors, log)
log.info(logSnip + " generation successful.")
}
}
}

View File

@ -17,6 +17,7 @@ import sbt.io.IO
import sbt.util.Logger
import sbt.ConcurrentRestrictions.Tag
import sbt.protocol.testing._
import sbt.internal.util.ConsoleAppender
private[sbt] object ForkTests {
def apply(runners: Map[TestFramework, Runner],
@ -78,7 +79,7 @@ private[sbt] object ForkTests {
val is = new ObjectInputStream(socket.getInputStream)
try {
val config = new ForkConfiguration(log.ansiCodesSupported, parallel)
val config = new ForkConfiguration(ConsoleAppender.formatEnabledInEnv, parallel)
os.writeObject(config)
val taskdefs = opts.tests.map(

View File

@ -7,7 +7,6 @@
package sbt
import scala.Predef.{ conforms => _, _ }
import java.io.File
import java.util.jar.{ Attributes, Manifest }
import scala.collection.JavaConverters._
@ -85,8 +84,10 @@ object Package {
}
def setVersion(main: Attributes): Unit = {
val version = Attributes.Name.MANIFEST_VERSION
if (main.getValue(version) eq null)
if (main.getValue(version) eq null) {
main.put(version, "1.0")
()
}
}
def addSpecManifestAttributes(name: String, version: String, orgName: String): PackageOption = {
import Attributes.Name._
@ -100,10 +101,18 @@ object Package {
org: String,
orgName: String): PackageOption = {
import Attributes.Name._
val attribKeys = Seq(IMPLEMENTATION_TITLE,
IMPLEMENTATION_VERSION,
IMPLEMENTATION_VENDOR,
IMPLEMENTATION_VENDOR_ID)
// The ones in Attributes.Name are deprecated saying:
// "Extension mechanism will be removed in a future release. Use class path instead."
val IMPLEMENTATION_VENDOR_ID = new Attributes.Name("Implementation-Vendor-Id")
val IMPLEMENTATION_URL = new Attributes.Name("Implementation-URL")
val attribKeys = Seq(
IMPLEMENTATION_TITLE,
IMPLEMENTATION_VERSION,
IMPLEMENTATION_VENDOR,
IMPLEMENTATION_VENDOR_ID,
)
val attribVals = Seq(name, version, orgName, org)
ManifestAttributes((attribKeys zip attribVals) ++ {
homepage map (h => (IMPLEMENTATION_URL, h.toString))

View File

@ -7,10 +7,10 @@
package sbt
import scala.annotation.tailrec
import java.io.File
import sbt.internal.inc.{ RawCompiler, ScalaInstance }
import Predef.{ conforms => _, _ }
import sbt.io.syntax._
import sbt.io.IO
@ -30,7 +30,7 @@ object RawCompileLike {
type Gen = (Seq[File], Seq[File], File, Seq[String], Int, ManagedLogger) => Unit
private def optionFiles(options: Seq[String], fileInputOpts: Seq[String]): List[File] = {
@annotation.tailrec
@tailrec
def loop(opt: List[String], result: List[File]): List[File] = {
opt.dropWhile(!fileInputOpts.contains(_)) match {
case List(_, fileOpt, tail @ _*) => {
@ -46,6 +46,7 @@ object RawCompileLike {
def cached(cacheStoreFactory: CacheStoreFactory, doCompile: Gen): Gen =
cached(cacheStoreFactory, Seq(), doCompile)
def cached(cacheStoreFactory: CacheStoreFactory,
fileInputOpts: Seq[String],
doCompile: Gen): Gen =
@ -67,6 +68,7 @@ object RawCompileLike {
}
cachedComp(inputs)(exists(outputDirectory.allPaths.get.toSet))
}
def prepare(description: String, doCompile: Gen): Gen =
(sources, classpath, outputDirectory, options, maxErrors, log) => {
if (sources.isEmpty)
@ -79,20 +81,22 @@ object RawCompileLike {
log.info(description.capitalize + " successful.")
}
}
def filterSources(f: File => Boolean, doCompile: Gen): Gen =
(sources, classpath, outputDirectory, options, maxErrors, log) =>
doCompile(sources filter f, classpath, outputDirectory, options, maxErrors, log)
def rawCompile(instance: ScalaInstance, cpOptions: ClasspathOptions): Gen =
(sources, classpath, outputDirectory, options, maxErrors, log) => {
(sources, classpath, outputDirectory, options, _, log) => {
val compiler = new RawCompiler(instance, cpOptions, log)
compiler(sources, classpath, outputDirectory, options)
}
def compile(label: String,
cacheStoreFactory: CacheStoreFactory,
instance: ScalaInstance,
cpOptions: ClasspathOptions): Gen =
cached(cacheStoreFactory, prepare(label + " sources", rawCompile(instance, cpOptions)))
val nop: Gen = (sources, classpath, outputDirectory, options, maxErrors, log) => ()
val nop: Gen = (_, _, _, _, _, _) => ()
}

View File

@ -30,10 +30,18 @@ import sjsonnew.{ Builder, JsonFormat, Unbuilder, deserializationError }
* It is safe to use for its intended purpose: copying resources to a class output directory.
*/
object Sync {
def apply(store: CacheStore,
inStyle: FileInfo.Style = FileInfo.lastModified,
outStyle: FileInfo.Style = FileInfo.exists)
: Traversable[(File, File)] => Relation[File, File] =
@deprecated("Use sync, which doesn't take the unused outStyle param", "1.1.1")
def apply(
store: CacheStore,
inStyle: FileInfo.Style = FileInfo.lastModified,
outStyle: FileInfo.Style = FileInfo.exists,
): Traversable[(File, File)] => Relation[File, File] =
sync(store, inStyle)
def sync(
store: CacheStore,
inStyle: FileInfo.Style = FileInfo.lastModified,
): Traversable[(File, File)] => Relation[File, File] =
mappings => {
val relation = Relation.empty ++ mappings
noDuplicateTargets(relation)
@ -63,20 +71,16 @@ object Sync {
def copy(source: File, target: File): Unit =
if (source.isFile)
IO.copyFile(source, target, true)
else if (!target.exists) // we don't want to update the last modified time of an existing directory
{
IO.createDirectory(target)
IO.copyLastModified(source, target)
}
else if (!target.exists) { // we don't want to update the last modified time of an existing directory
IO.createDirectory(target)
IO.copyLastModified(source, target)
()
}
def noDuplicateTargets(relation: Relation[File, File]): Unit = {
val dups = relation.reverseMap.filter {
case (_, srcs) =>
srcs.size >= 2 && srcs.exists(!_.isDirectory)
} map {
case (target, srcs) =>
"\n\t" + target + "\nfrom\n\t" + srcs.mkString("\n\t\t")
}
val dups = relation.reverseMap
.filter { case (_, srcs) => srcs.size >= 2 && srcs.exists(!_.isDirectory) }
.map { case (target, srcs) => "\n\t" + target + "\nfrom\n\t" + srcs.mkString("\n\t\t") }
if (dups.nonEmpty)
sys.error("Duplicate mappings:" + dups.mkString)
}
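The `Sync.apply` change above follows a recipe repeated throughout this commit: keep the old arity as a `@deprecated` forwarder so existing call sites still compile while new code migrates to the parameter-trimmed method. A minimal self-contained sketch (the `Store` alias and string bodies are illustrative):

```scala
object SyncLike {
  type Store = String

  // Old entry point: kept for compatibility, forwards to the replacement.
  @deprecated("Use sync, which doesn't take the unused outStyle param", "1.1.1")
  def apply(store: Store, inStyle: String = "lastModified", outStyle: String = "exists"): String =
    sync(store, inStyle)

  // The replacement drops the parameter that was never read.
  def sync(store: Store, inStyle: String = "lastModified"): String =
    s"sync($store, inStyle = $inStyle)"
}
```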

View File

@ -133,17 +133,20 @@ object TestResultLogger {
failuresCount,
ignoredCount,
canceledCount,
pendingCount) =
pendingCount,
) =
results.events.foldLeft((0, 0, 0, 0, 0, 0, 0)) {
case ((skippedAcc, errorAcc, passedAcc, failureAcc, ignoredAcc, canceledAcc, pendingAcc),
(name @ _, testEvent)) =>
case (acc, (_, testEvent)) =>
val (skippedAcc, errorAcc, passedAcc, failureAcc, ignoredAcc, canceledAcc, pendingAcc) =
acc
(skippedAcc + testEvent.skippedCount,
errorAcc + testEvent.errorCount,
passedAcc + testEvent.passedCount,
failureAcc + testEvent.failureCount,
ignoredAcc + testEvent.ignoredCount,
canceledAcc + testEvent.canceledCount,
pendingAcc + testEvent.pendingCount)
pendingAcc + testEvent.pendingCount,
)
}
val totalCount = failuresCount + errorsCount + skippedCount + passedCount
val base =

View File

@ -34,6 +34,7 @@ import sbt.util.Logger
import sbt.protocol.testing.TestResult
sealed trait TestOption
object Tests {
/**
@ -227,7 +228,7 @@ object Tests {
if (config.parallel)
makeParallel(loader, runnables, setupTasks, config.tags) //.toSeq.join
else
makeSerial(loader, runnables, setupTasks, config.tags)
makeSerial(loader, runnables, setupTasks)
val taggedMainTasks = mainTasks.tagw(config.tags: _*)
taggedMainTasks map processResults flatMap { results =>
val cleanupTasks = fj(partApp(userCleanup) :+ frameworkCleanup(results.overall))
@ -294,10 +295,20 @@ object Tests {
}
}
def makeSerial(loader: ClassLoader,
runnables: Seq[TestRunnable],
setupTasks: Task[Unit],
tags: Seq[(Tag, Int)]): Task[List[(String, SuiteResult)]] = {
@deprecated("Use the variant without tags", "1.1.1")
def makeSerial(
loader: ClassLoader,
runnables: Seq[TestRunnable],
setupTasks: Task[Unit],
tags: Seq[(Tag, Int)],
): Task[List[(String, SuiteResult)]] =
makeSerial(loader, runnables, setupTasks)
def makeSerial(
loader: ClassLoader,
runnables: Seq[TestRunnable],
setupTasks: Task[Unit],
): Task[List[(String, SuiteResult)]] = {
@tailrec
def processRunnable(runnableList: List[TestRunnable],
acc: List[(String, SuiteResult)]): List[(String, SuiteResult)] =

View File

@ -94,10 +94,13 @@ object BasicCommands {
}
def completionsCommand: Command =
Command(CompletionsCommand, CompletionsBrief, CompletionsDetailed)(completionsParser)(
Command(CompletionsCommand, CompletionsBrief, CompletionsDetailed)(_ => completionsParser)(
runCompletions(_)(_))
def completionsParser(state: State): Parser[String] = {
@deprecated("No longer public", "1.1.1")
def completionsParser(state: State): Parser[String] = completionsParser
private[this] def completionsParser: Parser[String] = {
val notQuoted = (NotQuoted ~ any.*) map { case (nq, s) => nq ++ s }
val quotedOrUnquotedSingleArgument = Space ~> (StringVerbatim | StringEscapable | notQuoted)
token(quotedOrUnquotedSingleArgument ?? "" examples ("", " "))
@ -175,19 +178,19 @@ object BasicCommands {
}
def reboot: Command =
Command(RebootCommand, Help.more(RebootCommand, RebootDetailed))(rebootOptionParser) {
Command(RebootCommand, Help.more(RebootCommand, RebootDetailed))(_ => rebootOptionParser) {
case (s, (full, currentOnly)) =>
s.reboot(full, currentOnly)
}
@deprecated("Use rebootOptionParser", "1.1.0")
def rebootParser(s: State): Parser[Boolean] =
rebootOptionParser(s) map { case (full, currentOnly) => full }
def rebootParser(s: State): Parser[Boolean] = rebootOptionParser map { case (full, _) => full }
private[sbt] def rebootOptionParser(s: State): Parser[(Boolean, Boolean)] =
token(
Space ~> (("full" ^^^ ((true, false))) |
("dev" ^^^ ((false, true))))) ?? ((false, false))
private[sbt] def rebootOptionParser: Parser[(Boolean, Boolean)] = {
val fullOption = "full" ^^^ ((true, false))
val devOption = "dev" ^^^ ((false, true))
token(Space ~> (fullOption | devOption)) ?? ((false, false))
}
def call: Command =
Command(ApplyCommand, Help.more(ApplyCommand, ApplyDetailed))(_ => callParser) {
@ -236,10 +239,9 @@ object BasicCommands {
def historyParser(s: State): Parser[() => State] =
Command.applyEffect(HistoryCommands.actionParser) { histFun =>
val logError = (msg: String) => s.log.error(msg)
val hp = s get historyPath getOrElse None
val hp = (s get historyPath).flatten
val lines = hp.toList.flatMap(p => IO.readLines(p)).toIndexedSeq
histFun(CHistory(lines, hp, logError)) match {
histFun(CHistory(lines, hp)) match {
case Some(commands) =>
commands foreach println //printing is more appropriate than logging
(commands ::: s).continue

View File

@ -10,6 +10,7 @@ package sbt
import java.io.File
import sbt.internal.util.AttributeKey
import sbt.internal.inc.classpath.ClassLoaderCache
import sbt.internal.server.ServerHandler
import sbt.librarymanagement.ModuleID
import sbt.util.Level
@ -39,6 +40,11 @@ object BasicKeys {
"The wire protocol for the server command.",
10000)
val fullServerHandlers =
AttributeKey[Seq[ServerHandler]]("fullServerHandlers",
"Combines default server handlers and user-defined handlers.",
10000)
val autoStartServer =
AttributeKey[Boolean](
"autoStartServer",

View File

@ -163,6 +163,16 @@ object Command {
case Some(c) => c(state)
})
def process(command: String, state: State): State = {
val parser = combine(state.definedCommands)
parse(command, parser(state)) match {
case Right(s) => s() // apply command. command side effects happen here
case Left(errMsg) =>
state.log error errMsg
state.fail
}
}
def invalidValue(label: String, allowed: Iterable[String])(value: String): String =
s"Not a valid $label: $value" + similar(value, allowed)
@ -178,15 +188,16 @@ object Command {
bs map (b => (b, distance(a, b))) filter (_._2 <= maxDistance) sortBy (_._2) take (maxSuggestions) map (_._1)
def distance(a: String, b: String): Int =
EditDistance.levenshtein(a,
b,
insertCost = 1,
deleteCost = 1,
subCost = 2,
transposeCost = 1,
matchCost = -1,
caseCost = 1,
transpositions = true)
EditDistance.levenshtein(
a,
b,
insertCost = 1,
deleteCost = 1,
subCost = 2,
matchCost = -1,
caseCost = 1,
transpositions = true
)
def spacedAny(name: String): Parser[String] = spacedC(name, any)

View File

@ -238,14 +238,16 @@ object State {
def process(f: (Exec, State) => State): State = {
def runCmd(cmd: Exec, remainingCommands: List[Exec]) = {
log.debug(s"> $cmd")
f(cmd,
s.copy(remainingCommands = remainingCommands,
currentCommand = Some(cmd),
history = cmd :: s.history))
val s1 = s.copy(
remainingCommands = remainingCommands,
currentCommand = Some(cmd),
history = cmd :: s.history,
)
f(cmd, s1)
}
s.remainingCommands match {
case List() => exit(true)
case List(x, xs @ _*) => runCmd(x, xs.toList)
case Nil => exit(true)
case x :: xs => runCmd(x, xs)
}
}
def :::(newCommands: List[String]): State = ++:(newCommands map { Exec(_, s.source) })
@ -321,7 +323,7 @@ object State {
import ExceptionCategory._
private[sbt] def handleException(t: Throwable, s: State, log: Logger): State = {
private[this] def handleException(t: Throwable, s: State, log: Logger): State = {
ExceptionCategory(t) match {
case AlreadyHandled => ()
case m: MessageOnly => log.error(m.message)

View File

@ -23,8 +23,8 @@ import scala.util.Properties
trait Watched {
/** The files watched when an action is run with a preceeding ~ */
def watchSources(s: State): Seq[Watched.WatchSource] = Nil
/** The files watched when an action is run with a preceding ~ */
def watchSources(@deprecated("unused", "") s: State): Seq[Watched.WatchSource] = Nil
def terminateWatch(key: Int): Boolean = Watched.isEnter(key)
/**
@ -44,8 +44,13 @@ trait Watched {
}
object Watched {
val defaultWatchingMessage
: WatchState => String = _.count + ". Waiting for source changes... (press enter to interrupt)"
val defaultWatchingMessage: WatchState => String = ws =>
s"${ws.count}. Waiting for source changes... (press enter to interrupt)"
def projectWatchingMessage(projectId: String): WatchState => String =
ws =>
s"${ws.count}. Waiting for source changes in project $projectId... (press enter to interrupt)"
val defaultTriggeredMessage: WatchState => String = const("")
val clearWhenTriggered: WatchState => String = const(clearScreen)
def clearScreen: String = "\u001b[2J\u001b[0;0H"
@ -70,8 +75,8 @@ object Watched {
* @param base The base directory from which to include files.
* @return An instance of `Source`.
*/
def apply(base: File): Source =
apply(base, AllPassFilter, NothingFilter)
def apply(base: File): Source = apply(base, AllPassFilter, NothingFilter)
}
private[this] class AWatched extends Watched
@ -107,9 +112,9 @@ object Watched {
(triggered, newWatchState)
} catch {
case e: Exception =>
val log = s.log
log.error("Error occurred obtaining files to watch. Terminating continuous execution...")
State.handleException(e, s, log)
s.log.error(
"Error occurred obtaining files to watch. Terminating continuous execution...")
s.handleError(e)
(false, watchState)
}

View File

@ -19,12 +19,11 @@ import sjsonnew.JsonFormat
*/
abstract class CommandChannel {
private val commandQueue: ConcurrentLinkedQueue[Exec] = new ConcurrentLinkedQueue()
def append(exec: Exec): Boolean =
commandQueue.add(exec)
def append(exec: Exec): Boolean = commandQueue.add(exec)
def poll: Option[Exec] = Option(commandQueue.poll)
def publishEvent[A: JsonFormat](event: A, execId: Option[String]): Unit
def publishEvent[A: JsonFormat](event: A): Unit
final def publishEvent[A: JsonFormat](event: A): Unit = publishEvent(event, None)
def publishEventMessage(event: EventMessage): Unit
def publishBytes(bytes: Array[Byte]): Unit
def shutdown(): Unit

View File

@ -40,8 +40,6 @@ private[sbt] final class ConsoleChannel(val name: String) extends CommandChannel
def publishEvent[A: JsonFormat](event: A, execId: Option[String]): Unit = ()
def publishEvent[A: JsonFormat](event: A): Unit = ()
def publishEventMessage(event: EventMessage): Unit =
event match {
case e: ConsolePromptEvent =>
@ -50,7 +48,7 @@ private[sbt] final class ConsoleChannel(val name: String) extends CommandChannel
case _ =>
val x = makeAskUserThread(e.state)
askUserThread = Some(x)
x.start
x.start()
}
case e: ConsoleUnpromptEvent =>
e.lastSource match {
@ -70,7 +68,7 @@ private[sbt] final class ConsoleChannel(val name: String) extends CommandChannel
def shutdown(): Unit =
askUserThread match {
case Some(x) if x.isAlive =>
x.interrupt
x.interrupt()
askUserThread = None
case _ => ()
}

View File

@ -126,6 +126,7 @@ object NetworkClient {
def run(arguments: List[String]): Unit =
try {
new NetworkClient(arguments)
()
} catch {
case NonFatal(e) => println(e.getMessage)
}

View File

@ -103,7 +103,7 @@ private[sbt] object Server {
def tryClient(f: => Socket): Unit = {
if (portfile.exists) {
Try { f } match {
case Failure(e) => ()
case Failure(_) => ()
case Success(socket) =>
socket.close()
throw new AlreadyRunningException()

View File

@ -0,0 +1,69 @@
/*
* sbt
* Copyright 2011 - 2017, Lightbend, Inc.
* Copyright 2008 - 2010, Mark Harrah
* Licensed under BSD-3-Clause license (see LICENSE)
*/
package sbt
package internal
package server
import sjsonnew.JsonFormat
import sbt.internal.protocol._
import sbt.util.Logger
import sbt.protocol.{ SettingQuery => Q }
/**
* ServerHandler allows plugins to extend sbt server.
* It's a wrapper around a curried function ServerCallback => ServerIntent.
*/
final class ServerHandler(val handler: ServerCallback => ServerIntent) {
override def toString: String = s"ServerHandler(...)"
}
object ServerHandler {
def apply(handler: ServerCallback => ServerIntent): ServerHandler =
new ServerHandler(handler)
lazy val fallback: ServerHandler = ServerHandler({ handler =>
ServerIntent(
{ case x => handler.log.debug(s"Unhandled request received: ${x.method}: $x") },
{ case x => handler.log.debug(s"Unhandled notification received: ${x.method}: $x") }
)
})
}
final class ServerIntent(val onRequest: PartialFunction[JsonRpcRequestMessage, Unit],
val onNotification: PartialFunction[JsonRpcNotificationMessage, Unit]) {
override def toString: String = s"ServerIntent(...)"
}
object ServerIntent {
def apply(onRequest: PartialFunction[JsonRpcRequestMessage, Unit],
onNotification: PartialFunction[JsonRpcNotificationMessage, Unit]): ServerIntent =
new ServerIntent(onRequest, onNotification)
def request(onRequest: PartialFunction[JsonRpcRequestMessage, Unit]): ServerIntent =
new ServerIntent(onRequest, PartialFunction.empty)
def notify(onNotification: PartialFunction[JsonRpcNotificationMessage, Unit]): ServerIntent =
new ServerIntent(PartialFunction.empty, onNotification)
}
/**
* Interface to invoke JSON-RPC response.
*/
trait ServerCallback {
def jsonRpcRespond[A: JsonFormat](event: A, execId: Option[String]): Unit
def jsonRpcRespondError(execId: Option[String], code: Long, message: String): Unit
def jsonRpcNotify[A: JsonFormat](method: String, params: A): Unit
def appendExec(exec: Exec): Boolean
def log: Logger
def name: String
private[sbt] def authOptions: Set[ServerAuthentication]
private[sbt] def authenticate(token: String): Boolean
private[sbt] def setInitialized(value: Boolean): Unit
private[sbt] def onSettingQuery(execId: Option[String], req: Q): Unit
}
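The fallback handler above leans on `PartialFunction` dispatch: a specific intent composes with the catch-all via `orElse`. A self-contained sketch of that pattern (the `Message` hierarchy and string results are illustrative, not sbt's JSON-RPC types):

```scala
object IntentDemo {
  sealed trait Message { def method: String }
  final case class Request(method: String) extends Message
  final case class Notification(method: String) extends Message

  final class Intent(
      val onRequest: PartialFunction[Request, String],
      val onNotification: PartialFunction[Notification, String]
  )

  // Catch-all intent, analogous to ServerHandler.fallback.
  val fallback = new Intent(
    { case r => s"unhandled request: ${r.method}" },
    { case n => s"unhandled notification: ${n.method}" }
  )

  // Messages the specific intent does not handle fall through to the fallback.
  def dispatch(intent: Intent)(m: Message): String = m match {
    case r: Request      => (intent.onRequest orElse fallback.onRequest)(r)
    case n: Notification => (intent.onNotification orElse fallback.onNotification)(n)
  }
}
```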

View File

@ -10,47 +10,57 @@ package xsbt
import java.io.{ BufferedReader, BufferedWriter, InputStreamReader, OutputStreamWriter }
import java.net.{ InetAddress, ServerSocket, Socket }
import scala.annotation.tailrec
import scala.util.control.NonFatal
object IPC {
private val portMin = 1025
private val portMax = 65536
private val loopback = InetAddress.getByName(null) // loopback
private val loopback = InetAddress.getByName(null)
def client[T](port: Int)(f: IPC => T): T =
ipc(new Socket(loopback, port))(f)
def client[T](port: Int)(f: IPC => T): T = ipc(new Socket(loopback, port))(f)
def pullServer[T](f: Server => T): T = {
val server = makeServer
try { f(new Server(server)) } finally { server.close() }
try f(new Server(server))
finally server.close()
}
def unmanagedServer: Server = new Server(makeServer)
def makeServer: ServerSocket = {
val random = new java.util.Random
def nextPort = random.nextInt(portMax - portMin + 1) + portMin
def createServer(attempts: Int): ServerSocket =
if (attempts > 0)
try { new ServerSocket(nextPort, 1, loopback) } catch {
case NonFatal(_) => createServer(attempts - 1)
} else
sys.error("Could not connect to socket: maximum attempts exceeded")
if (attempts > 0) {
try new ServerSocket(nextPort, 1, loopback)
catch { case NonFatal(_) => createServer(attempts - 1) }
} else sys.error("Could not connect to socket: maximum attempts exceeded")
createServer(10)
}
def server[T](f: IPC => Option[T]): T = serverImpl(makeServer, f)
def server[T](port: Int)(f: IPC => Option[T]): T =
serverImpl(new ServerSocket(port, 1, loopback), f)
private def serverImpl[T](server: ServerSocket, f: IPC => Option[T]): T = {
def listen(): T = {
@tailrec def listen(): T = {
ipc(server.accept())(f) match {
case Some(done) => done
case None => listen()
}
}
try { listen() } finally { server.close() }
try listen()
finally server.close()
}
private def ipc[T](s: Socket)(f: IPC => T): T =
try { f(new IPC(s)) } finally { s.close() }
try f(new IPC(s))
finally s.close()
final class Server private[IPC] (s: ServerSocket) {
def port = s.getLocalPort
@ -59,6 +69,7 @@ object IPC {
def connection[T](f: IPC => T): T = IPC.ipc(s.accept())(f)
}
}
final class IPC private (s: Socket) {
def port = s.getLocalPort
private val in = new BufferedReader(new InputStreamReader(s.getInputStream))

View File

@ -38,30 +38,44 @@ object Def extends Init[Scope] with TaskMacroExtra {
def showFullKey(keyNameColor: Option[String]): Show[ScopedKey[_]] =
Show[ScopedKey[_]]((key: ScopedKey[_]) => displayFull(key, keyNameColor))
@deprecated("Use showRelativeKey2 which doesn't take the unused multi param", "1.1.1")
def showRelativeKey(
current: ProjectRef,
multi: Boolean,
keyNameColor: Option[String] = None
): Show[ScopedKey[_]] =
Show[ScopedKey[_]](
key =>
Scope.display(
key.scope,
withColor(key.key.label, keyNameColor),
ref => displayRelative(current, multi, ref)
))
showRelativeKey2(current, keyNameColor)
def showBuildRelativeKey(
currentBuild: URI,
multi: Boolean,
keyNameColor: Option[String] = None
def showRelativeKey2(
current: ProjectRef,
keyNameColor: Option[String] = None,
): Show[ScopedKey[_]] =
Show[ScopedKey[_]](
key =>
Scope.display(
key.scope,
withColor(key.key.label, keyNameColor),
ref => displayBuildRelative(currentBuild, multi, ref)
ref => displayRelative2(current, ref)
))
@deprecated("Use showBuildRelativeKey2 which doesn't take the unused multi param", "1.1.1")
def showBuildRelativeKey(
currentBuild: URI,
multi: Boolean,
keyNameColor: Option[String] = None,
): Show[ScopedKey[_]] =
showBuildRelativeKey2(currentBuild, keyNameColor)
def showBuildRelativeKey2(
currentBuild: URI,
keyNameColor: Option[String] = None,
): Show[ScopedKey[_]] =
Show[ScopedKey[_]](
key =>
Scope.display(
key.scope,
withColor(key.key.label, keyNameColor),
ref => displayBuildRelative(currentBuild, ref)
))
/**
@ -71,8 +85,11 @@ object Def extends Init[Scope] with TaskMacroExtra {
def displayRelativeReference(current: ProjectRef, project: Reference): String =
displayRelative(current, project, false)
@deprecated("Use displayRelativeReference", "1.1.0")
@deprecated("Use displayRelative2 which doesn't take the unused multi param", "1.1.1")
def displayRelative(current: ProjectRef, multi: Boolean, project: Reference): String =
displayRelative2(current, project)
def displayRelative2(current: ProjectRef, project: Reference): String =
displayRelative(current, project, true)
/**
@ -91,7 +108,11 @@ object Def extends Init[Scope] with TaskMacroExtra {
}
}
@deprecated("Use variant without multi", "1.1.1")
def displayBuildRelative(currentBuild: URI, multi: Boolean, project: Reference): String =
displayBuildRelative(currentBuild, project)
def displayBuildRelative(currentBuild: URI, project: Reference): String =
project match {
case BuildRef(`currentBuild`) => "ThisBuild /"
case ProjectRef(`currentBuild`, x) => x + " /"
@ -173,16 +194,31 @@ object Def extends Init[Scope] with TaskMacroExtra {
// The following conversions enable the types Initialize[T], Initialize[Task[T]], and Task[T] to
// be used in task and setting macros as inputs with an ultimate result of type T
implicit def macroValueI[T](in: Initialize[T]): MacroValue[T] = ???
implicit def macroValueIT[T](in: Initialize[Task[T]]): MacroValue[T] = ???
implicit def macroValueIInT[T](in: Initialize[InputTask[T]]): InputEvaluated[T] = ???
implicit def taskMacroValueIT[T](in: Initialize[Task[T]]): MacroTaskValue[T] = ???
implicit def macroPrevious[T](in: TaskKey[T]): MacroPrevious[T] = ???
implicit def macroValueI[T](@deprecated("unused", "") in: Initialize[T]): MacroValue[T] = ???
// The following conversions enable the types Parser[T], Initialize[Parser[T]], and Initialize[State => Parser[T]] to
// be used in the inputTask macro as an input with an ultimate result of type T
implicit def parserInitToInput[T](p: Initialize[Parser[T]]): ParserInput[T] = ???
implicit def parserInitStateToInput[T](p: Initialize[State => Parser[T]]): ParserInput[T] = ???
implicit def macroValueIT[T](@deprecated("unused", "") in: Initialize[Task[T]]): MacroValue[T] =
???
implicit def macroValueIInT[T](
@deprecated("unused", "") in: Initialize[InputTask[T]]
): InputEvaluated[T] = ???
implicit def taskMacroValueIT[T](
@deprecated("unused", "") in: Initialize[Task[T]]
): MacroTaskValue[T] = ???
implicit def macroPrevious[T](@deprecated("unused", "") in: TaskKey[T]): MacroPrevious[T] = ???
// The following conversions enable the types Parser[T], Initialize[Parser[T]], and
// Initialize[State => Parser[T]] to be used in the inputTask macro as an input with an ultimate
// result of type T
implicit def parserInitToInput[T](
@deprecated("unused", "") p: Initialize[Parser[T]]
): ParserInput[T] = ???
implicit def parserInitStateToInput[T](
@deprecated("unused", "") p: Initialize[State => Parser[T]]
): ParserInput[T] = ???
def settingKey[T](description: String): SettingKey[T] = macro std.KeyMacro.settingKeyImpl[T]
def taskKey[T](description: String): TaskKey[T] = macro std.KeyMacro.taskKeyImpl[T]
@@ -190,27 +226,40 @@ object Def extends Init[Scope] with TaskMacroExtra {
private[sbt] def dummy[T: Manifest](name: String, description: String): (TaskKey[T], Task[T]) =
(TaskKey[T](name, description, DTask), dummyTask(name))
private[sbt] def dummyTask[T](name: String): Task[T] = {
import std.TaskExtra.{ task => newTask, _ }
val base: Task[T] = newTask(
sys.error("Dummy task '" + name + "' did not get converted to a full task.")) named name
base.copy(info = base.info.set(isDummyTask, true))
}
private[sbt] def isDummy(t: Task[_]): Boolean =
t.info.attributes.get(isDummyTask) getOrElse false
private[sbt] val isDummyTask = AttributeKey[Boolean](
"is-dummy-task",
"Internal: used to identify dummy tasks. sbt injects values for these tasks at the start of task execution.",
Invisible)
private[sbt] val (stateKey, dummyState) = dummy[State]("state", "Current build state.")
private[sbt] val (streamsManagerKey, dummyStreamsManager) = Def.dummy[std.Streams[ScopedKey[_]]](
"streams-manager",
"Streams manager, which provides streams for different contexts.")
}
// these need to be mixed into the sbt package object because the target doesn't involve Initialize or anything in Def
// these need to be mixed into the sbt package object
// because the target doesn't involve Initialize or anything in Def
trait TaskMacroExtra {
implicit def macroValueT[T](in: Task[T]): std.MacroValue[T] = ???
implicit def macroValueIn[T](in: InputTask[T]): std.InputEvaluated[T] = ???
implicit def parserToInput[T](in: Parser[T]): std.ParserInput[T] = ???
implicit def stateParserToInput[T](in: State => Parser[T]): std.ParserInput[T] = ???
implicit def macroValueT[T](@deprecated("unused", "") in: Task[T]): std.MacroValue[T] = ???
implicit def macroValueIn[T](@deprecated("unused", "") in: InputTask[T]): std.InputEvaluated[T] =
???
implicit def parserToInput[T](@deprecated("unused", "") in: Parser[T]): std.ParserInput[T] = ???
implicit def stateParserToInput[T](
@deprecated("unused", "") in: State => Parser[T]
): std.ParserInput[T] = ???
}
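The diff above repeatedly rewrites implicit-conversion stubs so their parameters carry `@deprecated("unused", "")`. A minimal, self-contained sketch (not sbt code; the names are placeholders) of why that annotation is applied:

```scala
// The parameter of such a stub exists only to drive implicit resolution at
// compile time; the `???` body never reads it. Annotating it with
// @deprecated("unused", "") keeps scalac's unused-parameter lint
// (-Ywarn-unused / -Xlint) from flagging it.
object UnusedParamSketch {
  final class MacroValue[T]

  implicit def macroValue[T](@deprecated("unused", "") in: List[T]): MacroValue[T] = ???
}
```

The empty second argument of `@deprecated` is the "since" version; leaving it blank signals the annotation is a lint-suppression device rather than a real deprecation.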

View File

@@ -49,8 +49,13 @@ object InputTask {
)
}
implicit def inputTaskParsed[T](in: InputTask[T]): std.ParserInputTask[T] = ???
implicit def inputTaskInitParsed[T](in: Initialize[InputTask[T]]): std.ParserInputTask[T] = ???
implicit def inputTaskParsed[T](
@deprecated("unused", "") in: InputTask[T]
): std.ParserInputTask[T] = ???
implicit def inputTaskInitParsed[T](
@deprecated("unused", "") in: Initialize[InputTask[T]]
): std.ParserInputTask[T] = ???
def make[T](p: State => Parser[Task[T]]): InputTask[T] = new InputTask[T](p)

View File

@@ -201,23 +201,6 @@ object Scope {
if (s == "") ""
else s + " "
// sbt 0.12 style
def display012StyleMasked(scope: Scope,
sep: String,
showProject: Reference => String,
mask: ScopeMask): String = {
import scope.{ project, config, task, extra }
val configPrefix = config.foldStrict(displayConfigKey012Style, "*:", ".:")
val taskPrefix = task.foldStrict(_.label + "::", "", ".::")
val extras = extra.foldStrict(_.entries.map(_.toString).toList, Nil, Nil)
val postfix = if (extras.isEmpty) "" else extras.mkString("(", ", ", ")")
mask.concatShow(projectPrefix012Style(project, showProject012Style),
configPrefix,
taskPrefix,
sep,
postfix)
}
def equal(a: Scope, b: Scope, mask: ScopeMask): Boolean =
(!mask.project || a.project == b.project) &&
(!mask.config || a.config == b.config) &&
@@ -236,11 +219,33 @@ object Scope {
def showProject012Style = (ref: Reference) => Reference.display(ref) + "/"
@deprecated("No longer used", "1.1.3")
def transformTaskName(s: String) = {
val parts = s.split("-+")
(parts.take(1) ++ parts.drop(1).map(_.capitalize)).mkString
}
@deprecated("Use variant without extraInherit", "1.1.1")
def delegates[Proj](
refs: Seq[(ProjectRef, Proj)],
configurations: Proj => Seq[ConfigKey],
resolve: Reference => ResolvedReference,
rootProject: URI => String,
projectInherit: ProjectRef => Seq[ProjectRef],
configInherit: (ResolvedReference, ConfigKey) => Seq[ConfigKey],
taskInherit: AttributeKey[_] => Seq[AttributeKey[_]],
extraInherit: (ResolvedReference, AttributeMap) => Seq[AttributeMap]
): Scope => Seq[Scope] =
delegates(
refs,
configurations,
resolve,
rootProject,
projectInherit,
configInherit,
taskInherit,
)
// *Inherit functions should be immediate delegates and not include argument itself. Transitivity will be provided by this method
def delegates[Proj](
refs: Seq[(ProjectRef, Proj)],
@@ -250,19 +255,27 @@ object Scope {
projectInherit: ProjectRef => Seq[ProjectRef],
configInherit: (ResolvedReference, ConfigKey) => Seq[ConfigKey],
taskInherit: AttributeKey[_] => Seq[AttributeKey[_]],
extraInherit: (ResolvedReference, AttributeMap) => Seq[AttributeMap]
): Scope => Seq[Scope] = {
val index = delegates(refs, configurations, projectInherit, configInherit)
scope =>
indexedDelegates(resolve, index, rootProject, taskInherit, extraInherit)(scope)
indexedDelegates(resolve, index, rootProject, taskInherit)(scope)
}
@deprecated("Use variant without extraInherit", "1.1.1")
def indexedDelegates(
resolve: Reference => ResolvedReference,
index: DelegateIndex,
rootProject: URI => String,
taskInherit: AttributeKey[_] => Seq[AttributeKey[_]],
extraInherit: (ResolvedReference, AttributeMap) => Seq[AttributeMap]
)(rawScope: Scope): Seq[Scope] =
indexedDelegates(resolve, index, rootProject, taskInherit)(rawScope)
def indexedDelegates(
resolve: Reference => ResolvedReference,
index: DelegateIndex,
rootProject: URI => String,
taskInherit: AttributeKey[_] => Seq[AttributeKey[_]],
)(rawScope: Scope): Seq[Scope] = {
val scope = Scope.replaceThis(GlobalScope)(rawScope)

View File

@@ -324,6 +324,8 @@ object Scoped {
"0.13.2")
def task: SettingKey[Task[S]] = scopedSetting(scope, key)
def toSettingKey: SettingKey[Task[S]] = scopedSetting(scope, key)
def get(settings: Settings[Scope]): Option[Task[S]] = settings.get(scope, key)
def ? : Initialize[Task[Option[S]]] = Def.optional(scopedKey) {
@@ -336,6 +338,11 @@ object Scoped {
(this.? zipWith i)((x, y) => (x, y) map { case (a, b) => a getOrElse b })
}
/** Enriches `Initialize[Task[S]]` types.
*
* @param i the original `Initialize[Task[S]]` value to enrich
* @tparam S the type of the underlying value
*/
final class RichInitializeTask[S](i: Initialize[Task[S]]) extends RichInitTaskBase[S, Task] {
protected def onTask[T](f: Task[S] => Task[T]): Initialize[Task[T]] = i apply f
@@ -365,8 +372,14 @@ object Scoped {
}
}
/** Enriches `Initialize[InputTask[S]]` types.
*
* @param i the original `Initialize[InputTask[S]]` value to enrich
* @tparam S the type of the underlying value
*/
final class RichInitializeInputTask[S](i: Initialize[InputTask[S]])
extends RichInitTaskBase[S, InputTask] {
protected def onTask[T](f: Task[S] => Task[T]): Initialize[InputTask[T]] = i(_ mapTask f)
def dependsOn(tasks: AnyInitTask*): Initialize[InputTask[S]] = {
@@ -376,11 +389,18 @@ object Scoped {
}
}
/** Enriches `Initialize[R[S]]` types. Abstracts over the specific task-like type constructor.
*
* @tparam S the type of the underlying value
* @tparam R the task-like type constructor (either Task or InputTask)
*/
sealed abstract class RichInitTaskBase[S, R[_]] {
protected def onTask[T](f: Task[S] => Task[T]): Initialize[R[T]]
def flatMap[T](f: S => Task[T]): Initialize[R[T]] = flatMapR(f compose successM)
def map[T](f: S => T): Initialize[R[T]] = mapR(f compose successM)
def flatMap[T](f: S => Task[T]): Initialize[R[T]] =
onTask(_.result flatMap (f compose successM))
def map[T](f: S => T): Initialize[R[T]] = onTask(_.result map (f compose successM))
def andFinally(fin: => Unit): Initialize[R[S]] = onTask(_ andFinally fin)
def doFinally(t: Task[Unit]): Initialize[R[S]] = onTask(_ doFinally t)
@@ -393,22 +413,23 @@ object Scoped {
@deprecated(
"Use the `result` method to create a task that returns the full Result of this task. Then, call `flatMap` on the new task.",
"0.13.0")
def flatMapR[T](f: Result[S] => Task[T]): Initialize[R[T]] = onTask(_ flatMapR f)
def flatMapR[T](f: Result[S] => Task[T]): Initialize[R[T]] = onTask(_.result flatMap f)
@deprecated(
"Use the `result` method to create a task that returns the full Result of this task. Then, call `map` on the new task.",
"0.13.0")
def mapR[T](f: Result[S] => T): Initialize[R[T]] = onTask(_ mapR f)
def mapR[T](f: Result[S] => T): Initialize[R[T]] = onTask(_.result map f)
@deprecated(
"Use the `failure` method to create a task that returns Incomplete when this task fails and then call `flatMap` on the new task.",
"0.13.0")
def flatFailure[T](f: Incomplete => Task[T]): Initialize[R[T]] = flatMapR(f compose failM)
def flatFailure[T](f: Incomplete => Task[T]): Initialize[R[T]] =
onTask(_.result flatMap (f compose failM))
@deprecated(
"Use the `failure` method to create a task that returns Incomplete when this task fails and then call `map` on the new task.",
"0.13.0")
def mapFailure[T](f: Incomplete => T): Initialize[R[T]] = mapR(f compose failM)
def mapFailure[T](f: Incomplete => T): Initialize[R[T]] = onTask(_.result map (f compose failM))
}
type AnyInitTask = Initialize[Task[T]] forSome { type T }
@@ -565,7 +586,7 @@ object Scoped {
/** The sbt 0.10 style DSL was deprecated in 0.13.13, favouring the use of the '.value' macro.
*
* See http://www.scala-sbt.org/0.13/docs/Migrating-from-sbt-012x.html for how to migrate.
* See http://www.scala-sbt.org/1.x/docs/Migrating-from-sbt-013x.html#Migrating+from+sbt+0.12+style for how to migrate.
*/
trait TupleSyntax {
import Scoped._
@@ -628,7 +649,7 @@ object InputKey {
apply(AttributeKey[InputTask[T]](label, description, extendScoped(extend1, extendN), rank))
def apply[T](akey: AttributeKey[InputTask[T]]): InputKey[T] =
new InputKey[T] { val key = akey; def scope = Scope.ThisScope }
Scoped.scopedInput(Scope.ThisScope, akey)
}
/** Constructs TaskKeys, which are associated with tasks to define a setting.*/
@@ -657,8 +678,7 @@ object TaskKey {
): TaskKey[T] =
apply(AttributeKey[Task[T]](label, description, extendScoped(extend1, extendN), rank))
def apply[T](akey: AttributeKey[Task[T]]): TaskKey[T] =
new TaskKey[T] { val key = akey; def scope = Scope.ThisScope }
def apply[T](akey: AttributeKey[Task[T]]): TaskKey[T] = Scoped.scopedTask(Scope.ThisScope, akey)
def local[T: Manifest]: TaskKey[T] = apply[T](AttributeKey.local[Task[T]])
}
@@ -689,8 +709,7 @@ object SettingKey {
): SettingKey[T] =
apply(AttributeKey[T](label, description, extendScoped(extend1, extendN), rank))
def apply[T](akey: AttributeKey[T]): SettingKey[T] =
new SettingKey[T] { val key = akey; def scope = Scope.ThisScope }
def apply[T](akey: AttributeKey[T]): SettingKey[T] = Scoped.scopedSetting(Scope.ThisScope, akey)
def local[T: Manifest: OptJsonWriter]: SettingKey[T] = apply[T](AttributeKey.local[T])
}

View File

@@ -8,11 +8,11 @@
package sbt
package std
import reflect.macros._
import scala.reflect.macros._
import Def.Initialize
import sbt.internal.util.complete.Parser
import sbt.internal.util.appmacro.{ Convert, Converted }
import Def.Initialize
object InputInitConvert extends Convert {
def apply[T: c.WeakTypeTag](c: blackbox.Context)(nme: String, in: c.Tree): Converted[c.type] =
@@ -46,14 +46,13 @@ object TaskConvert {
/** Converts an input `Tree` of type `Initialize[T]`, `Initialize[Task[T]]`, or `Task[T]` into a `Tree` of type `Initialize[Task[T]]`.*/
object FullConvert extends Convert {
import InputWrapper._
def apply[T: c.WeakTypeTag](c: blackbox.Context)(nme: String, in: c.Tree): Converted[c.type] =
nme match {
case WrapInitTaskName => Converted.Success[c.type](in)
case WrapPreviousName => Converted.Success[c.type](in)
case WrapInitName => wrapInit[T](c)(in)
case WrapTaskName => wrapTask[T](c)(in)
case _ => Converted.NotApplicable[c.type]
case InputWrapper.WrapInitTaskName => Converted.Success[c.type](in)
case InputWrapper.WrapPreviousName => Converted.Success[c.type](in)
case InputWrapper.WrapInitName => wrapInit[T](c)(in)
case InputWrapper.WrapTaskName => wrapTask[T](c)(in)
case _ => Converted.NotApplicable[c.type]
}
private def wrapInit[T: c.WeakTypeTag](c: blackbox.Context)(tree: c.Tree): Converted[c.type] = {

View File

@@ -8,9 +8,10 @@
package sbt
package std
import language.experimental.macros
import reflect.macros._
import reflect.internal.annotations.compileTimeOnly
import scala.language.experimental.macros
import scala.annotation.compileTimeOnly
import scala.reflect.macros._
import Def.Initialize
import sbt.internal.util.appmacro.ContextUtil
@@ -31,27 +32,27 @@ object InputWrapper {
@compileTimeOnly(
"`value` can only be called on a task within a task definition macro, such as :=, +=, ++=, or Def.task.")
def wrapTask_\u2603\u2603[T](in: Any): T = implDetailError
def wrapTask_\u2603\u2603[T](@deprecated("unused", "") in: Any): T = implDetailError
@compileTimeOnly(
"`value` can only be used within a task or setting macro, such as :=, +=, ++=, Def.task, or Def.setting.")
def wrapInit_\u2603\u2603[T](in: Any): T = implDetailError
def wrapInit_\u2603\u2603[T](@deprecated("unused", "") in: Any): T = implDetailError
@compileTimeOnly(
"`value` can only be called on a task within a task definition macro, such as :=, +=, ++=, or Def.task.")
def wrapInitTask_\u2603\u2603[T](in: Any): T = implDetailError
def wrapInitTask_\u2603\u2603[T](@deprecated("unused", "") in: Any): T = implDetailError
@compileTimeOnly(
"`value` can only be called on an input task within a task definition macro, such as := or Def.inputTask.")
def wrapInputTask_\u2603\u2603[T](in: Any): T = implDetailError
def wrapInputTask_\u2603\u2603[T](@deprecated("unused", "") in: Any): T = implDetailError
@compileTimeOnly(
"`value` can only be called on an input task within a task definition macro, such as := or Def.inputTask.")
def wrapInitInputTask_\u2603\u2603[T](in: Any): T = implDetailError
def wrapInitInputTask_\u2603\u2603[T](@deprecated("unused", "") in: Any): T = implDetailError
@compileTimeOnly(
"`previous` can only be called on a task within a task or input task definition macro, such as :=, +=, ++=, Def.task, or Def.inputTask.")
def wrapPrevious_\u2603\u2603[T](in: Any): T = implDetailError
def wrapPrevious_\u2603\u2603[T](@deprecated("unused", "") in: Any): T = implDetailError
private[this] def implDetailError =
sys.error("This method is an implementation detail and should not be referenced.")
@@ -164,7 +165,7 @@ object InputWrapper {
format: c.Expr[sjsonnew.JsonFormat[T]]): c.Expr[Option[T]] = {
import c.universe._
c.macroApplication match {
case a @ Apply(Select(Apply(_, t :: Nil), tp), fmt) =>
case a @ Apply(Select(Apply(_, t :: Nil), _), _) =>
if (t.tpe <:< c.weakTypeOf[TaskKey[T]]) {
val tsTyped = c.Expr[TaskKey[T]](t)
val newTree = c.universe.reify { Previous.runtime[T](tsTyped.splice)(format.splice) }
@@ -224,12 +225,12 @@ object ParserInput {
@compileTimeOnly(
"`parsed` can only be used within an input task macro, such as := or Def.inputTask.")
def parser_\u2603\u2603[T](i: Any): T =
def parser_\u2603\u2603[T](@deprecated("unused", "") i: Any): T =
sys.error("This method is an implementation detail and should not be referenced.")
@compileTimeOnly(
"`parsed` can only be used within an input task macro, such as := or Def.inputTask.")
def initParser_\u2603\u2603[T](i: Any): T =
def initParser_\u2603\u2603[T](@deprecated("unused", "") i: Any): T =
sys.error("This method is an implementation detail and should not be referenced.")
private[std] def wrap[T: c.WeakTypeTag](c: blackbox.Context)(ts: c.Expr[Any],

View File

@@ -61,10 +61,10 @@ private[sbt] object KeyMacro {
n.decodedName.toString.trim // trim is not strictly correct, but macros don't expose the API necessary
@tailrec def enclosingVal(trees: List[c.Tree]): String = {
trees match {
case vd @ ValDef(_, name, _, _) :: ts => processName(name)
case ValDef(_, name, _, _) :: _ => processName(name)
case (_: ApplyTree | _: Select | _: TypeApply) :: xs => enclosingVal(xs)
// lazy val x: X = <methodName> has this form for some reason (only when the explicit type is present, though)
case Block(_, _) :: DefDef(mods, name, _, _, _, _) :: xs if mods.hasFlag(Flag.LAZY) =>
case Block(_, _) :: DefDef(mods, name, _, _, _, _) :: _ if mods.hasFlag(Flag.LAZY) =>
processName(name)
case _ =>
c.error(c.enclosingPosition, invalidEnclosingTree(methodName.decodedName.toString))

View File

@@ -48,6 +48,7 @@ abstract class BaseTaskLinterDSL extends LinterDSL {
case _ => exprAtUseSite
}
uncheckedWrappers.add(removedSbtWrapper)
()
}
case _ =>
}
@@ -73,7 +74,7 @@ abstract class BaseTaskLinterDSL extends LinterDSL {
val (qualName, isSettingKey) =
Option(qual.symbol)
.map(sym => (sym.name.decodedName.toString, qual.tpe <:< typeOf[SettingKey[_]]))
.getOrElse((ap.pos.lineContent, false))
.getOrElse((ap.pos.source.lineToString(ap.pos.line - 1), false))
if (!isSettingKey && !shouldIgnore && isTask(wrapperName, tpe.tpe, qual)) {
if (insideIf && !isDynamicTask) {

View File

@@ -89,12 +89,12 @@ object TaskMacro {
final val InputTaskCreateDynName = "createDyn"
final val InputTaskCreateFreeName = "createFree"
final val append1Migration =
"`<+=` operator is removed. Try `lhs += { x.value }`\n or see http://www.scala-sbt.org/1.0/docs/Migrating-from-sbt-012x.html."
"`<+=` operator is removed. Try `lhs += { x.value }`\n or see http://www.scala-sbt.org/1.x/docs/Migrating-from-sbt-013x.html#Migrating+from+sbt+0.12+style."
final val appendNMigration =
"`<++=` operator is removed. Try `lhs ++= { x.value }`\n or see http://www.scala-sbt.org/1.0/docs/Migrating-from-sbt-012x.html."
"`<++=` operator is removed. Try `lhs ++= { x.value }`\n or see http://www.scala-sbt.org/1.x/docs/Migrating-from-sbt-013x.html#Migrating+from+sbt+0.12+style."
final val assignMigration =
"""`<<=` operator is removed. Use `key := { x.value }` or `key ~= (old => { newValue })`.
|See http://www.scala-sbt.org/1.0/docs/Migrating-from-sbt-012x.html""".stripMargin
|See http://www.scala-sbt.org/1.x/docs/Migrating-from-sbt-013x.html#Migrating+from+sbt+0.12+style""".stripMargin
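The three migration messages above all point at the same rewrite. A hypothetical build.sbt fragment (the `greeting`/`greetings` keys are placeholders; `name` is sbt's standard setting) showing what they ask for:

```scala
// Hypothetical keys, for illustration only:
val greeting  = settingKey[String]("a greeting")
val greetings = settingKey[Seq[String]]("collected greetings")

// sbt 0.12 style, now rejected with the messages above:
//   greeting  <<= name(n => s"hello, $n")
//   greetings <+= greeting

// sbt 1.x style, using the `.value` macro:
greeting := s"hello, ${name.value}"
greetings += greeting.value
```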
import LinterDSL.{ Empty => EmptyLinter }
@@ -130,37 +130,41 @@ object TaskMacro {
// These macros are there just so we can fail old operators like `<<=` and provide useful migration information.
def fakeSettingAssignPosition[T: c.WeakTypeTag](c: blackbox.Context)(
app: c.Expr[Initialize[T]]): c.Expr[Setting[T]] =
ContextUtil.selectMacroImpl[Setting[T]](c) { (ts, pos) =>
c.abort(pos, assignMigration)
}
def fakeSettingAppend1Position[S: c.WeakTypeTag, V: c.WeakTypeTag](c: blackbox.Context)(
v: c.Expr[Initialize[V]])(a: c.Expr[Append.Value[S, V]]): c.Expr[Setting[S]] =
ContextUtil.selectMacroImpl[Setting[S]](c) { (ts, pos) =>
c.abort(pos, append1Migration)
}
def fakeSettingAppendNPosition[S: c.WeakTypeTag, V: c.WeakTypeTag](c: blackbox.Context)(
vs: c.Expr[Initialize[V]])(a: c.Expr[Append.Values[S, V]]): c.Expr[Setting[S]] =
ContextUtil.selectMacroImpl[Setting[S]](c) { (ts, pos) =>
c.abort(pos, appendNMigration)
}
def fakeItaskAssignPosition[T: c.WeakTypeTag](c: blackbox.Context)(
app: c.Expr[Initialize[Task[T]]]): c.Expr[Setting[Task[T]]] =
ContextUtil.selectMacroImpl[Setting[Task[T]]](c) { (ts, pos) =>
c.abort(pos, assignMigration)
}
def fakeTaskAppend1Position[S: c.WeakTypeTag, V: c.WeakTypeTag](c: blackbox.Context)(
v: c.Expr[Initialize[Task[V]]])(a: c.Expr[Append.Value[S, V]]): c.Expr[Setting[Task[S]]] =
ContextUtil.selectMacroImpl[Setting[Task[S]]](c) { (ts, pos) =>
c.abort(pos, append1Migration)
}
def fakeTaskAppendNPosition[S: c.WeakTypeTag, V: c.WeakTypeTag](c: blackbox.Context)(
vs: c.Expr[Initialize[Task[V]]])(a: c.Expr[Append.Values[S, V]]): c.Expr[Setting[Task[S]]] =
ContextUtil.selectMacroImpl[Setting[Task[S]]](c) { (ts, pos) =>
c.abort(pos, appendNMigration)
}
@deprecated("unused", "") app: c.Expr[Initialize[T]]
): c.Expr[Setting[T]] =
ContextUtil.selectMacroImpl[Setting[T]](c)((_, pos) => c.abort(pos, assignMigration))
/* Implementations of <<= macro variations for tasks and settings. These just get the source position of the call site.*/
def fakeSettingAppend1Position[S: c.WeakTypeTag, V: c.WeakTypeTag](c: blackbox.Context)(
@deprecated("unused", "") v: c.Expr[Initialize[V]])(
@deprecated("unused", "") a: c.Expr[Append.Value[S, V]]
): c.Expr[Setting[S]] =
ContextUtil.selectMacroImpl[Setting[S]](c)((_, pos) => c.abort(pos, append1Migration))
def fakeSettingAppendNPosition[S: c.WeakTypeTag, V: c.WeakTypeTag](c: blackbox.Context)(
@deprecated("unused", "") vs: c.Expr[Initialize[V]])(
@deprecated("unused", "") a: c.Expr[Append.Values[S, V]]
): c.Expr[Setting[S]] =
ContextUtil.selectMacroImpl[Setting[S]](c)((_, pos) => c.abort(pos, appendNMigration))
def fakeItaskAssignPosition[T: c.WeakTypeTag](c: blackbox.Context)(
@deprecated("unused", "") app: c.Expr[Initialize[Task[T]]]
): c.Expr[Setting[Task[T]]] =
ContextUtil.selectMacroImpl[Setting[Task[T]]](c)((_, pos) => c.abort(pos, assignMigration))
def fakeTaskAppend1Position[S: c.WeakTypeTag, V: c.WeakTypeTag](c: blackbox.Context)(
@deprecated("unused", "") v: c.Expr[Initialize[Task[V]]])(
@deprecated("unused", "") a: c.Expr[Append.Value[S, V]]
): c.Expr[Setting[Task[S]]] =
ContextUtil.selectMacroImpl[Setting[Task[S]]](c)((_, pos) => c.abort(pos, append1Migration))
def fakeTaskAppendNPosition[S: c.WeakTypeTag, V: c.WeakTypeTag](c: blackbox.Context)(
@deprecated("unused", "") vs: c.Expr[Initialize[Task[V]]])(
@deprecated("unused", "") a: c.Expr[Append.Values[S, V]]
): c.Expr[Setting[Task[S]]] =
ContextUtil.selectMacroImpl[Setting[Task[S]]](c)((_, pos) => c.abort(pos, appendNMigration))
// Implementations of <<= macro variations for tasks and settings.
// These just get the source position of the call site.
def itaskAssignPosition[T: c.WeakTypeTag](c: blackbox.Context)(
app: c.Expr[Initialize[Task[T]]]): c.Expr[Setting[Task[T]]] =
@@ -221,7 +225,7 @@ object TaskMacro {
if typeArgs.nonEmpty && (typeArgs.head weak_<:< c.weakTypeOf[Task[_]])
&& (tpe weak_<:< c.weakTypeOf[Initialize[_]]) =>
c.macroApplication match {
case Apply(Apply(TypeApply(Select(preT, nmeT), targs), _), _) =>
case Apply(Apply(TypeApply(Select(preT, _), _), _), _) =>
val tree = Apply(
TypeApply(Select(preT, TermName("+=").encodedName), TypeTree(typeArgs.head) :: Nil),
Select(v.tree, TermName("taskValue").encodedName) :: Nil)
@@ -287,10 +291,14 @@ object TaskMacro {
newName: String): c.Tree = {
import c.universe._
c.macroApplication match {
case Apply(Apply(TypeApply(Select(preT, nmeT), targs), _), _) =>
Apply(Apply(TypeApply(Select(preT, TermName(newName).encodedName), targs),
init :: sourcePosition(c).tree :: Nil),
append :: Nil)
case Apply(Apply(TypeApply(Select(preT, _), targs), _), _) =>
Apply(
Apply(
TypeApply(Select(preT, TermName(newName).encodedName), targs),
init :: sourcePosition(c).tree :: Nil
),
append :: Nil
)
case x => ContextUtil.unexpectedTree(x)
}
}
@@ -299,10 +307,14 @@ object TaskMacro {
newName: String): c.Tree = {
import c.universe._
c.macroApplication match {
case Apply(Apply(TypeApply(Select(preT, nmeT), targs), _), r) =>
Apply(Apply(TypeApply(Select(preT, TermName(newName).encodedName), targs),
init :: sourcePosition(c).tree :: Nil),
r)
case Apply(Apply(TypeApply(Select(preT, _), targs), _), _) =>
Apply(
Apply(
TypeApply(Select(preT, TermName(newName).encodedName), targs),
init :: sourcePosition(c).tree :: Nil
),
remove :: Nil
)
case x => ContextUtil.unexpectedTree(x)
}
}

View File

@@ -13,7 +13,16 @@ import java.io.File
import sbt.io.IO
import sbt.SlashSyntax
import sbt.{ Scope, ScopeAxis, Scoped, Select, This, Zero }, Scope.{ Global, ThisScope }
import sbt.{ BuildRef, LocalProject, LocalRootProject, ProjectRef, Reference, RootProject, ThisBuild, ThisProject }
import sbt.{
BuildRef,
LocalProject,
LocalRootProject,
ProjectRef,
Reference,
RootProject,
ThisBuild,
ThisProject
}
import sbt.ConfigKey
import sbt.librarymanagement.syntax._
import sbt.{ InputKey, SettingKey, TaskKey }
@@ -34,13 +43,13 @@ object BuildDSLInstances {
implicit val arbReference: Arbitrary[Reference] = Arbitrary {
Gen.frequency(
1 -> arbitrary[BuildRef], // 96
100 -> ThisBuild, // 10,271
3 -> LocalRootProject, // 325
23 -> arbitrary[ProjectRef], // 2,283
3 -> ThisProject, // 299
4 -> arbitrary[LocalProject], // 436
11 -> arbitrary[RootProject], // 1,133
96 -> arbitrary[BuildRef],
10271 -> ThisBuild,
325 -> LocalRootProject,
2283 -> arbitrary[ProjectRef],
299 -> ThisProject,
436 -> arbitrary[LocalProject],
1133 -> arbitrary[RootProject],
)
}
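The change above replaces the rounded weights (1, 100, 3, ...) with the raw observed counts they were derived from (96, 10271, 325, ...). A short sketch, assuming ScalaCheck on the classpath, of why both behave alike: `Gen.frequency` weights are relative, so any common rescaling yields the same distribution.

```scala
import org.scalacheck.Gen

// Weights are proportions, not absolutes: with these raw counts,
// "ThisBuild" is drawn with probability 10271 / (96 + 10271 + 325 + 2283),
// roughly 79% of samples, exactly as 1:107:3:24-style ratios would give.
val refLike: Gen[String] = Gen.frequency(
  96    -> Gen.const("BuildRef"),
  10271 -> Gen.const("ThisBuild"),
  325   -> Gen.const("LocalRootProject"),
  2283  -> Gen.const("ProjectRef")
)
```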
@@ -57,52 +66,22 @@ object BuildDSLInstances {
implicit def arbAttrKey[A: Manifest]: Arbitrary[AttributeKey[_]] =
Arbitrary(Gen.identifier map (AttributeKey[A](_)))
def withScope[K <: Scoped.ScopingSetting[K]](keyGen: Gen[K]): Arbitrary[K] =
Arbitrary(Gen.frequency(
5 -> keyGen,
1 -> (for (key <- keyGen; scope <- arbitrary[Scope]) yield key in scope)
))
def genInputKey[A: Manifest]: Gen[InputKey[A]] = Gen.identifier map (InputKey[A](_))
def genSettingKey[A: Manifest]: Gen[SettingKey[A]] = Gen.identifier map (SettingKey[A](_))
def genTaskKey[A: Manifest]: Gen[TaskKey[A]] = Gen.identifier map (TaskKey[A](_))
implicit def arbInputKey[A: Manifest]: Arbitrary[InputKey[A]] = withScope(genInputKey[A])
implicit def arbSettingKey[A: Manifest]: Arbitrary[SettingKey[A]] = withScope(genSettingKey[A])
implicit def arbTaskKey[A: Manifest]: Arbitrary[TaskKey[A]] = withScope(genTaskKey[A])
implicit def arbScoped[A: Manifest](implicit
arbInputKey: Arbitrary[InputKey[A]],
arbSettingKey: Arbitrary[SettingKey[A]],
arbTaskKey: Arbitrary[TaskKey[A]],
): Arbitrary[Scoped] = {
Arbitrary(Gen.frequency(
15 -> arbitrary[InputKey[A]], // 15,431
20 -> arbitrary[SettingKey[A]], // 19,645
23 -> arbitrary[TaskKey[A]], // 22,867
))
}
object WithoutScope {
implicit def arbInputKey[A: Manifest]: Arbitrary[InputKey[A]] = Arbitrary(genInputKey[A])
implicit def arbSettingKey[A: Manifest]: Arbitrary[SettingKey[A]] = Arbitrary(genSettingKey[A])
implicit def arbTaskKey[A: Manifest]: Arbitrary[TaskKey[A]] = Arbitrary(genTaskKey[A])
implicit val arbAttributeMap: Arbitrary[AttributeMap] = Arbitrary {
Gen.frequency(
20 -> AttributeMap.empty,
1 -> {
for (name <- Gen.identifier; isModule <- arbitrary[Boolean])
yield
AttributeMap.empty
.put(AttributeKey[String]("name"), name)
.put(AttributeKey[Boolean]("isModule"), isModule)
}
)
}
implicit def arbScopeAxis[A: Arbitrary]: Arbitrary[ScopeAxis[A]] =
Arbitrary(Gen.oneOf[ScopeAxis[A]](This, Zero, arbitrary[A] map (Select(_))))
implicit val arbAttributeMap: Arbitrary[AttributeMap] = Arbitrary {
Gen.frequency(
20 -> AttributeMap.empty,
1 -> (for (name <- Gen.identifier; isModule <- arbitrary[Boolean])
yield AttributeMap.empty
.put(AttributeKey[String]("name"), name)
.put(AttributeKey[Boolean]("isModule"), isModule)
)
)
}
implicit def arbScope: Arbitrary[Scope] = Arbitrary(
for {
r <- arbitrary[ScopeAxis[Reference]]
@@ -111,186 +90,134 @@ object BuildDSLInstances {
e <- arbitrary[ScopeAxis[AttributeMap]]
} yield Scope(r, c, t, e)
)
type Key = K forSome { type K <: Scoped.ScopingSetting[K] with Scoped }
def genInputKey[A: Manifest]: Gen[InputKey[A]] = Gen.identifier map (InputKey[A](_))
def genSettingKey[A: Manifest]: Gen[SettingKey[A]] = Gen.identifier map (SettingKey[A](_))
def genTaskKey[A: Manifest]: Gen[TaskKey[A]] = Gen.identifier map (TaskKey[A](_))
def withScope[K <: Scoped.ScopingSetting[K]](keyGen: Gen[K]): Arbitrary[K] = Arbitrary {
Gen.frequency(
5 -> keyGen,
1 -> (for (key <- keyGen; scope <- arbitrary[Scope]) yield key in scope)
)
}
implicit def arbInputKey[A: Manifest]: Arbitrary[InputKey[A]] = withScope(genInputKey[A])
implicit def arbSettingKey[A: Manifest]: Arbitrary[SettingKey[A]] = withScope(genSettingKey[A])
implicit def arbTaskKey[A: Manifest]: Arbitrary[TaskKey[A]] = withScope(genTaskKey[A])
implicit def arbKey[A: Manifest](
implicit
arbInputKey: Arbitrary[InputKey[A]],
arbSettingKey: Arbitrary[SettingKey[A]],
arbTaskKey: Arbitrary[TaskKey[A]],
): Arbitrary[Key] = Arbitrary {
def convert[T](g: Gen[T]) = g.asInstanceOf[Gen[Key]]
Gen.frequency(
15431 -> convert(arbitrary[InputKey[A]]),
19645 -> convert(arbitrary[SettingKey[A]]),
22867 -> convert(arbitrary[TaskKey[A]]),
)
}
object WithoutScope {
implicit def arbInputKey[A: Manifest]: Arbitrary[InputKey[A]] = Arbitrary(genInputKey[A])
implicit def arbSettingKey[A: Manifest]: Arbitrary[SettingKey[A]] = Arbitrary(genSettingKey[A])
implicit def arbTaskKey[A: Manifest]: Arbitrary[TaskKey[A]] = Arbitrary(genTaskKey[A])
}
implicit def arbScoped[A: Manifest]: Arbitrary[Scoped] = Arbitrary(arbitrary[Key])
}
import BuildDSLInstances._
object CustomEquality {
trait Eq[A] {
def equal(x: A, y: A): Boolean
}
// Avoid reimplementing equality for other standard classes.
trait EqualLowPriority {
implicit def universal[A] = (x: A, y: A) => x == y
}
object Eq extends EqualLowPriority {
def apply[A: Eq]: Eq[A] = implicitly
implicit def eqScoped[A <: Scoped]: Eq[A] = (x, y) => x.scope == y.scope && x.key == y.key
}
implicit class AnyWith_===[A](private val x: A) extends AnyVal {
def ===(y: A)(implicit z: Eq[A]): Boolean = z.equal(x, y)
def =?(y: A)(implicit z: Eq[A]): Prop = {
if (x === y) proved else falsified :| s"Expected $x but got $y"
}
}
def expectValue[A: Eq](expected: A)(x: A) = expected =? x
}
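The `CustomEquality` pattern above can be condensed into a self-contained sketch (names are placeholders, not the test's actual code): a typeclass with a low-priority fallback delegating to standard `==`, plus `===` syntax layered on top.

```scala
object EqSketch {
  trait Eq[A] { def equal(x: A, y: A): Boolean }

  trait EqualLowPriority {
    // Fallback instance, chosen only when no more specific Eq[A] is in scope.
    implicit def universal[A]: Eq[A] = (x: A, y: A) => x == y
  }

  object Eq extends EqualLowPriority {
    def apply[A: Eq]: Eq[A] = implicitly
  }

  // Value-class syntax wrapper: `x === y` dispatches through the Eq instance.
  implicit class AnyWith_===[A](private val x: A) extends AnyVal {
    def ===(y: A)(implicit z: Eq[A]): Boolean = z.equal(x, y)
  }
}
```

With `import EqSketch._`, an expression like `1 === 1` resolves `Eq.universal` and delegates to `==`; a more specific implicit (such as the diff's `eqScoped`, which compares scope and key) takes precedence over the fallback.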
import CustomEquality._
object SlashSyntaxSpec extends Properties("SlashSyntax") with SlashSyntax {
type Key[K] = Scoped.ScopingSetting[K] with Scoped
property("Global / key == key in Global") = {
def check[K <: Key[K]: Arbitrary] = forAll((k: K) => expectValue(k in Global)(Global / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll((k: Key) => expectValue(k in Global)(Global / k))
}
property("Reference / key == key in Reference") = {
def check[K <: Key[K]: Arbitrary] = forAll((r: Reference, k: K) => expectValue(k in r)(r / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll((r: Reference, k: Key) => expectValue(k in r)(r / k))
}
property("Reference / Config / key == key in Reference in Config") = {
def check[K <: Key[K]: Arbitrary] =
forAll((r: Reference, c: ConfigKey, k: K) => expectValue(k in r in c)(r / c / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll((r: Reference, c: ConfigKey, k: Key) => expectValue(k in r in c)(r / c / k))
}
property("Reference / task.key / key == key in Reference in task") = {
def check[K <: Key[K]: Arbitrary] =
forAll((r: Reference, t: Scoped, k: K) => expectValue(k in (r, t))(r / t.key / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll((r: Reference, t: Scoped, k: Key) => expectValue(k in (r, t))(r / t.key / k))
}
property("Reference / task / key ~= key in Reference in task") = {
import WithoutScope._
def check[T <: Key[T]: Arbitrary, K <: Key[K]: Arbitrary] =
forAll((r: Reference, t: T, k: K) => expectValue(k in (r, t))(r / t / k))
(true
&& check[InputKey[String], InputKey[String]]
&& check[InputKey[String], SettingKey[String]]
&& check[InputKey[String], TaskKey[String]]
&& check[SettingKey[String], InputKey[String]]
&& check[SettingKey[String], SettingKey[String]]
&& check[SettingKey[String], TaskKey[String]]
&& check[TaskKey[String], InputKey[String]]
&& check[TaskKey[String], SettingKey[String]]
&& check[TaskKey[String], TaskKey[String]]
)
forAll((r: Reference, t: Key, k: Key) => expectValue(k in (r, t))(r / t / k))
}
property("Reference / Config / task.key / key == key in Reference in Config in task") = {
def check[K <: Key[K]: Arbitrary] =
forAll((r: Reference, c: ConfigKey, t: Scoped, k: K) =>
expectValue(k in (r, c, t))(r / c / t.key / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll { (r: Reference, c: ConfigKey, t: Scoped, k: Key) =>
expectValue(k in (r, c, t))(r / c / t.key / k)
}
}
property("Reference / Config / task / key ~= key in Reference in Config in task") = {
import WithoutScope._
def check[T <: Key[T]: Arbitrary, K <: Key[K]: Arbitrary] =
forAll((r: Reference, c: ConfigKey, t: T, k: K) => expectValue(k in (r, c, t))(r / c / t / k))
(true
&& check[InputKey[String], InputKey[String]]
&& check[InputKey[String], SettingKey[String]]
&& check[InputKey[String], TaskKey[String]]
&& check[SettingKey[String], InputKey[String]]
&& check[SettingKey[String], SettingKey[String]]
&& check[SettingKey[String], TaskKey[String]]
&& check[TaskKey[String], InputKey[String]]
&& check[TaskKey[String], SettingKey[String]]
&& check[TaskKey[String], TaskKey[String]]
)
forAll { (r: Reference, c: ConfigKey, t: Key, k: Key) =>
expectValue(k in (r, c, t))(r / c / t / k)
}
}
property("Config / key == key in Config") = {
def check[K <: Key[K]: Arbitrary] =
forAll((c: ConfigKey, k: K) => expectValue(k in c)(c / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll((c: ConfigKey, k: Key) => expectValue(k in c)(c / k))
}
property("Config / task.key / key == key in Config in task") = {
def check[K <: Key[K]: Arbitrary] =
forAll((c: ConfigKey, t: Scoped, k: K) => expectValue(k in c in t)(c / t.key / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll((c: ConfigKey, t: Scoped, k: Key) => expectValue(k in c in t)(c / t.key / k))
}
property("Config / task / key ~= key in Config in task") = {
import WithoutScope._
def check[T <: Key[T]: Arbitrary, K <: Key[K]: Arbitrary] =
forAll((c: ConfigKey, t: T, k: K) => expectValue(k in c in t)(c / t / k))
(true
&& check[InputKey[String], InputKey[String]]
&& check[InputKey[String], SettingKey[String]]
&& check[InputKey[String], TaskKey[String]]
&& check[SettingKey[String], InputKey[String]]
&& check[SettingKey[String], SettingKey[String]]
&& check[SettingKey[String], TaskKey[String]]
&& check[TaskKey[String], InputKey[String]]
&& check[TaskKey[String], SettingKey[String]]
&& check[TaskKey[String], TaskKey[String]]
)
forAll((c: ConfigKey, t: Key, k: Key) => expectValue(k in c in t)(c / t / k))
}
property("task.key / key == key in task") = {
def check[K <: Key[K]: Arbitrary] =
forAll((t: Scoped, k: K) => expectValue(k in t)(t.key / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll((t: Scoped, k: Key) => expectValue(k in t)(t.key / k))
}
property("task / key ~= key in task") = {
import WithoutScope._
def check[T <: Key[T]: Arbitrary, K <: Key[K]: Arbitrary] =
forAll((t: T, k: K) => expectValue(k in t)(t / k))
(true
&& check[InputKey[String], InputKey[String]]
&& check[InputKey[String], SettingKey[String]]
&& check[InputKey[String], TaskKey[String]]
&& check[SettingKey[String], InputKey[String]]
&& check[SettingKey[String], SettingKey[String]]
&& check[SettingKey[String], TaskKey[String]]
&& check[TaskKey[String], InputKey[String]]
&& check[TaskKey[String], SettingKey[String]]
&& check[TaskKey[String], TaskKey[String]]
)
forAll((t: Key, k: Key) => expectValue(k in t)(t / k))
}
property("Scope / key == key in Scope") = {
def check[K <: Key[K]: Arbitrary] = forAll((s: Scope, k: K) => expectValue(k in s)(s / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll((s: Scope, k: Key) => expectValue(k in s)(s / k))
}
property("Reference? / key == key in ThisScope.copy(..)") = {
def check[K <: Key[K]: Arbitrary] =
forAll((r: ScopeAxis[Reference], k: K) =>
expectValue(k in ThisScope.copy(project = r))(r / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll { (r: ScopeAxis[Reference], k: Key) =>
expectValue(k in ThisScope.copy(project = r))(r / k)
}
}
property("Reference? / ConfigKey? / key == key in ThisScope.copy(..)") = {
def check[K <: Key[K]: Arbitrary] =
forAll((r: ScopeAxis[Reference], c: ScopeAxis[ConfigKey], k: K) =>
expectValue(k in ThisScope.copy(project = r, config = c))(r / c / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll((r: ScopeAxis[Reference], c: ScopeAxis[ConfigKey], k: Key) =>
expectValue(k in ThisScope.copy(project = r, config = c))(r / c / k))
}
// property("Reference? / AttributeKey? / key == key in ThisScope.copy(..)") = {
// def check[K <: Key[K]: Arbitrary] =
// forAll(
// (r: ScopeAxis[Reference], t: ScopeAxis[AttributeKey[_]], k: K) =>
// expectValue(k in ThisScope.copy(project = r, task = t))(r / t / k))
// check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
// forAll((r: ScopeAxis[Reference], t: ScopeAxis[AttributeKey[_]], k: AnyKey) =>
// expectValue(k in ThisScope.copy(project = r, task = t))(r / t / k))
// }
property("Reference? / ConfigKey? / AttributeKey? / key == key in ThisScope.copy(..)") = {
def check[K <: Key[K]: Arbitrary] =
forAll(
(r: ScopeAxis[Reference], c: ScopeAxis[ConfigKey], t: ScopeAxis[AttributeKey[_]], k: K) =>
expectValue(k in ThisScope.copy(project = r, config = c, task = t))(r / c / t / k))
check[InputKey[String]] && check[SettingKey[String]] && check[TaskKey[String]]
forAll {
(r: ScopeAxis[Reference], c: ScopeAxis[ConfigKey], t: ScopeAxis[AttributeKey[_]], k: Key) =>
expectValue(k in ThisScope.copy(project = r, config = c, task = t))(r / c / t / k)
}
}
def expectValue(expected: Scoped)(x: Scoped) = {
val equals = x.scope == expected.scope && x.key == expected.key
if (equals) proved else falsified :| s"Expected $expected but got $x"
}
}
View File
@@ -10,12 +10,11 @@ package sbt.std
class TaskPosSpec {
// Dynamic tasks can have task invocations inside if branches
locally {
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = taskKey[String]("")
val bar = taskKey[String]("")
var condition = true
val baz = Def.taskDyn[String] {
val condition = true
Def.taskDyn[String] {
if (condition) foo
else bar
}
@@ -23,23 +22,21 @@ class TaskPosSpec
// Dynamic settings can have setting invocations inside if branches
locally {
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = settingKey[String]("")
val bar = settingKey[String]("")
var condition = true
val baz = Def.settingDyn[String] {
val condition = true
Def.settingDyn[String] {
if (condition) foo
else bar
}
}
locally {
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = taskKey[String]("")
var condition = true
val baz = Def.task[String] {
val condition = true
Def.task[String] {
val fooAnon = () => foo.value: @sbtUnchecked
if (condition) fooAnon()
else fooAnon()
@@ -47,11 +44,10 @@ class TaskPosSpec
}
locally {
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = taskKey[String]("")
var condition = true
val baz = Def.task[String] {
val condition = true
Def.task[String] {
val fooAnon = () => (foo.value: @sbtUnchecked) + ""
if (condition) fooAnon()
else fooAnon()
@@ -59,12 +55,11 @@ class TaskPosSpec
}
locally {
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = taskKey[String]("")
val bar = taskKey[String]("")
var condition = true
val baz = Def.task[String] {
val condition = true
Def.task[String] {
if (condition) foo.value: @sbtUnchecked
else bar.value: @sbtUnchecked
}
@@ -72,11 +67,10 @@ class TaskPosSpec
locally {
// This is fix 1 for appearance of tasks inside anons
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = taskKey[String]("")
var condition = true
val baz = Def.task[String] {
val condition = true
Def.task[String] {
val fooResult = foo.value
val anon = () => fooResult + " "
if (condition) anon()
@@ -86,11 +80,10 @@ class TaskPosSpec
locally {
// This is fix 2 for appearance of tasks inside anons
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = taskKey[String]("")
var condition = true
val baz = Def.taskDyn[String] {
val condition = true
Def.taskDyn[String] {
val anon1 = (value: String) => value + " "
if (condition) {
Def.task(anon1(foo.value))
@@ -100,31 +93,27 @@ class TaskPosSpec
locally {
// missing .value error should not happen inside task dyn
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = taskKey[String]("")
val baz = Def.taskDyn[String] {
Def.taskDyn[String] {
foo
}
}
locally {
// missing .value error should not happen inside task dyn
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = taskKey[String]("")
val avoidDCE = ""
val baz = Def.task[String] {
foo: @sbtUnchecked
Def.task[String] {
val _ = foo: @sbtUnchecked
avoidDCE
}
}
locally {
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = taskKey[String]("")
val baz = Def.task[String] {
Def.task[String] {
def inner(s: KeyedInitialize[_]) = println(s)
inner(foo)
""
@@ -133,11 +122,10 @@ class TaskPosSpec
locally {
// In theory, this should be reported, but missing .value analysis is dumb at the cost of speed
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = taskKey[String]("")
def avoidDCE = { println(""); "" }
val baz = Def.task[String] {
Def.task[String] {
val (_, _) = "" match {
case _ => (foo, 1 + 2)
}
@@ -146,15 +134,14 @@ class TaskPosSpec
}
locally {
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = taskKey[String]("")
def avoidDCE = { println(""); "" }
val baz = Def.task[String] {
def avoidDCE(x: TaskKey[String]) = x.toString
Def.task[String] {
val hehe = foo
// We do not detect `hehe` because guessing that the user did the wrong thing would require
// us to run the unused name traverser defined in Typer (and hence proxy it from context util)
avoidDCE
avoidDCE(hehe)
}
}
@@ -168,11 +155,10 @@ class TaskPosSpec
}
locally {
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = settingKey[String]("")
val condition = true
val baz = Def.task[String] {
Def.task[String] {
// settings can be evaluated in a condition
if (condition) foo.value
else "..."
@@ -180,10 +166,9 @@ class TaskPosSpec
}
locally {
import sbt._
import sbt.Def._
import sbt._, Def._
val foo = settingKey[String]("")
val baz = Def.task[Seq[String]] {
Def.task[Seq[String]] {
(1 to 10).map(_ => foo.value)
}
}
View File
@@ -7,11 +7,9 @@
package sbt.std
import scala.reflect._
import scala.tools.reflect.ToolBox
object TestUtil {
import tools.reflect.ToolBox
def eval(code: String, compileOptions: String = ""): Any = {
val tb = mkToolbox(compileOptions)
tb.eval(tb.parse(code))
View File
@@ -7,15 +7,19 @@
package sbt.std.neg
import scala.tools.reflect.ToolBoxError
import org.scalatest.FunSuite
import sbt.std.TaskLinterDSLFeedback
import sbt.std.TestUtil._
class TaskNegSpec extends FunSuite {
import tools.reflect.ToolBoxError
def expectError(errorSnippet: String,
compileOptions: String = "",
baseCompileOptions: String = s"-cp $toolboxClasspath")(code: String) = {
def expectError(
errorSnippet: String,
compileOptions: String = "",
baseCompileOptions: String = s"-cp $toolboxClasspath",
)(code: String) = {
val errorMessage = intercept[ToolBoxError] {
eval(code, s"$compileOptions $baseCompileOptions")
println(s"Test failed -- compilation was successful! Expected:\n$errorSnippet")
View File
@@ -11,7 +11,7 @@ import sbt.internal.DslEntry
import sbt.librarymanagement.Configuration
private[sbt] trait BuildSyntax {
import language.experimental.macros
import scala.language.experimental.macros
def settingKey[T](description: String): SettingKey[T] = macro std.KeyMacro.settingKeyImpl[T]
def taskKey[T](description: String): TaskKey[T] = macro std.KeyMacro.taskKeyImpl[T]
def inputKey[T](description: String): InputKey[T] = macro std.KeyMacro.inputKeyImpl[T]
View File
@@ -72,8 +72,7 @@ object Cross {
} & spacedFirst(CrossCommand)
}
private def crossRestoreSessionParser(state: State): Parser[String] =
token(CrossRestoreSessionCommand)
private def crossRestoreSessionParser: Parser[String] = token(CrossRestoreSessionCommand)
private[sbt] def requireSession[T](p: State => Parser[T]): State => Parser[T] =
s => if (s get sessionSettings isEmpty) failure("No project loaded") else p(s)
@@ -189,9 +188,10 @@ object Cross {
}
def crossRestoreSession: Command =
Command.arb(crossRestoreSessionParser, crossRestoreSessionHelp)(crossRestoreSessionImpl)
Command.arb(_ => crossRestoreSessionParser, crossRestoreSessionHelp)((s, _) =>
crossRestoreSessionImpl(s))
private def crossRestoreSessionImpl(state: State, arg: String): State = {
private def crossRestoreSessionImpl(state: State): State = {
restoreCapturedSession(state, Project.extract(state))
}
@@ -216,12 +216,30 @@ object Cross {
Command.arb(requireSession(switchParser), switchHelp)(switchCommandImpl)
private def switchCommandImpl(state: State, args: Switch): State = {
val switchedState = switchScalaVersion(args, state)
val x = Project.extract(state)
val (switchedState, affectedRefs) = switchScalaVersion(args, state)
args.command.toList ::: switchedState
val strictCmd =
if (args.version.force) {
// The Scala version was forced on the whole build, run as is
args.command
} else {
args.command.map { rawCmd =>
parseCommand(rawCmd) match {
case Right(_) => rawCmd // A project is specified, run as is
case Left(cmd) =>
resolveAggregates(x)
.intersect(affectedRefs)
.collect { case ProjectRef(_, proj) => s"$proj/$cmd" }
.mkString("all ", " ", "")
}
}
}
strictCmd.toList ::: switchedState
}
private def switchScalaVersion(switch: Switch, state: State): State = {
private def switchScalaVersion(switch: Switch, state: State): (State, Seq[ResolvedReference]) = {
val extracted = Project.extract(state)
import extracted._
@@ -291,7 +309,7 @@ object Cross {
}
}
setScalaVersionForProjects(version, instance, projects, state, extracted)
(setScalaVersionForProjects(version, instance, projects, state, extracted), projects.map(_._1))
}
private def setScalaVersionForProjects(
View File
@@ -26,7 +26,12 @@ import sbt.internal.librarymanagement.mavenint.{
PomExtraDependencyAttributes,
SbtPomExtraProperties
}
import sbt.internal.server.{ LanguageServerReporter, Definition }
import sbt.internal.server.{
LanguageServerReporter,
Definition,
LanguageServerProtocol,
ServerHandler
}
import sbt.internal.testing.TestLogger
import sbt.internal.util._
import sbt.internal.util.Attributed.data
@@ -278,6 +283,12 @@ object Defaults extends BuildCommon {
if (serverConnectionType.value == ConnectionType.Tcp) Set(ServerAuthentication.Token)
else Set()
},
serverHandlers :== Nil,
fullServerHandlers := {
(Vector(LanguageServerProtocol.handler)
++ serverHandlers.value
++ Vector(ServerHandler.fallback))
},
insideCI :== sys.env.contains("BUILD_NUMBER") || sys.env.contains("CI"),
))
@@ -501,7 +512,7 @@ object Defaults extends BuildCommon {
},
compileIncSetup := compileIncSetupTask.value,
console := consoleTask.value,
collectAnalyses := Definition.collectAnalysesTask.value,
collectAnalyses := Definition.collectAnalysesTask.map(_ => ()).value,
consoleQuick := consoleQuickTask.value,
discoveredMainClasses := (compile map discoverMainClasses storeAs discoveredMainClasses xtriggeredBy compile).value,
discoveredSbtPlugins := discoverSbtPluginNames.value,
@@ -531,6 +542,7 @@ object Defaults extends BuildCommon {
clean := (Def.task { IO.delete(cleanFiles.value) } tag (Tags.Clean)).value,
consoleProject := consoleProjectTask.value,
watchTransitiveSources := watchTransitiveSourcesTask.value,
watchingMessage := Watched.projectWatchingMessage(thisProjectRef.value.project),
watch := watchSetting.value
)
@@ -674,7 +686,6 @@ object Defaults extends BuildCommon {
(testGrouping in test).value,
(testExecution in test).value,
(fullClasspath in test).value,
(javaHome in test).value,
testForkedParallel.value,
(javaOptions in test).value
)
@@ -832,7 +843,6 @@ object Defaults extends BuildCommon {
testGrouping.value,
newConfig,
fullClasspath.value,
javaHome.value,
testForkedParallel.value,
javaOptions.value
)
@@ -859,20 +869,20 @@ object Defaults extends BuildCommon {
}
}
private[sbt] def allTestGroupsTask(s: TaskStreams,
frameworks: Map[TestFramework, Framework],
loader: ClassLoader,
groups: Seq[Tests.Group],
config: Tests.Execution,
cp: Classpath,
javaHome: Option[File]): Initialize[Task[Tests.Output]] = {
private[sbt] def allTestGroupsTask(
s: TaskStreams,
frameworks: Map[TestFramework, Framework],
loader: ClassLoader,
groups: Seq[Tests.Group],
config: Tests.Execution,
cp: Classpath,
): Initialize[Task[Tests.Output]] = {
allTestGroupsTask(s,
frameworks,
loader,
groups,
config,
cp,
javaHome,
forkedParallelExecution = false,
javaOptions = Nil)
}
@@ -884,7 +894,6 @@ object Defaults extends BuildCommon {
groups: Seq[Tests.Group],
config: Tests.Execution,
cp: Classpath,
javaHome: Option[File],
forkedParallelExecution: Boolean): Initialize[Task[Tests.Output]] = {
allTestGroupsTask(s,
frameworks,
@ -892,7 +901,6 @@ object Defaults extends BuildCommon {
groups,
config,
cp,
javaHome,
forkedParallelExecution,
javaOptions = Nil)
}
@@ -903,12 +911,11 @@ object Defaults extends BuildCommon {
groups: Seq[Tests.Group],
config: Tests.Execution,
cp: Classpath,
javaHome: Option[File],
forkedParallelExecution: Boolean,
javaOptions: Seq[String]): Initialize[Task[Tests.Output]] = {
val runners = createTestRunners(frameworks, loader, config)
val groupTasks = groups map {
case Tests.Group(name, tests, runPolicy) =>
case Tests.Group(_, tests, runPolicy) =>
runPolicy match {
case Tests.SubProcess(opts) =>
s.log.debug(s"javaOptions: ${opts.runJVMOptions}")
@@ -1382,7 +1389,7 @@ object Defaults extends BuildCommon {
private[this] def exported(w: PrintWriter, command: String): Seq[String] => Unit =
args => w.println((command +: args).mkString(" "))
private[this] def exported(s: TaskStreams, command: String): Seq[String] => Unit = args => {
private[this] def exported(s: TaskStreams, command: String): Seq[String] => Unit = {
val w = s.text(ExportStream)
try exported(w, command)
finally w.close() // workaround for #937
@@ -1538,7 +1545,7 @@ object Defaults extends BuildCommon {
val cacheStore = s.cacheStoreFactory make "copy-resources"
val mappings = (resources.value --- dirs) pair (rebase(dirs, t) | flat(t))
s.log.debug("Copy resource mappings: " + mappings.mkString("\n\t", "\n\t", ""))
Sync(cacheStore)(mappings)
Sync.sync(cacheStore)(mappings)
mappings
}
@@ -1611,7 +1618,11 @@ object Defaults extends BuildCommon {
val sv = (sbtVersion in pluginCrossBuild).value
val scalaV = (scalaVersion in pluginCrossBuild).value
val binVersion = (scalaBinaryVersion in pluginCrossBuild).value
val cross = if (id.crossVersioned) CrossVersion.binary else Disabled()
val cross = id.crossVersionedValue match {
case CrossValue.Disabled => Disabled()
case CrossValue.Full => CrossVersion.full
case CrossValue.Binary => CrossVersion.binary
}
val base = ModuleID(id.groupID, id.name, sv).withCrossVersion(cross)
CrossVersion(scalaV, binVersion)(base).withCrossVersion(Disabled())
}
@@ -1704,7 +1715,7 @@ object Classpaths {
}
def packaged(pkgTasks: Seq[TaskKey[File]]): Initialize[Task[Map[Artifact, File]]] =
enabledOnly(packagedArtifact.task, pkgTasks) apply (_.join.map(_.toMap))
enabledOnly(packagedArtifact.toSettingKey, pkgTasks) apply (_.join.map(_.toMap))
def artifactDefs(pkgTasks: Seq[TaskKey[File]]): Initialize[Seq[Artifact]] =
enabledOnly(artifact, pkgTasks)
@@ -1714,8 +1725,10 @@ object Classpaths {
case (a, true) => a
})
def forallIn[T](key: Scoped.ScopingSetting[SettingKey[T]],
pkgTasks: Seq[TaskKey[_]]): Initialize[Seq[T]] =
def forallIn[T](
key: Scoped.ScopingSetting[SettingKey[T]], // should be just SettingKey[T] (mea culpa)
pkgTasks: Seq[TaskKey[_]],
): Initialize[Seq[T]] =
pkgTasks.map(pkg => key in pkg.scope in pkg).join
private[this] def publishGlobalDefaults =
@@ -1745,15 +1758,16 @@ object Classpaths {
deliver := deliverTask(makeIvyXmlConfiguration).value,
deliverLocal := deliverTask(makeIvyXmlLocalConfiguration).value,
makeIvyXml := deliverTask(makeIvyXmlConfiguration).value,
publish := publishTask(publishConfiguration, deliver).value,
publishLocal := publishTask(publishLocalConfiguration, deliverLocal).value,
publishM2 := publishTask(publishM2Configuration, deliverLocal).value
publish := publishTask(publishConfiguration).value,
publishLocal := publishTask(publishLocalConfiguration).value,
publishM2 := publishTask(publishM2Configuration).value
)
private[this] def baseGlobalDefaults =
Defaults.globalDefaults(
Seq(
conflictWarning :== ConflictWarning.default("global"),
evictionWarningOptions := EvictionWarningOptions.default,
compatibilityWarningOptions :== CompatibilityWarningOptions.default,
homepage :== None,
startYear :== None,
@@ -1821,7 +1835,7 @@ object Classpaths {
appResolvers.value,
useJCenter.value) match {
case (Some(delegated), Seq(), _, _) => delegated
case (_, rs, Some(ars), uj) => ars ++ rs
case (_, rs, Some(ars), _) => ars ++ rs
case (_, rs, _, uj) => Resolver.combineDefaultResolvers(rs.toVector, uj, mavenCentral = true)
}),
appResolvers := {
@@ -1940,8 +1954,10 @@ object Classpaths {
if (isSnapshot.value) "integration" else "release",
ivyConfigurations.value.map(c => ConfigRef(c.name)).toVector,
packagedArtifacts.in(publish).value.toVector,
checksums.in(publish).value.toVector,
getPublishTo(publishTo.value).name,
checksums.in(publish).value.toVector, { //resolvername: not required if publishTo is false
val publishToOption = publishTo.value
if (publishArtifact.value) getPublishTo(publishToOption).name else "local"
},
ivyLoggingLevel.value,
isSnapshot.value
)
@@ -1987,7 +2003,6 @@ object Classpaths {
val suffix = if (crossPaths.value) s"_$binVersion" else ""
s"update_cache$suffix"
},
evictionWarningOptions in update := EvictionWarningOptions.default,
dependencyPositions := dependencyPositionsTask.value,
unresolvedWarningConfiguration in update := UnresolvedWarningConfiguration(
dependencyPositions.value),
@@ -1998,6 +2013,7 @@ object Classpaths {
ConflictWarning(conflictWarning.value, report, log)
report
},
evictionWarningOptions in update := evictionWarningOptions.value,
evictionWarningOptions in evicted := EvictionWarningOptions.full,
evicted := {
import ShowLines._
@@ -2032,7 +2048,6 @@ object Classpaths {
val docTypes = docArtifactTypes.value
val out = is.withIvy(s.log)(_.getSettings.getDefaultIvyUserDir)
val uwConfig = (unresolvedWarningConfiguration in update).value
val scalaModule = scalaModuleInfo.value
withExcludes(out, mod.classifiers, lock(app)) { excludes =>
lm.updateClassifiers(
GetClassifiersConfiguration(
@@ -2063,7 +2078,6 @@ object Classpaths {
// Override the default to handle mixing in the sbtPlugin + scala dependencies.
allDependencies := {
val base = projectDependencies.value ++ libraryDependencies.value
val dependency = sbtDependency.value
val isPlugin = sbtPlugin.value
val sbtdeps =
(sbtDependency in pluginCrossBuild).value.withConfigurations(Some(Provided.name))
@@ -2182,9 +2196,6 @@ object Classpaths {
val log = s.log
val out = is.withIvy(log)(_.getSettings.getDefaultIvyUserDir)
val uwConfig = (unresolvedWarningConfiguration in update).value
val depDir = dependencyCacheDirectory.value
val ivy = scalaModuleInfo.value
val st = state.value
withExcludes(out, mod.classifiers, lock(app)) {
excludes =>
// val noExplicitCheck = ivy.map(_.withCheckExplicit(false))
@@ -2201,7 +2212,7 @@ object Classpaths {
uwConfig,
log
) match {
case Left(uw) => ???
case Left(_) => ???
case Right(ur) => ur
}
}
@@ -2232,16 +2243,20 @@ object Classpaths {
IvyActions.deliver(ivyModule.value, config.value, streams.value.log)
}
def publishTask(config: TaskKey[PublishConfiguration],
deliverKey: TaskKey[_]): Initialize[Task[Unit]] =
@deprecated("Use variant without delivery key", "1.1.1")
def publishTask(
config: TaskKey[PublishConfiguration],
deliverKey: TaskKey[_],
): Initialize[Task[Unit]] =
publishTask(config)
def publishTask(config: TaskKey[PublishConfiguration]): Initialize[Task[Unit]] =
Def.taskDyn {
val s = streams.value
val skp = (skip in publish).value
val ref = thisProjectRef.value
if (skp) Def.task { s.log.debug(s"Skipping publish* for ${ref.project}") } else
Def.task {
IvyActions.publish(ivyModule.value, config.value, s.log)
}
Def.task { IvyActions.publish(ivyModule.value, config.value, s.log) }
} tag (Tags.Publish, Tags.Network)
val moduleIdJsonKeyFormat: sjsonnew.JsonKeyFormat[ModuleID] =
@@ -2408,7 +2423,7 @@ object Classpaths {
s.init.evaluate(empty) map { _ -> s.pos }
}: _*)
} catch {
case NonFatal(e) => Map()
case NonFatal(_) => Map()
}
val outCacheStore = cacheStoreFactory make "output_dsp"
@@ -2708,19 +2723,32 @@ object Classpaths {
visit(projectRef, conf)
visited.toSeq
}
def interSortConfigurations(
projectRef: ProjectRef,
conf: Configuration,
data: Settings[Scope],
deps: BuildDependencies
): Seq[(ProjectRef, ConfigRef)] =
interSort(projectRef, conf, data, deps).map {
case (projectRef, configName) => (projectRef, ConfigRef(configName))
}
private[sbt] def unmanagedDependencies0(projectRef: ProjectRef,
conf: Configuration,
data: Settings[Scope],
deps: BuildDependencies): Initialize[Task[Classpath]] =
Def.value {
interDependencies(projectRef,
deps,
conf,
conf,
data,
TrackLevel.TrackAlways,
true,
unmanagedLibs0)
interDependencies(
projectRef,
deps,
conf,
conf,
data,
TrackLevel.TrackAlways,
true,
(dep, conf, data, _) => unmanagedLibs(dep, conf, data),
)
}
private[sbt] def internalDependenciesImplTask(projectRef: ProjectRef,
conf: Configuration,
@@ -2825,20 +2853,19 @@ object Classpaths {
case TrackLevel.TrackIfMissing => getClasspath(exportedProductJarsIfMissing, dep, conf, data)
case TrackLevel.TrackAlways => getClasspath(exportedProductJars, dep, conf, data)
}
private[sbt] def unmanagedLibs0(dep: ResolvedReference,
conf: String,
data: Settings[Scope],
track: TrackLevel): Task[Classpath] =
unmanagedLibs(dep, conf, data)
def unmanagedLibs(dep: ResolvedReference, conf: String, data: Settings[Scope]): Task[Classpath] =
getClasspath(unmanagedJars, dep, conf, data)
def getClasspath(key: TaskKey[Classpath],
dep: ResolvedReference,
conf: String,
data: Settings[Scope]): Task[Classpath] =
(key in (dep, ConfigKey(conf))) get data getOrElse constant(Nil)
def defaultConfigurationTask(p: ResolvedReference, data: Settings[Scope]): Configuration =
flatten(defaultConfiguration in p get data) getOrElse Configurations.Default
def flatten[T](o: Option[Option[T]]): Option[T] = o flatMap idFun
val sbtIvySnapshots: URLRepository = Resolver.sbtIvyRepo("snapshots")
@@ -2871,7 +2898,7 @@ object Classpaths {
up.filter(configurationFilter(config.name) && artifactFilter(`type` = jarTypes))
.toSeq
.map {
case (conf, module, art, file) =>
case (_, module, art, file) =>
Attributed(file)(
AttributeMap.empty
.put(artifact.key, art)
@@ -3131,13 +3158,16 @@ trait BuildExtra extends BuildCommon with DefExtra {
file.value,
managedScalaInstance.value)
def externalPom(file: Initialize[File] = inBase("pom.xml"),
iScala: Initialize[Option[ScalaModuleInfo]] = scalaModuleInfo)
: Setting[Task[ModuleSettings]] =
moduleSettings := PomConfiguration(ivyValidate.value,
scalaModuleInfo.value,
file.value,
managedScalaInstance.value)
def externalPom(
file: Initialize[File] = inBase("pom.xml"),
iScala: Initialize[Option[ScalaModuleInfo]] = scalaModuleInfo,
): Setting[Task[ModuleSettings]] =
moduleSettings := PomConfiguration(
ivyValidate.value,
iScala.value,
file.value,
managedScalaInstance.value,
)
def runInputTask(config: Configuration,
mainClass: String,
@@ -3166,7 +3196,10 @@ trait BuildExtra extends BuildCommon with DefExtra {
config: Configuration,
mainClass: String,
baseArguments: String*): Vector[Setting[_]] = {
// Use Def.inputTask with the `Def.spaceDelimited()` parser
// TODO: Re-write to avoid InputTask.apply which is deprecated
// I tried "Def.spaceDelimited().parsed" (after importing Def.parserToInput)
// but it broke actions/run-task
// Maybe it needs to be defined inside a Def.inputTask?
def inputTask[T](f: TaskKey[Seq[String]] => Initialize[Task[T]]): Initialize[InputTask[T]] =
InputTask.apply(Def.value((s: State) => Def.spaceDelimited()))(f)
@@ -3221,7 +3254,7 @@ trait BuildExtra extends BuildCommon with DefExtra {
trait DefExtra {
private[this] val ts: TaskSequential = new TaskSequential {}
implicit def toTaskSequential(d: Def.type): TaskSequential = ts
implicit def toTaskSequential(@deprecated("unused", "") d: Def.type): TaskSequential = ts
}
trait BuildCommon {
@@ -3229,7 +3262,7 @@ trait BuildCommon {
/**
* Allows a String to be used where a `NameFilter` is expected.
* Asterisks (`*`) in the string are interpreted as wildcards.
* All other characters must match exactly. See [[sbt.GlobFilter]].
* All other characters must match exactly. See [[sbt.io.GlobFilter]].
*/
implicit def globFilter(expression: String): NameFilter = GlobFilter(expression)
View File
@@ -8,7 +8,7 @@
package sbt
import sbt.internal.{ Load, BuildStructure, TaskTimings, TaskName, GCUtil }
import sbt.internal.util.{ Attributed, ErrorHandling, HList, RMap, Signals, Types }
import sbt.internal.util.{ Attributed, ConsoleAppender, ErrorHandling, HList, RMap, Signals, Types }
import sbt.util.{ Logger, Show }
import sbt.librarymanagement.{ Resolver, UpdateReport }
@@ -247,7 +247,11 @@ object EvaluateTask {
(executionRoots in Global) ::= dummyRoots
)
def evalPluginDef(log: Logger)(pluginDef: BuildStructure, state: State): PluginData = {
@deprecated("Use variant which doesn't take a logger", "1.1.1")
def evalPluginDef(log: Logger)(pluginDef: BuildStructure, state: State): PluginData =
evalPluginDef(pluginDef, state)
def evalPluginDef(pluginDef: BuildStructure, state: State): PluginData = {
val root = ProjectRef(pluginDef.root, Load.getRootProject(pluginDef.units)(pluginDef.root))
val pluginKey = pluginData
val config = extractedTaskConfig(Project.extract(state), pluginDef, state)
@@ -256,7 +260,7 @@ object EvaluateTask {
val (newS, result) = evaluated getOrElse sys.error(
"Plugin data does not exist for plugin definition at " + pluginDef.root)
Project.runUnloadHooks(newS) // discard states
processResult(result, log)
processResult2(result)
}
/**
@@ -296,8 +300,8 @@ object EvaluateTask {
def logIncomplete(result: Incomplete, state: State, streams: Streams): Unit = {
val all = Incomplete linearize result
val keyed = for (Incomplete(Some(key: ScopedKey[_]), _, msg, _, ex) <- all)
yield (key, msg, ex)
val keyed =
all collect { case Incomplete(Some(key: ScopedKey[_]), _, msg, _, ex) => (key, msg, ex) }
import ExceptionCategory._
for ((key, msg, Some(ex)) <- keyed) {
@@ -312,7 +316,7 @@ object EvaluateTask {
for ((key, msg, ex) <- keyed if (msg.isDefined || ex.isDefined)) {
val msgString = (msg.toList ++ ex.toList.map(ErrorHandling.reducedToString)).mkString("\n\t")
val log = getStreams(key, streams).log
val display = contextDisplay(state, log.ansiCodesSupported)
val display = contextDisplay(state, ConsoleAppender.formatEnabledInEnv)
log.error("(" + display.show(key) + ") " + msgString)
}
}
@@ -433,12 +437,21 @@ object EvaluateTask {
case in @ Incomplete(Some(node: Task[_]), _, _, _, _) => in.copy(node = transformNode(node))
case i => i
}
type AnyCyclic = Execute[({ type A[_] <: AnyRef })#A]#CyclicException[_]
def convertCyclicInc: Incomplete => Incomplete = {
case in @ Incomplete(_, _, _, _, Some(c: AnyCyclic)) =>
case in @ Incomplete(
_,
_,
_,
_,
Some(c: Execute[({ type A[_] <: AnyRef })#A @unchecked]#CyclicException[_])
) =>
in.copy(directCause = Some(new RuntimeException(convertCyclic(c))))
case i => i
}
def convertCyclic(c: AnyCyclic): String =
(c.caller, c.target) match {
case (caller: Task[_], target: Task[_]) =>
@@ -448,7 +461,7 @@ object EvaluateTask {
}
def liftAnonymous: Incomplete => Incomplete = {
case i @ Incomplete(node, tpe, None, causes, None) =>
case i @ Incomplete(_, _, None, causes, None) =>
causes.find(inc => inc.node.isEmpty && (inc.message.isDefined || inc.directCause.isDefined)) match {
case Some(lift) => i.copy(directCause = lift.directCause, message = lift.message)
case None => i
@ -456,12 +469,19 @@ object EvaluateTask {
case i => i
}
@deprecated("Use processResult2 which doesn't take the unused log param", "1.1.1")
def processResult[T](result: Result[T], log: Logger, show: Boolean = false): T =
onResult(result, log) { v =>
processResult2(result, show)
def processResult2[T](result: Result[T], show: Boolean = false): T =
onResult(result) { v =>
if (show) println("Result: " + v); v
}
def onResult[T, S](result: Result[T], log: Logger)(f: T => S): S =
@deprecated("Use variant that doesn't take log", "1.1.1")
def onResult[T, S](result: Result[T], log: Logger)(f: T => S): S = onResult(result)(f)
def onResult[T, S](result: Result[T])(f: T => S): S =
result match {
case Value(v) => f(v)
case Inc(inc) => throw inc


@ -8,7 +8,6 @@
package sbt
import sbt.internal.{ Load, BuildStructure, Act, Aggregation, SessionSettings }
import Project._
import Scope.GlobalScope
import Def.{ ScopedKey, Setting }
import sbt.internal.util.complete.Parser
@ -43,7 +42,7 @@ final case class Extracted(structure: BuildStructure,
structure.data.get(inCurrent(key.scope), key.key)
private[this] def inCurrent[T](scope: Scope): Scope =
if (scope.project == This) scope.copy(project = Select(currentRef)) else scope
if (scope.project == This) scope in currentRef else scope
/**
* Runs the task specified by `key` and returns the transformed State and the resulting value of the task.
@ -54,12 +53,12 @@ final case class Extracted(structure: BuildStructure,
* See `runAggregated` for that.
*/
def runTask[T](key: TaskKey[T], state: State): (State, T) = {
val rkey = resolve(key.scopedKey)
val rkey = resolve(key)
val config = extractedTaskConfig(this, structure, state)
val value: Option[(State, Result[T])] =
EvaluateTask(structure, key.scopedKey, state, currentRef, config)
val (newS, result) = getOrError(rkey.scope, rkey.key, value)
(newS, EvaluateTask.processResult(result, newS.log))
(newS, EvaluateTask.processResult2(result))
}
/**
@ -72,22 +71,22 @@ final case class Extracted(structure: BuildStructure,
* This method requests execution of only the given task and does not aggregate execution.
*/
def runInputTask[T](key: InputKey[T], input: String, state: State): (State, T) = {
val scopedKey = ScopedKey(
val key2 = Scoped.scopedSetting(
Scope.resolveScope(Load.projectScope(currentRef), currentRef.build, rootProject)(key.scope),
key.key
)
val rkey = resolve(scopedKey)
val inputTask = get(Scoped.scopedSetting(rkey.scope, rkey.key))
val rkey = resolve(key2)
val inputTask = get(rkey)
val task = Parser.parse(input, inputTask.parser(state)) match {
case Right(t) => t
case Left(msg) => sys.error(s"Invalid programmatic input:\n$msg")
}
val config = extractedTaskConfig(this, structure, state)
EvaluateTask.withStreams(structure, state) { str =>
val nv = EvaluateTask.nodeView(state, str, rkey :: Nil)
val nv = EvaluateTask.nodeView(state, str, rkey.scopedKey :: Nil)
val (newS, result) =
EvaluateTask.runTask(task, state, str, structure.index.triggers, config)(nv)
(newS, EvaluateTask.processResult(result, newS.log))
(newS, EvaluateTask.processResult2(result))
}
}
@ -98,27 +97,29 @@ final case class Extracted(structure: BuildStructure,
* Other axes are resolved to `Zero` if unspecified.
*/
def runAggregated[T](key: TaskKey[T], state: State): State = {
val rkey = resolve(key.scopedKey)
val rkey = resolve(key)
val keys = Aggregation.aggregate(rkey, ScopeMask(), structure.extra)
val tasks = Act.keyValues(structure)(keys)
Aggregation.runTasks(state,
structure,
tasks,
DummyTaskMap(Nil),
show = Aggregation.defaultShow(state, false))(showKey)
Aggregation.runTasks(
state,
tasks,
DummyTaskMap(Nil),
show = Aggregation.defaultShow(state, false),
)(showKey)
}
private[this] def resolve[T](key: ScopedKey[T]): ScopedKey[T] =
Project.mapScope(Scope.resolveScope(GlobalScope, currentRef.build, rootProject))(key.scopedKey)
private[this] def resolve[K <: Scoped.ScopingSetting[K] with Scoped](key: K): K =
key in Scope.resolveScope(GlobalScope, currentRef.build, rootProject)(key.scope)
private def getOrError[T](scope: Scope, key: AttributeKey[_], value: Option[T])(
implicit display: Show[ScopedKey[_]]): T =
implicit display: Show[ScopedKey[_]]
): T =
value getOrElse sys.error(display.show(ScopedKey(scope, key)) + " is undefined.")
private def getOrError[T](scope: Scope, key: AttributeKey[T])(
implicit display: Show[ScopedKey[_]]): T =
structure.data.get(scope, key) getOrElse sys.error(
display.show(ScopedKey(scope, key)) + " is undefined.")
implicit display: Show[ScopedKey[_]]
): T =
getOrError(scope, key, structure.data.get(scope, key))(display)
@deprecated(
"This discards session settings. Migrate to appendWithSession or appendWithoutSession.",


@ -42,6 +42,7 @@ import sbt.internal.{
}
import sbt.io.{ FileFilter, WatchService }
import sbt.internal.io.WatchState
import sbt.internal.server.ServerHandler
import sbt.internal.util.{ AttributeKey, SourcePosition }
import sbt.librarymanagement.Configurations.CompilerPlugin
@ -136,6 +137,8 @@ object Keys {
val serverHost = SettingKey(BasicKeys.serverHost)
val serverAuthentication = SettingKey(BasicKeys.serverAuthentication)
val serverConnectionType = SettingKey(BasicKeys.serverConnectionType)
val fullServerHandlers = SettingKey(BasicKeys.fullServerHandlers)
val serverHandlers = settingKey[Seq[ServerHandler]]("User-defined server handlers.")
val analysis = AttributeKey[CompileAnalysis]("analysis", "Analysis of compilation, including dependencies and generated outputs.", DSetting)
val watch = SettingKey(BasicKeys.watch)
@ -446,7 +449,7 @@ object Keys {
val sbtDependency = settingKey[ModuleID]("Provides a definition for declaring the current version of sbt.").withRank(BMinusSetting)
val sbtVersion = settingKey[String]("Provides the version of sbt. This setting should not be modified.").withRank(AMinusSetting)
val sbtBinaryVersion = settingKey[String]("Defines the binary compatibility version substring.").withRank(BPlusSetting)
val skip = taskKey[Boolean]("For tasks that support it (currently only 'compile' and 'update'), setting skip to true will force the task to not to do its work. This exact semantics may vary by task.").withRank(BSetting)
val skip = taskKey[Boolean]("For tasks that support it (currently only 'compile', 'update', and 'publish'), setting skip to true will force the task not to do its work. The exact semantics may vary by task.").withRank(BSetting)
val templateResolverInfos = settingKey[Seq[TemplateResolverInfo]]("Template resolvers used for 'new'.").withRank(BSetting)
val interactionService = taskKey[InteractionService]("Service used to ask for user input through the current user interface(s).").withRank(CTask)
val insideCI = SettingKey[Boolean]("insideCI", "Determines if the SBT is running in a Continuous Integration environment", AMinusSetting)


@ -52,7 +52,6 @@ import xsbti.compile.CompilerCache
import scala.annotation.tailrec
import sbt.io.IO
import sbt.io.syntax._
import StandardMain._
import java.io.{ File, IOException }
import java.net.URI
@ -69,34 +68,35 @@ final class xMain extends xsbti.AppMain {
import BasicCommandStrings.runEarly
import BuiltinCommands.defaults
import sbt.internal.CommandStrings.{ BootCommand, DefaultsCommand, InitCommand }
val state = initialState(
val state = StandardMain.initialState(
configuration,
Seq(defaults, early),
runEarly(DefaultsCommand) :: runEarly(InitCommand) :: BootCommand :: Nil)
runManaged(state)
StandardMain.runManaged(state)
}
}
final class ScriptMain extends xsbti.AppMain {
def run(configuration: xsbti.AppConfiguration): xsbti.MainResult = {
import BasicCommandStrings.runEarly
runManaged(
initialState(
configuration,
BuiltinCommands.ScriptCommands,
runEarly(Level.Error.toString) :: Script.Name :: Nil
))
val state = StandardMain.initialState(
configuration,
BuiltinCommands.ScriptCommands,
runEarly(Level.Error.toString) :: Script.Name :: Nil
)
StandardMain.runManaged(state)
}
}
final class ConsoleMain extends xsbti.AppMain {
def run(configuration: xsbti.AppConfiguration): xsbti.MainResult =
runManaged(
initialState(
configuration,
BuiltinCommands.ConsoleCommands,
IvyConsole.Name :: Nil
))
def run(configuration: xsbti.AppConfiguration): xsbti.MainResult = {
val state = StandardMain.initialState(
configuration,
BuiltinCommands.ConsoleCommands,
IvyConsole.Name :: Nil
)
StandardMain.runManaged(state)
}
}
object StandardMain {
@ -273,9 +273,9 @@ object BuiltinCommands {
case _ => si.actualVersion
}
private[this] def quiet[T](t: => T): Option[T] = try { Some(t) } catch {
case e: Exception => None
}
private[this] def quiet[T](t: => T): Option[T] =
try Some(t)
catch { case _: Exception => None }
def settingsCommand: Command =
showSettingLike(SettingsCommand,
@ -400,7 +400,7 @@ object BuiltinCommands {
// For correct behavior, we also need to re-inject a settings logger, as we'll be re-evaluating settings
val loggerInject = LogManager.settingsLogger(s)
val withLogger = newSession.appendRaw(loggerInject :: Nil)
val show = Project.showContextKey(newSession, structure)
val show = Project.showContextKey2(newSession)
val newStructure = Load.reapply(withLogger.mergeSettings, structure)(show)
Project.setProject(newSession, newStructure, s)
}
@ -424,19 +424,27 @@ object BuiltinCommands {
)(cl)
val setResult =
if (all) SettingCompletions.setAll(extracted, settings)
else SettingCompletions.setThis(s, extracted, settings, arg)
else SettingCompletions.setThis(extracted, settings, arg)
s.log.info(setResult.quietSummary)
s.log.debug(setResult.verboseSummary)
reapply(setResult.session, structure, s)
}
@deprecated("Use variant that doesn't take a State", "1.1.1")
def setThis(
s: State,
extracted: Extracted,
settings: Seq[Def.Setting[_]],
arg: String
): SetResult =
SettingCompletions.setThis(s, extracted, settings, arg)
setThis(extracted, settings, arg)
def setThis(
extracted: Extracted,
settings: Seq[Def.Setting[_]],
arg: String
): SetResult =
SettingCompletions.setThis(extracted, settings, arg)
def inspect: Command = Command(InspectCommand, inspectBrief, inspectDetailed)(Inspect.parser) {
case (s, (option, sk)) =>
@ -448,10 +456,10 @@ object BuiltinCommands {
Command(LastGrepCommand, lastGrepBrief, lastGrepDetailed)(lastGrepParser) {
case (s, (pattern, Some(sks))) =>
val (str, _, display) = extractLast(s)
Output.lastGrep(sks, str.streams(s), pattern, printLast(s))(display)
Output.lastGrep(sks, str.streams(s), pattern, printLast)(display)
keepLastLog(s)
case (s, (pattern, None)) =>
for (logFile <- lastLogFile(s)) yield Output.lastGrep(logFile, pattern, printLast(s))
for (logFile <- lastLogFile(s)) yield Output.lastGrep(logFile, pattern, printLast)
keepLastLog(s)
}
@ -493,7 +501,7 @@ object BuiltinCommands {
lastOnly_keys <- keysParser
kvs = Act.keyValues(structure)(lastOnly_keys._2)
f <- if (lastOnly_keys._1) success(() => s)
else Aggregation.evaluatingParser(s, structure, show)(kvs)
else Aggregation.evaluatingParser(s, show)(kvs)
} yield
() => {
def export0(s: State): State = lastImpl(s, kvs, Some(ExportStream))
@ -516,7 +524,7 @@ object BuiltinCommands {
def last: Command = Command(LastCommand, lastBrief, lastDetailed)(aggregatedKeyValueParser) {
case (s, Some(sks)) => lastImpl(s, sks, None)
case (s, None) =>
for (logFile <- lastLogFile(s)) yield Output.last(logFile, printLast(s))
for (logFile <- lastLogFile(s)) yield Output.last(logFile, printLast)
keepLastLog(s)
}
@ -525,7 +533,7 @@ object BuiltinCommands {
private[this] def lastImpl(s: State, sks: AnyKeys, sid: Option[String]): State = {
val (str, _, display) = extractLast(s)
Output.last(sks, str.streams(s), printLast(s), sid)(display)
Output.last(sks, str.streams(s), printLast, sid)(display)
keepLastLog(s)
}
@ -550,7 +558,10 @@ object BuiltinCommands {
*/
def isLastOnly(s: State): Boolean = s.history.previous.forall(_.commandLine == Shell)
def printLast(s: State): Seq[String] => Unit = _ foreach println
@deprecated("Use variant that doesn't take the state", "1.1.1")
def printLast(s: State): Seq[String] => Unit = printLast
def printLast: Seq[String] => Unit = _ foreach println
def autoImports(extracted: Extracted): EvalImports =
new EvalImports(imports(extracted), "<auto-imports>")
@ -620,7 +631,7 @@ object BuiltinCommands {
val extraUpdated = Project.updateExtraBuilds(s, f)
try doLoadProject(extraUpdated, LoadAction.Current)
catch {
case e: Exception =>
case _: Exception =>
s.log.error("Project loading failed: reverting to previous state.")
Project.setExtraBuilds(s, original)
}


@ -7,14 +7,16 @@
package sbt
import java.io.PrintWriter
import java.util.Properties
import jline.TerminalFactory
import scala.annotation.tailrec
import scala.util.control.NonFatal
import jline.TerminalFactory
import sbt.io.{ IO, Using }
import sbt.internal.util.{ ErrorHandling, GlobalLogBacking }
import sbt.internal.util.complete.DefaultParsers
import sbt.internal.langserver.ErrorCodes
import sbt.util.Logger
import sbt.protocol._
@ -26,15 +28,14 @@ object MainLoop {
// We've disabled jline shutdown hooks to prevent classloader leaks, and have been careful to always restore
// the jline terminal in finally blocks, but hitting ctrl+c prevents finally blocks from being executed, in that
// case the only way to restore the terminal is in a shutdown hook.
val shutdownHook = new Thread(new Runnable {
def run(): Unit = TerminalFactory.get().restore()
})
val shutdownHook = new Thread(() => TerminalFactory.get().restore())
try {
Runtime.getRuntime.addShutdownHook(shutdownHook)
runLoggedLoop(state, state.globalLogging.backing)
} finally {
Runtime.getRuntime.removeShutdownHook(shutdownHook)
()
}
}
@ -100,7 +101,7 @@ object MainLoop {
/** Runs the next sequence of commands with global logging in place. */
def runWithNewLog(state: State, logBacking: GlobalLogBacking): RunNext =
Using.fileWriter(append = true)(logBacking.file) { writer =>
val out = new java.io.PrintWriter(writer)
val out = new PrintWriter(writer)
val full = state.globalLogging.full
val newLogging = state.globalLogging.newAppender(full, out, logBacking)
// transferLevels(state, newLogging)
@ -124,7 +125,7 @@ object MainLoop {
final class KeepGlobalLog(val state: State) extends RunNext
final class Return(val result: xsbti.MainResult) extends RunNext
/** Runs the next sequence of commands that doesn't require global logging changes.*/
/** Runs the next sequence of commands that doesn't require global logging changes. */
@tailrec def run(state: State): RunNext =
state.next match {
case State.Continue => run(next(state))
@ -143,19 +144,10 @@ object MainLoop {
/** This is the main State transfer function of the sbt command processing. */
def processCommand(exec: Exec, state: State): State = {
import DefaultParsers._
val channelName = exec.source map (_.channelName)
StandardMain.exchange publishEventMessage ExecStatusEvent("Processing",
channelName,
exec.execId,
Vector())
val parser = Command combine state.definedCommands
val newState = parse(exec.commandLine, parser(state)) match {
case Right(s) => s() // apply command. command side effects happen here
case Left(errMsg) =>
state.log error errMsg
state.fail
}
StandardMain.exchange publishEventMessage
ExecStatusEvent("Processing", channelName, exec.execId, Vector())
val newState = Command.process(exec.commandLine, state)
val doneEvent = ExecStatusEvent(
"Done",
channelName,


@ -16,7 +16,7 @@ import sbt.internal.Load
import sbt.internal.CommandStrings._
import Cross.{ spacedFirst, requireSession }
import sbt.librarymanagement.VersionNumber
import Project.{ inScope }
import Project.inScope
/**
* Module responsible for plugin cross building.
@ -24,8 +24,7 @@ import Project.{ inScope }
private[sbt] object PluginCross {
lazy val pluginSwitch: Command = {
def switchParser(state: State): Parser[(String, String)] = {
val knownVersions = Nil
lazy val switchArgs = token(NotSpace.examples(knownVersions: _*)) ~ (token(
lazy val switchArgs = token(NotSpace.examples()) ~ (token(
Space ~> matched(state.combinedParser)) ?? "")
lazy val nextSpaced = spacedFirst(PluginSwitchCommand)
token(PluginSwitchCommand ~ OptSpace) flatMap { _ =>


@ -111,7 +111,7 @@ abstract class AutoPlugin extends Plugins.Basic with PluginsFunctions {
def extraProjects: Seq[Project] = Nil
/** The [[Project]]s to add to the current build based on an existing project. */
def derivedProjects(proj: ProjectDefinition[_]): Seq[Project] = Nil
def derivedProjects(@deprecated("unused", "") proj: ProjectDefinition[_]): Seq[Project] = Nil
private[sbt] def unary_! : Exclude = Exclude(this)
@ -224,20 +224,19 @@ object Plugins extends PluginsFunctions {
_.label
})
}
val retval = topologicalSort(selectedPlugins, log)
val retval = topologicalSort(selectedPlugins)
// log.debug(s" :: sorted deduced result: ${retval.toString}")
retval
}
}
}
}
private[sbt] def topologicalSort(ns: List[AutoPlugin], log: Logger): List[AutoPlugin] = {
// log.debug(s"sorting: ns: ${ns.toString}")
private[sbt] def topologicalSort(ns: List[AutoPlugin]): List[AutoPlugin] = {
@tailrec
def doSort(found0: List[AutoPlugin],
notFound0: List[AutoPlugin],
limit0: Int): List[AutoPlugin] = {
// log.debug(s" :: sorting:: found: ${found0.toString} not found ${notFound0.toString}")
if (limit0 < 0) throw AutoPluginException(s"Failed to sort ${ns} topologically")
else if (notFound0.isEmpty) found0
else {
@ -250,6 +249,7 @@ object Plugins extends PluginsFunctions {
val (roots, nonRoots) = ns partition (_.isRoot)
doSort(roots, nonRoots, ns.size * ns.size + 1)
}
private[sbt] def translateMessage(e: LogicException) = e match {
case ic: InitialContradictions =>
s"Contradiction in selected plugins. These plugins were both included and excluded: ${literalsString(
@ -260,6 +260,7 @@ object Plugins extends PluginsFunctions {
case cn: CyclicNegation =>
s"Cycles in plugin requirements cannot involve excludes. The problematic cycle is: ${literalsString(cn.cycle)}"
}
private[this] def literalsString(lits: Seq[Literal]): String =
lits map { case Atom(l) => l; case Negated(Atom(l)) => l } mkString (", ")
@ -271,6 +272,7 @@ object Plugins extends PluginsFunctions {
val message = s"Plugin$ns provided by multiple AutoPlugins:$nl${dupStrings.mkString(nl)}"
throw AutoPluginException(message)
}
private[this] def exclusionConflictError(requested: Plugins,
selected: Seq[AutoPlugin],
conflicting: Seq[AutoPlugin]): Unit = {
@ -360,14 +362,14 @@ ${listConflicts(conflicting)}""")
// This would handle things like !!p or !(p && z)
case Exclude(n) => hasInclude(n, p)
case And(ns) => ns.forall(n => hasExclude(n, p))
case b: Basic => false
case _: Basic => false
case Empty => false
}
private[sbt] def hasInclude(n: Plugins, p: AutoPlugin): Boolean = n match {
case `p` => true
case Exclude(n) => hasExclude(n, p)
case And(ns) => ns.forall(n => hasInclude(n, p))
case b: Basic => false
case _: Basic => false
case Empty => false
}
private[this] def flattenConvert(n: Plugins): Seq[Literal] = n match {


@ -27,6 +27,7 @@ import Keys.{
serverPort,
serverAuthentication,
serverConnectionType,
fullServerHandlers,
logLevel,
watch
}
@ -44,6 +45,7 @@ import sbt.internal.{
import sbt.internal.util.{ AttributeKey, AttributeMap, Dag, Relation, Settings, ~> }
import sbt.internal.util.Types.{ const, idFun }
import sbt.internal.util.complete.DefaultParsers
import sbt.internal.server.ServerHandler
import sbt.librarymanagement.Configuration
import sbt.util.{ Show, Level }
import sjsonnew.JsonFormat
@ -286,23 +288,29 @@ object Project extends ProjectExtra {
showContextKey(state, None)
def showContextKey(state: State, keyNameColor: Option[String]): Show[ScopedKey[_]] =
if (isProjectLoaded(state)) showContextKey(session(state), structure(state), keyNameColor)
if (isProjectLoaded(state)) showContextKey2(session(state), keyNameColor)
else Def.showFullKey
@deprecated("Use showContextKey2 which doesn't take the unused structure param", "1.1.1")
def showContextKey(
session: SessionSettings,
structure: BuildStructure,
keyNameColor: Option[String] = None
): Show[ScopedKey[_]] =
Def.showRelativeKey(session.current, structure.allProjects.size > 1, keyNameColor)
showContextKey2(session, keyNameColor)
def showContextKey2(
session: SessionSettings,
keyNameColor: Option[String] = None
): Show[ScopedKey[_]] =
Def.showRelativeKey2(session.current, keyNameColor)
def showLoadingKey(
loaded: LoadedBuild,
keyNameColor: Option[String] = None
): Show[ScopedKey[_]] =
Def.showRelativeKey(
Def.showRelativeKey2(
ProjectRef(loaded.root, loaded.units(loaded.root).rootProjects.head),
loaded.allProjectRefs.size > 1,
keyNameColor
)
@ -413,7 +421,7 @@ object Project extends ProjectExtra {
def extract(state: State): Extracted = extract(session(state), structure(state))
private[sbt] def extract(se: SessionSettings, st: BuildStructure): Extracted =
Extracted(st, se, se.current)(showContextKey(se, st))
Extracted(st, se, se.current)(showContextKey2(se))
def getProjectForReference(ref: Reference, structure: BuildStructure): Option[ResolvedProject] =
ref match { case pr: ProjectRef => getProject(pr, structure); case _ => None }
@ -475,6 +483,7 @@ object Project extends ProjectExtra {
val authentication: Option[Set[ServerAuthentication]] = get(serverAuthentication)
val connectionType: Option[ConnectionType] = get(serverConnectionType)
val srvLogLevel: Option[Level.Value] = (logLevel in (ref, serverLog)).get(structure.data)
val hs: Option[Seq[ServerHandler]] = get(fullServerHandlers)
val commandDefs = allCommands.distinct.flatten[Command].map(_ tag (projectCommand, true))
val newDefinedCommands = commandDefs ++ BasicCommands.removeTagged(s.definedCommands,
projectCommand)
@ -491,6 +500,7 @@ object Project extends ProjectExtra {
.put(templateResolverInfos.key, trs)
.setCond(shellPrompt.key, prompt)
.setCond(serverLogLevel, srvLogLevel)
.setCond(fullServerHandlers.key, hs)
s.copy(
attributes = newAttrs,
definedCommands = newDefinedCommands
@ -757,7 +767,9 @@ object Project extends ProjectExtra {
EvaluateTask(extracted.structure, taskKey, state, extracted.currentRef, config)
}
implicit def projectToRef(p: Project): ProjectReference = LocalProject(p.id)
def projectToRef(p: Project): ProjectReference = LocalProject(p.id)
implicit def projectToLocalProject(p: Project): LocalProject = LocalProject(p.id)
final class RichTaskSessionVar[S](i: Def.Initialize[Task[S]]) {
import SessionVar.{ persistAndSet, resolveContext, set, transform => tx }


@ -40,7 +40,7 @@ object Resolvers {
val to = uniqueSubdirectoryFor(info.uri, in = info.staging)
Some { () =>
creates(to) { IO.unzipURL(url, to) }
creates(to) { IO.unzipURL(url, to); () }
}
}


@ -10,7 +10,7 @@ package sbt
import sbt.internal.{ Load, LoadedBuildUnit }
import sbt.internal.util.{ AttributeKey, Dag, Types }
import sbt.librarymanagement.Configuration
import sbt.librarymanagement.{ Configuration, ConfigRef }
import Types.const
import Def.Initialize
@ -104,7 +104,7 @@ object ScopeFilter {
/** Selects all scopes that apply to a single project. Zero and build-level scopes are excluded. */
def inAnyProject: ProjectFilter =
selectAxis(const { case p: ProjectRef => true; case _ => false })
selectAxis(const { case _: ProjectRef => true; case _ => false })
/** Accepts all values for the task axis except Zero. */
def inAnyTask: TaskFilter = selectAny[AttributeKey[_]]
@ -154,6 +154,16 @@ object ScopeFilter {
selectAxis[ConfigKey](const(c => cs(c.name)))
}
def inConfigurationsByKeys(keys: ConfigKey*): ConfigurationFilter = {
val cs = keys.toSet
selectAxis[ConfigKey](const(cs))
}
def inConfigurationsByRefs(refs: ConfigRef*): ConfigurationFilter = {
val cs = refs.map(r => ConfigKey(r.name)).toSet
selectAxis[ConfigKey](const(cs))
}
implicit def settingKeyAll[T](key: Initialize[T]): SettingKeyAll[T] = new SettingKeyAll[T](key)
implicit def taskKeyAll[T](key: Initialize[Task[T]]): TaskKeyAll[T] = new TaskKeyAll[T](key)
}


@ -0,0 +1,191 @@
/*
* sbt
* Copyright 2011 - 2017, Lightbend, Inc.
* Copyright 2008 - 2010, Mark Harrah
* Licensed under BSD-3-Clause license (see LICENSE)
*/
package sbt
import java.io.File
import java.lang.reflect.Method
import sbt.io._
import sbt.io.syntax._
import sbt.internal.util.complete.{ Parser, DefaultParsers }
import sbt.librarymanagement._
import sbt.librarymanagement.syntax._
import sbt.internal.inc.classpath.ClasspathUtilities
import sbt.internal.inc.ModuleUtilities
import Def._
import Keys._
import Project._
object ScriptedPlugin extends AutoPlugin {
object autoImport {
val ScriptedConf = Configurations.config("scripted-sbt") hide
val ScriptedLaunchConf = Configurations.config("scripted-sbt-launch") hide
val scriptedSbt = settingKey[String]("")
val sbtLauncher = taskKey[File]("")
val sbtTestDirectory = settingKey[File]("")
val scriptedBufferLog = settingKey[Boolean]("")
val scriptedClasspath = taskKey[PathFinder]("")
val scriptedTests = taskKey[AnyRef]("")
val scriptedBatchExecution =
settingKey[Boolean]("Enables or disables batch execution for scripted.")
val scriptedParallelInstances = settingKey[Int](
"Configures the number of scripted instances for parallel testing, only used in batch mode.")
val scriptedRun = taskKey[Method]("")
val scriptedLaunchOpts =
settingKey[Seq[String]]("options to pass to jvm launching scripted tasks")
val scriptedDependencies = taskKey[Unit]("")
val scripted = inputKey[Unit]("")
}
import autoImport._
override lazy val globalSettings = Seq(
scriptedBufferLog := true,
scriptedLaunchOpts := Seq(),
)
override lazy val projectSettings = Seq(
ivyConfigurations ++= Seq(ScriptedConf, ScriptedLaunchConf),
scriptedSbt := (sbtVersion in pluginCrossBuild).value,
sbtLauncher := getJars(ScriptedLaunchConf).map(_.get.head).value,
sbtTestDirectory := sourceDirectory.value / "sbt-test",
libraryDependencies ++= (CrossVersion.partialVersion(scriptedSbt.value) match {
case Some((0, 13)) =>
Seq(
"org.scala-sbt" % "scripted-sbt" % scriptedSbt.value % ScriptedConf,
"org.scala-sbt" % "sbt-launch" % scriptedSbt.value % ScriptedLaunchConf
)
case Some((1, _)) =>
Seq(
"org.scala-sbt" %% "scripted-sbt" % scriptedSbt.value % ScriptedConf,
"org.scala-sbt" % "sbt-launch" % scriptedSbt.value % ScriptedLaunchConf
)
case Some((x, y)) => sys error s"Unknown sbt version ${scriptedSbt.value} ($x.$y)"
case None => sys error s"Unknown sbt version ${scriptedSbt.value}"
}),
scriptedClasspath := getJars(ScriptedConf).value,
scriptedTests := scriptedTestsTask.value,
scriptedParallelInstances := 1,
scriptedBatchExecution := false,
scriptedRun := scriptedRunTask.value,
scriptedDependencies := {
def use[A](@deprecated("unused", "") x: A*): Unit = () // avoid unused warnings
val analysis = (Keys.compile in Test).value
val pub = (publishLocal).value
use(analysis, pub)
},
scripted := scriptedTask.evaluated
)
private[sbt] def scriptedTestsTask: Initialize[Task[AnyRef]] =
Def.task {
val loader = ClasspathUtilities.toLoader(scriptedClasspath.value, scalaInstance.value.loader)
try {
ModuleUtilities.getObject("sbt.scriptedtest.ScriptedTests", loader)
} catch {
case _: ClassNotFoundException =>
ModuleUtilities.getObject("sbt.test.ScriptedTests", loader)
}
}
private[sbt] def scriptedRunTask: Initialize[Task[Method]] = Def.taskDyn {
val fCls = classOf[File]
val bCls = classOf[Boolean]
val asCls = classOf[Array[String]]
val lfCls = classOf[java.util.List[File]]
val iCls = classOf[Int]
val clazz = scriptedTests.value.getClass
val method =
if (scriptedBatchExecution.value)
clazz.getMethod("runInParallel", fCls, bCls, asCls, fCls, asCls, lfCls, iCls)
else
clazz.getMethod("run", fCls, bCls, asCls, fCls, asCls, lfCls)
Def.task(method)
}
private[sbt] final case class ScriptedTestPage(page: Int, total: Int)
private[sbt] def scriptedParser(scriptedBase: File): Parser[Seq[String]] = {
import DefaultParsers._
val scriptedFiles: NameFilter = ("test": NameFilter) | "pending"
val pairs = (scriptedBase * AllPassFilter * AllPassFilter * scriptedFiles).get map {
(f: File) =>
val p = f.getParentFile
(p.getParentFile.getName, p.getName)
}
val pairMap = pairs.groupBy(_._1).mapValues(_.map(_._2).toSet)
val id = charClass(c => !c.isWhitespace && c != '/').+.string
val groupP = token(id.examples(pairMap.keySet)) <~ token('/')
// A parser for page definitions
val pageP: Parser[ScriptedTestPage] = ("*" ~ NatBasic ~ "of" ~ NatBasic) map {
case _ ~ page ~ _ ~ total => ScriptedTestPage(page, total)
}
// Grabs the filenames from a given test group in the current page definition.
def pagedFilenames(group: String, page: ScriptedTestPage): Seq[String] = {
val files = pairMap(group).toSeq.sortBy(_.toLowerCase)
val pageSize = files.size / page.total
// The last page may lose some values, so we explicitly keep them
val dropped = files.drop(pageSize * (page.page - 1))
if (page.page == page.total) dropped
else dropped.take(pageSize)
}
def nameP(group: String) = {
token("*".id | id.examples(pairMap.getOrElse(group, Set.empty[String])))
}
val PagedIds: Parser[Seq[String]] =
for {
group <- groupP
page <- pageP
files = pagedFilenames(group, page)
// TODO - Fail the parser if we don't have enough files for the given page size
//if !files.isEmpty
} yield files map (f => s"$group/$f")
val testID = (for (group <- groupP; name <- nameP(group)) yield (group, name))
val testIdAsGroup = matched(testID) map (test => Seq(test))
//(token(Space) ~> matched(testID)).*
(token(Space) ~> (PagedIds | testIdAsGroup)).* map (_.flatten)
}
private[sbt] def scriptedTask: Initialize[InputTask[Unit]] = Def.inputTask {
val args = scriptedParser(sbtTestDirectory.value).parsed
scriptedDependencies.value
try {
val method = scriptedRun.value
val scriptedInstance = scriptedTests.value
val dir = sbtTestDirectory.value
val log = Boolean box scriptedBufferLog.value
val launcher = sbtLauncher.value
val opts = scriptedLaunchOpts.value.toArray
val empty = new java.util.ArrayList[File]()
val instances = Int box scriptedParallelInstances.value
if (scriptedBatchExecution.value)
method.invoke(scriptedInstance, dir, log, args.toArray, launcher, opts, empty, instances)
else method.invoke(scriptedInstance, dir, log, args.toArray, launcher, opts, empty)
()
} catch { case e: java.lang.reflect.InvocationTargetException => throw e.getCause }
}
private[this] def getJars(config: Configuration): Initialize[Task[PathFinder]] = Def.task {
PathFinder(Classpaths.managedJars(config, classpathTypes.value, Keys.update.value).map(_.data))
}
}


@ -63,7 +63,7 @@ object SessionVar {
def read[T](key: ScopedKey[Task[T]], state: State)(implicit f: JsonFormat[T]): Option[T] =
Project.structure(state).streams(state).use(key) { s =>
try { Some(s.getInput(key, DefaultDataID).read[T]) } catch { case NonFatal(e) => None }
try { Some(s.getInput(key, DefaultDataID).read[T]) } catch { case NonFatal(_) => None }
}
def load[T](key: ScopedKey[Task[T]], state: State)(implicit f: JsonFormat[T]): Option[T] =


@ -21,9 +21,10 @@ import BasicCommandStrings._, BasicKeys._
private[sbt] object TemplateCommandUtil {
def templateCommand: Command =
Command(TemplateCommand, templateBrief, templateDetailed)(templateCommandParser)(runTemplate)
Command(TemplateCommand, templateBrief, templateDetailed)(_ => templateCommandParser)(
runTemplate)
private def templateCommandParser(state: State): Parser[Seq[String]] =
private def templateCommandParser: Parser[Seq[String]] =
(token(Space) ~> repsep(StringBasic, token(Space))) | (token(EOF) map (_ => Nil))
private def runTemplate(s0: State, inputArg: Seq[String]): State = {
@ -84,8 +85,10 @@ private[sbt] object TemplateCommandUtil {
private def runTemplate(info: TemplateResolverInfo,
arguments: List[String],
loader: ClassLoader): Unit =
loader: ClassLoader): Unit = {
call(info.implementationClass, "run", loader)(classOf[Array[String]])(arguments.toArray)
()
}
private def infoLoader(
info: TemplateResolverInfo,


@ -8,7 +8,7 @@
package sbt
package internal
import Def.{ showRelativeKey, ScopedKey }
import Def.{ showRelativeKey2, ScopedKey }
import Keys.sessionSettings
import sbt.internal.util.complete.{ DefaultParsers, Parser }
import Aggregation.{ KeyValue, Values }
@@ -56,7 +56,7 @@ object Act {
keyMap: Map[String, AttributeKey[_]],
data: Settings[Scope]): Parser[ParsedKey] =
scopedKeyFull(index, current, defaultConfigs, keyMap) flatMap { choices =>
select(choices, data)(showRelativeKey(current, index.buildURIs.size > 1))
select(choices, data)(showRelativeKey2(current))
}
def scopedKeyFull(index: KeyIndex,
@@ -100,7 +100,7 @@ object Act {
conf <- configs(confAmb, defaultConfigs, proj, index)
} yield
for {
taskAmb <- taskAxis(conf, index.tasks(proj, conf), keyMap)
taskAmb <- taskAxis(index.tasks(proj, conf), keyMap)
task = resolveTask(taskAmb)
key <- key(index, proj, conf, task, keyMap)
extra <- extraAxis(keyMap, IMap.empty)
@@ -161,6 +161,7 @@ object Act {
def examples(p: Parser[String], exs: Set[String], label: String): Parser[String] =
p !!! ("Expected " + label) examples exs
def examplesStrict(p: Parser[String], exs: Set[String], label: String): Parser[String] =
filterStrings(examples(p, exs, label), exs, label)
@@ -168,6 +169,7 @@ object Act {
p.? map { opt =>
toAxis(opt, ifNone)
}
def toAxis[T](opt: Option[T], ifNone: ScopeAxis[T]): ScopeAxis[T] =
opt match { case Some(t) => Select(t); case None => ifNone }
@@ -231,8 +233,8 @@ object Act {
// This queries the key index so tab completion will list the build-level keys.
val buildKeys: Set[String] =
proj match {
case Some(ProjectRef(uri, id)) => index.keys(Some(BuildRef(uri)), conf, task)
case _ => Set()
case Some(ProjectRef(uri, _)) => index.keys(Some(BuildRef(uri)), conf, task)
case _ => Set()
}
val keys: Set[String] = index.keys(proj, conf, task) ++ buildKeys
keyParser(keys)
@@ -255,9 +257,10 @@ object Act {
optionalAxis(extras, Zero)
}
def taskAxis(d: Option[String],
tasks: Set[AttributeKey[_]],
allKnown: Map[String, AttributeKey[_]]): Parser[ParsedAxis[AttributeKey[_]]] = {
def taskAxis(
tasks: Set[AttributeKey[_]],
allKnown: Map[String, AttributeKey[_]],
): Parser[ParsedAxis[AttributeKey[_]]] = {
val taskSeq = tasks.toSeq
def taskKeys(f: AttributeKey[_] => String): Seq[(String, AttributeKey[_])] =
taskSeq.map(key => (f(key), key))
@@ -380,7 +383,7 @@ object Act {
def evaluate(kvs: Seq[ScopedKey[_]]): Parser[() => State] = {
val preparedPairs = anyKeyValues(structure, kvs)
val showConfig = Aggregation.defaultShow(state, showTasks = action == ShowAction)
evaluatingParser(state, structure, showConfig)(preparedPairs) map { evaluate => () =>
evaluatingParser(state, showConfig)(preparedPairs) map { evaluate => () =>
{
val keyStrings = preparedPairs.map(pp => showKey.show(pp.key)).mkString(", ")
state.log.debug("Evaluating tasks: " + keyStrings)


@@ -61,11 +61,10 @@ object Aggregation {
def applyTasks[T](
s: State,
structure: BuildStructure,
ps: Values[Parser[Task[T]]],
show: ShowConfig
)(implicit display: Show[ScopedKey[_]]): Parser[() => State] =
Command.applyEffect(seqParser(ps))(ts => runTasks(s, structure, ts, DummyTaskMap(Nil), show))
Command.applyEffect(seqParser(ps))(ts => runTasks(s, ts, DummyTaskMap(Nil), show))
private def showRun[T](complete: Complete[T], show: ShowConfig)(
implicit display: Show[ScopedKey[_]]
@@ -104,7 +103,6 @@ object Aggregation {
}
def runTasks[HL <: HList, T](s: State,
structure: BuildStructure,
ts: Values[Task[T]],
extra: DummyTaskMap,
show: ShowConfig)(implicit display: Show[ScopedKey[_]]): State = {
@@ -128,33 +126,26 @@ object Aggregation {
key in currentRef get structure.data getOrElse true
if (get(showSuccess)) {
if (get(showTiming)) {
val msg = timingString(start, stop, "", structure.data, currentRef, log)
val msg = timingString(start, stop, structure.data, currentRef)
if (success) log.success(msg) else log.error(msg)
} else if (success)
log.success("")
}
}
private def timingString(
startTime: Long,
endTime: Long,
s: String,
data: Settings[Scope],
currentRef: ProjectRef,
log: Logger
): String = {
val format = timingFormat in currentRef get data getOrElse defaultFormat
timing(format, startTime, endTime, "", log)
timing(format, startTime, endTime)
}
def timing(
format: java.text.DateFormat,
startTime: Long,
endTime: Long,
s: String,
log: Logger
): String = {
val ss = if (s.isEmpty) "" else s + " "
def timing(format: java.text.DateFormat, startTime: Long, endTime: Long): String = {
val nowString = format.format(new java.util.Date(endTime))
"Total " + ss + "time: " + (endTime - startTime + 500) / 1000 + " s, completed " + nowString
"Total time: " + (endTime - startTime + 500) / 1000 + " s, completed " + nowString
}
def defaultFormat: DateFormat = {
@@ -164,20 +155,19 @@ object Aggregation {
def applyDynamicTasks[I](
s: State,
structure: BuildStructure,
inputs: Values[InputTask[I]],
show: ShowConfig
)(implicit display: Show[ScopedKey[_]]): Parser[() => State] = {
val parsers = for (KeyValue(k, it) <- inputs)
yield it.parser(s).map(v => KeyValue(k, v))
Command.applyEffect(seq(parsers)) { roots =>
runTasks(s, structure, roots, DummyTaskMap(Nil), show)
runTasks(s, roots, DummyTaskMap(Nil), show)
}
}
def evaluatingParser(s: State, structure: BuildStructure, show: ShowConfig)(
keys: Seq[KeyValue[_]]
)(implicit display: Show[ScopedKey[_]]): Parser[() => State] = {
def evaluatingParser(s: State, show: ShowConfig)(keys: Seq[KeyValue[_]])(
implicit display: Show[ScopedKey[_]]
): Parser[() => State] = {
// to make the call sites clearer
def separate[L](in: Seq[KeyValue[_]])(
@@ -210,12 +200,12 @@ object Aggregation {
val otherStrings = other.map(_.key).mkString("Task(s)/setting(s):\n\t", "\n\t", "\n")
failure(s"Cannot mix input tasks with plain tasks/settings. $inputStrings $otherStrings")
} else
applyDynamicTasks(s, structure, maps(inputTasks)(castToAny), show)
applyDynamicTasks(s, maps(inputTasks)(castToAny), show)
} else {
val base =
if (tasks.isEmpty) success(() => s)
else
applyTasks(s, structure, maps(tasks)(x => success(castToAny(x))), show)
applyTasks(s, maps(tasks)(x => success(castToAny(x))), show)
base.map { res => () =>
val newState = res()
if (show.settingValues && settings.nonEmpty) printSettings(settings, show.print)
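The simplified `timing` helper above rounds elapsed milliseconds to the nearest whole second with plain integer arithmetic: `(endTime - startTime + 500) / 1000`. A minimal standalone sketch of that rounding behaviour (the object name is illustrative, not sbt code):

```scala
object TimingRounding {
  // Adding 500 ms before the integer division rounds to the nearest second.
  def elapsedSeconds(startTime: Long, endTime: Long): Long =
    (endTime - startTime + 500) / 1000

  def main(args: Array[String]): Unit = {
    assert(elapsedSeconds(0L, 1499L) == 1L) // 1.499 s rounds down
    assert(elapsedSeconds(0L, 1500L) == 2L) // 1.500 s rounds up
    println("ok")
  }
}
```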


@@ -16,7 +16,7 @@ import sbt.internal.util.Attributed
import sbt.internal.inc.ReflectUtilities
trait BuildDef {
def projectDefinitions(baseDirectory: File): Seq[Project] = projects
def projectDefinitions(@deprecated("unused", "") baseDirectory: File): Seq[Project] = projects
def projects: Seq[Project] =
ReflectUtilities.allVals[CompositeProject](this).values.toSeq.flatMap(_.componentProjects)
// TODO: Should we grab the build core settings here or in a plugin?


@@ -154,7 +154,8 @@ case class DetectedAutoPlugin(name: String, value: AutoPlugin, hasAutoImport: Bo
* Auto-discovered modules for the build definition project. These include modules defined in build definition sources
* as well as modules in binary dependencies.
*
* @param builds The [[Build]]s detected in the build definition. This does not include the default [[Build]] that sbt creates if none is defined.
* @param builds The [[BuildDef]]s detected in the build definition.
* This does not include the default [[BuildDef]] that sbt creates if none is defined.
*/
final class DetectedPlugins(val autoPlugins: Seq[DetectedAutoPlugin],
val builds: DetectedModules[BuildDef]) {
@@ -172,9 +173,7 @@ final class DetectedPlugins(val autoPlugins: Seq[DetectedAutoPlugin],
private[this] lazy val (autoPluginAutoImports, topLevelAutoPluginAutoImports) =
autoPlugins
.flatMap {
case DetectedAutoPlugin(name, ap, hasAutoImport) =>
if (hasAutoImport) Some(name)
else None
case DetectedAutoPlugin(name, _, hasAutoImport) => if (hasAutoImport) Some(name) else None
}
.partition(nonTopLevelPlugin)


@@ -20,6 +20,7 @@ import BasicKeys.{
serverAuthentication,
serverConnectionType,
serverLogLevel,
fullServerHandlers,
logLevel
}
import java.net.Socket
@@ -44,49 +45,43 @@ import sbt.util.{ Level, Logger, LogExchange }
* this exchange, which can serve command requests from either of the channels.
*/
private[sbt] final class CommandExchange {
private val autoStartServerSysProp = sys.props.get("sbt.server.autostart") map {
_.toLowerCase == "true"
} getOrElse true
private val lock = new AnyRef {}
private val autoStartServerSysProp =
sys.props get "sbt.server.autostart" forall (_.toLowerCase == "true")
private var server: Option[ServerInstance] = None
private val firstInstance: AtomicBoolean = new AtomicBoolean(true)
private var consoleChannel: Option[ConsoleChannel] = None
private val commandQueue: ConcurrentLinkedQueue[Exec] = new ConcurrentLinkedQueue()
private val channelBuffer: ListBuffer[CommandChannel] = new ListBuffer()
private val channelBufferLock = new AnyRef {}
private val nextChannelId: AtomicInteger = new AtomicInteger(0)
private lazy val jsonFormat = new sjsonnew.BasicJsonProtocol with JValueFormats {}
def channels: List[CommandChannel] = channelBuffer.toList
def subscribe(c: CommandChannel): Unit =
lock.synchronized {
channelBuffer.append(c)
}
def subscribe(c: CommandChannel): Unit = channelBufferLock.synchronized(channelBuffer.append(c))
// periodically move all messages from all the channels
@tailrec def blockUntilNextExec: Exec = {
@tailrec def slurpMessages(): Unit =
(((None: Option[Exec]) /: channels) { _ orElse _.poll }) match {
channels.foldLeft(Option.empty[Exec]) { _ orElse _.poll } match {
case None => ()
case Some(x) =>
commandQueue.add(x)
slurpMessages
case _ => ()
}
slurpMessages()
Option(commandQueue.poll) match {
case Some(x) => x
case _ =>
case None =>
Thread.sleep(50)
blockUntilNextExec
}
}
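The rewritten `slurpMessages` above drains channels with `channels.foldLeft(Option.empty[Exec]) { _ orElse _.poll }`, taking the first message any channel can supply. A minimal sketch of that first-non-empty fold, using a mutable queue as a stand-in for sbt's command channels (names are illustrative):

```scala
object FirstMessage {
  import scala.collection.mutable.Queue

  // First non-empty poll across channels; `orElse` is by-name, so later
  // queues are not dequeued once a message has been found.
  def firstPoll(channels: List[Queue[String]]): Option[String] =
    channels.foldLeft(Option.empty[String]) { (acc, q) =>
      acc.orElse(if (q.isEmpty) None else Some(q.dequeue()))
    }

  def main(args: Array[String]): Unit = {
    val a = Queue.empty[String]
    val b = Queue("exit")
    assert(firstPoll(List(a, b)).contains("exit"))
    assert(firstPoll(List(a)).isEmpty)
    println("ok")
  }
}
```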
def run(s: State): State = {
consoleChannel match {
case Some(_) => // do nothing
case _ =>
val x = new ConsoleChannel("console0")
consoleChannel = Some(x)
subscribe(x)
if (consoleChannel.isEmpty) {
val console0 = new ConsoleChannel("console0")
consoleChannel = Some(console0)
subscribe(console0)
}
val autoStartServerAttr = (s get autoStartServer) match {
case Some(bool) => bool
@@ -102,25 +97,13 @@ private[sbt] final class CommandExchange {
* Check if a server instance is running already, and start one if it isn't.
*/
private[sbt] def runServer(s: State): State = {
lazy val port = (s get serverPort) match {
case Some(x) => x
case None => 5001
}
lazy val host = (s get serverHost) match {
case Some(x) => x
case None => "127.0.0.1"
}
lazy val auth: Set[ServerAuthentication] = (s get serverAuthentication) match {
case Some(xs) => xs
case None => Set(ServerAuthentication.Token)
}
lazy val connectionType = (s get serverConnectionType) match {
case Some(x) => x
case None => ConnectionType.Tcp
}
lazy val level: Level.Value = (s get serverLogLevel)
.orElse(s get logLevel)
.getOrElse(Level.Warn)
lazy val port = s.get(serverPort).getOrElse(5001)
lazy val host = s.get(serverHost).getOrElse("127.0.0.1")
lazy val auth: Set[ServerAuthentication] =
s.get(serverAuthentication).getOrElse(Set(ServerAuthentication.Token))
lazy val connectionType = s.get(serverConnectionType).getOrElse(ConnectionType.Tcp)
lazy val level = s.get(serverLogLevel).orElse(s.get(logLevel)).getOrElse(Level.Warn)
lazy val handlers = s.get(fullServerHandlers).getOrElse(Nil)
def onIncomingSocket(socket: Socket, instance: ServerInstance): Unit = {
val name = newNetworkName
@@ -133,7 +116,7 @@ private[sbt] final class CommandExchange {
log
}
val channel =
new NetworkChannel(name, socket, Project structure s, auth, instance, logger)
new NetworkChannel(name, socket, Project structure s, auth, instance, handlers, logger)
subscribe(channel)
}
if (server.isEmpty && firstInstance.get) {
@@ -181,9 +164,7 @@ private[sbt] final class CommandExchange {
}
def shutdown(): Unit = {
channels foreach { c =>
c.shutdown()
}
channels foreach (_.shutdown())
// interrupt and kill the thread
server.foreach(_.shutdown())
server = None
@@ -206,7 +187,7 @@ private[sbt] final class CommandExchange {
toDel.toList match {
case Nil => // do nothing
case xs =>
lock.synchronized {
channelBufferLock.synchronized {
channelBuffer --= xs
()
}
@@ -222,48 +203,30 @@ private[sbt] final class CommandExchange {
val params = toLogMessageParams(entry)
channels collect {
case c: ConsoleChannel =>
if (broadcastStringMessage) {
if (broadcastStringMessage || (entry.channelName forall (_ == c.name)))
c.publishEvent(event)
} else {
if (entry.channelName.isEmpty || entry.channelName == Some(c.name)) {
c.publishEvent(event)
}
}
case c: NetworkChannel =>
try {
// Note that language server's LogMessageParams does not hold the execid,
// so this is weaker than the StringMessage. We might want to double-send
// in case we have a better client that can utilize the knowledge.
import sbt.internal.langserver.codec.JsonProtocol._
if (broadcastStringMessage) {
c.langNotify("window/logMessage", params)
} else {
if (entry.channelName == Some(c.name)) {
c.langNotify("window/logMessage", params)
}
}
} catch {
case _: IOException =>
toDel += c
}
if (broadcastStringMessage || (entry.channelName contains c.name))
c.jsonRpcNotify("window/logMessage", params)
} catch { case _: IOException => toDel += c }
}
case _ =>
channels collect {
case c: ConsoleChannel =>
c.publishEvent(event)
channels foreach {
case c: ConsoleChannel => c.publishEvent(event)
case c: NetworkChannel =>
try {
c.publishEvent(event)
} catch {
case _: IOException =>
toDel += c
}
try c.publishEvent(event)
catch { case _: IOException => toDel += c }
}
}
toDel.toList match {
case Nil => // do nothing
case xs =>
lock.synchronized {
channelBufferLock.synchronized {
channelBuffer --= xs
()
}
@@ -305,7 +268,7 @@ private[sbt] final class CommandExchange {
toDel.toList match {
case Nil => // do nothing
case xs =>
lock.synchronized {
channelBufferLock.synchronized {
channelBuffer --= xs
()
}
@@ -315,6 +278,11 @@ private[sbt] final class CommandExchange {
// fanout publishEvent
def publishEventMessage(event: EventMessage): Unit = {
val toDel: ListBuffer[CommandChannel] = ListBuffer.empty
def tryTo(x: => Unit, c: CommandChannel): Unit =
try x
catch { case _: IOException => toDel += c }
event match {
// Special treatment for ConsolePromptEvent since it's hand coded without codec.
case entry: ConsolePromptEvent =>
@@ -328,36 +296,21 @@ private[sbt] final class CommandExchange {
case entry: ExecStatusEvent =>
channels collect {
case c: ConsoleChannel =>
if (entry.channelName.isEmpty || entry.channelName == Some(c.name)) {
c.publishEventMessage(event)
}
if (entry.channelName forall (_ == c.name)) c.publishEventMessage(event)
case c: NetworkChannel =>
try {
if (entry.channelName == Some(c.name)) {
c.publishEventMessage(event)
}
} catch {
case e: IOException =>
toDel += c
}
if (entry.channelName contains c.name) tryTo(c.publishEventMessage(event), c)
}
case _ =>
channels collect {
case c: ConsoleChannel =>
c.publishEventMessage(event)
case c: NetworkChannel =>
try {
c.publishEventMessage(event)
} catch {
case _: IOException =>
toDel += c
}
case c: ConsoleChannel => c.publishEventMessage(event)
case c: NetworkChannel => tryTo(c.publishEventMessage(event), c)
}
}
toDel.toList match {
case Nil => // do nothing
case xs =>
lock.synchronized {
channelBufferLock.synchronized {
channelBuffer --= xs
()
}
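Several hunks above collapse `match { case Some(x) => x; case None => default }` blocks into `Option` combinators (`getOrElse`, `forall`, `contains`). A small standalone sketch of the equivalences relied on (values here are illustrative, not sbt settings):

```scala
object OptionCombinators {
  def main(args: Array[String]): Unit = {
    val port: Option[Int] = None

    // match-based extraction ...
    val p1 = port match { case Some(x) => x; case None => 5001 }
    // ... is equivalent to getOrElse
    assert(p1 == port.getOrElse(5001))

    // `forall` is true when the Option is empty OR the predicate holds,
    // matching "entry.channelName.isEmpty || entry.channelName == Some(c.name)"
    val channelName: Option[String] = Some("console0")
    assert(channelName.forall(_ == "console0"))
    assert(Option.empty[String].forall(_ == "console0"))

    // `contains` is true only when the Option holds exactly that value,
    // matching "entry.channelName == Some(c.name)"
    assert(channelName.contains("console0"))
    assert(!Option.empty[String].contains("console0"))
    println("ok")
  }
}
```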


@@ -51,7 +51,7 @@ object ConsoleProject {
options,
initCommands,
cleanupCommands
)(Some(unit.loader), bindings)
)(Some(unit.loader), bindings).get
}
()
}


@@ -36,12 +36,15 @@ private[sbt] abstract class BackgroundJob {
}
def shutdown(): Unit
// this should be true on construction and stay true until
// the job is complete
def isRunning(): Boolean
// called after stop or on spontaneous exit, closing the result
// removes the listener
def onStop(listener: () => Unit)(implicit ex: ExecutionContext): Closeable
// do we need this or is the spawning task good enough?
// def tags: SomeType
}
@@ -57,8 +60,8 @@ private[sbt] abstract class AbstractBackgroundJobService extends BackgroundJobSe
private val serviceTempDir = IO.createTemporaryDirectory
// hooks for sending start/stop events
protected def onAddJob(job: JobHandle): Unit = {}
protected def onRemoveJob(job: JobHandle): Unit = {}
protected def onAddJob(@deprecated("unused", "") job: JobHandle): Unit = ()
protected def onRemoveJob(@deprecated("unused", "") job: JobHandle): Unit = ()
// this mutable state could conceptually go on State except
// that then every task that runs a background job would have


@@ -8,11 +8,18 @@
package sbt
package internal
import sbt.internal.util.{ complete, AttributeEntry, AttributeKey, LineRange, MessageOnlyException, RangePosition, Settings }
import sbt.internal.util.{
AttributeEntry,
AttributeKey,
LineRange,
MessageOnlyException,
RangePosition,
Settings
}
import java.io.File
import compiler.{ Eval, EvalImports }
import complete.DefaultParsers.validID
import sbt.internal.util.complete.DefaultParsers.validID
import Def.{ ScopedKey, Setting }
import Scope.GlobalScope
import sbt.internal.parser.SbtParser
@@ -37,7 +44,9 @@ private[sbt] object EvaluateConfigurations {
/**
* This represents the parsed expressions in a build sbt, as well as where they were defined.
*/
private[this] final class ParsedFile(val imports: Seq[(String, Int)], val definitions: Seq[(String, LineRange)], val settings: Seq[(String, LineRange)])
private[this] final class ParsedFile(val imports: Seq[(String, Int)],
val definitions: Seq[(String, LineRange)],
val settings: Seq[(String, LineRange)])
/** The keywords we look for when classifying a string as a definition. */
private[this] val DefinitionKeywords = Seq("lazy val ", "def ", "val ")
@@ -48,18 +57,24 @@ private[sbt] object EvaluateConfigurations {
* return a parsed, compiled + evaluated [[LoadedSbtFile]]. The result has
* raw sbt-types that can be accessed and used.
*/
def apply(eval: Eval, srcs: Seq[File], imports: Seq[String]): LazyClassLoaded[LoadedSbtFile] =
{
val loadFiles = srcs.sortBy(_.getName) map { src => evaluateSbtFile(eval, src, IO.readLines(src), imports, 0) }
loader => (LoadedSbtFile.empty /: loadFiles) { (loaded, load) => loaded merge load(loader) }
def apply(eval: Eval, srcs: Seq[File], imports: Seq[String]): LazyClassLoaded[LoadedSbtFile] = {
val loadFiles = srcs.sortBy(_.getName) map { src =>
evaluateSbtFile(eval, src, IO.readLines(src), imports, 0)
}
loader =>
(LoadedSbtFile.empty /: loadFiles) { (loaded, load) =>
loaded merge load(loader)
}
}
/**
* Reads a given .sbt file and evaluates it into a sequence of setting values.
*
* Note: This ignores any non-Setting[_] values in the file.
*/
def evaluateConfiguration(eval: Eval, src: File, imports: Seq[String]): LazyClassLoaded[Seq[Setting[_]]] =
def evaluateConfiguration(eval: Eval,
src: File,
imports: Seq[String]): LazyClassLoaded[Seq[Setting[_]]] =
evaluateConfiguration(eval, src, IO.readLines(src), imports, 0)
/**
@@ -68,13 +83,16 @@ private[sbt] object EvaluateConfigurations {
*
* @param builtinImports The set of import statements to add to those parsed in the .sbt file.
*/
private[this] def parseConfiguration(file: File, lines: Seq[String], builtinImports: Seq[String], offset: Int): ParsedFile =
{
val (importStatements, settingsAndDefinitions) = splitExpressions(file, lines)
val allImports = builtinImports.map(s => (s, -1)) ++ addOffset(offset, importStatements)
val (definitions, settings) = splitSettingsDefinitions(addOffsetToRange(offset, settingsAndDefinitions))
new ParsedFile(allImports, definitions, settings)
}
private[this] def parseConfiguration(file: File,
lines: Seq[String],
builtinImports: Seq[String],
offset: Int): ParsedFile = {
val (importStatements, settingsAndDefinitions) = splitExpressions(file, lines)
val allImports = builtinImports.map(s => (s, -1)) ++ addOffset(offset, importStatements)
val (definitions, settings) = splitSettingsDefinitions(
addOffsetToRange(offset, settingsAndDefinitions))
new ParsedFile(allImports, definitions, settings)
}
/**
* Evaluates a parsed sbt configuration file.
@@ -86,11 +104,15 @@ private[sbt] object EvaluateConfigurations {
*
* @return Just the Setting[_] instances defined in the .sbt file.
*/
def evaluateConfiguration(eval: Eval, file: File, lines: Seq[String], imports: Seq[String], offset: Int): LazyClassLoaded[Seq[Setting[_]]] =
{
val l = evaluateSbtFile(eval, file, lines, imports, offset)
loader => l(loader).settings
}
def evaluateConfiguration(eval: Eval,
file: File,
lines: Seq[String],
imports: Seq[String],
offset: Int): LazyClassLoaded[Seq[Setting[_]]] = {
val l = evaluateSbtFile(eval, file, lines, imports, offset)
loader =>
l(loader).settings
}
/**
* Evaluates a parsed sbt configuration file.
@@ -102,31 +124,38 @@ private[sbt] object EvaluateConfigurations {
* @return A function which can take an sbt classloader and return the raw types/configuration
* which was compiled/parsed for the given file.
*/
private[sbt] def evaluateSbtFile(eval: Eval, file: File, lines: Seq[String], imports: Seq[String], offset: Int): LazyClassLoaded[LoadedSbtFile] =
{
// TODO - Store the file on the LoadedSbtFile (or the parent dir) so we can accurately do
// detection for which project project manipulations should be applied.
val name = file.getPath
val parsed = parseConfiguration(file, lines, imports, offset)
val (importDefs, definitions) =
if (parsed.definitions.isEmpty) (Nil, DefinedSbtValues.empty) else {
val definitions = evaluateDefinitions(eval, name, parsed.imports, parsed.definitions, Some(file))
val imp = BuildUtil.importAllRoot(definitions.enclosingModule :: Nil)
(imp, DefinedSbtValues(definitions))
}
val allImports = importDefs.map(s => (s, -1)) ++ parsed.imports
val dslEntries = parsed.settings map {
case (dslExpression, range) =>
evaluateDslEntry(eval, name, allImports, dslExpression, range)
private[sbt] def evaluateSbtFile(eval: Eval,
file: File,
lines: Seq[String],
imports: Seq[String],
offset: Int): LazyClassLoaded[LoadedSbtFile] = {
// TODO - Store the file on the LoadedSbtFile (or the parent dir) so we can accurately do
// detection for which project project manipulations should be applied.
val name = file.getPath
val parsed = parseConfiguration(file, lines, imports, offset)
val (importDefs, definitions) =
if (parsed.definitions.isEmpty) (Nil, DefinedSbtValues.empty)
else {
val definitions =
evaluateDefinitions(eval, name, parsed.imports, parsed.definitions, Some(file))
val imp = BuildUtil.importAllRoot(definitions.enclosingModule :: Nil)
(imp, DefinedSbtValues(definitions))
}
eval.unlinkDeferred()
// Tracks all the files we generated from evaluating the sbt file.
val allGeneratedFiles = (definitions.generated ++ dslEntries.flatMap(_.generated))
loader => {
val projects = definitions.values(loader).flatMap {
case p: CompositeProject => p.componentProjects.map(resolveBase(file.getParentFile, _))
case _ => Nil
}
val allImports = importDefs.map(s => (s, -1)) ++ parsed.imports
val dslEntries = parsed.settings map {
case (dslExpression, range) =>
evaluateDslEntry(eval, name, allImports, dslExpression, range)
}
eval.unlinkDeferred()
// Tracks all the files we generated from evaluating the sbt file.
val allGeneratedFiles = (definitions.generated ++ dslEntries.flatMap(_.generated))
loader =>
{
val projects =
definitions.values(loader).flatMap {
case p: CompositeProject => p.componentProjects.map(resolveBase(file.getParentFile, _))
case _ => Nil
}
val (settingsRaw, manipulationsRaw) =
dslEntries map (_.result apply loader) partition {
case DslEntry.ProjectSettings(_) => true
@@ -140,9 +169,14 @@ private[sbt] object EvaluateConfigurations {
case DslEntry.ProjectManipulation(f) => f
}
// TODO -get project manipulations.
new LoadedSbtFile(settings, projects, importDefs, manipulations, definitions, allGeneratedFiles)
new LoadedSbtFile(settings,
projects,
importDefs,
manipulations,
definitions,
allGeneratedFiles)
}
}
}
/** move a project to be relative to this file after we've evaluated it. */
private[this] def resolveBase(f: File, p: Project) = p.copy(base = IO.resolve(f, p.base))
@@ -173,11 +207,19 @@ private[sbt] object EvaluateConfigurations {
* @return A method that given an sbt classloader, can return the actual [[sbt.internal.DslEntry]] defined by
* the expression, and the sequence of .class files generated.
*/
private[sbt] def evaluateDslEntry(eval: Eval, name: String, imports: Seq[(String, Int)], expression: String, range: LineRange): TrackedEvalResult[DslEntry] = {
private[sbt] def evaluateDslEntry(eval: Eval,
name: String,
imports: Seq[(String, Int)],
expression: String,
range: LineRange): TrackedEvalResult[DslEntry] = {
// TODO - Should we try to namespace these between.sbt files? IF they hash to the same value, they may actually be
// exactly the same setting, so perhaps we don't care?
val result = try {
eval.eval(expression, imports = new EvalImports(imports, name), srcName = name, tpeName = Some(SettingsDefinitionName), line = range.start)
eval.eval(expression,
imports = new EvalImports(imports, name),
srcName = name,
tpeName = Some(SettingsDefinitionName),
line = range.start)
} catch {
case e: sbt.compiler.EvalException => throw new MessageOnlyException(e.getMessage)
}
@@ -206,7 +248,11 @@ private[sbt] object EvaluateConfigurations {
*/
// Build DSL now includes non-Setting[_] type settings.
// Note: This method is used by the SET command, so we may want to evaluate that sucker a bit.
def evaluateSetting(eval: Eval, name: String, imports: Seq[(String, Int)], expression: String, range: LineRange): LazyClassLoaded[Seq[Setting[_]]] =
def evaluateSetting(eval: Eval,
name: String,
imports: Seq[(String, Int)],
expression: String,
range: LineRange): LazyClassLoaded[Seq[Setting[_]]] =
evaluateDslEntry(eval, name, imports, expression, range).result andThen {
case DslEntry.ProjectSettings(values) => values
case _ => Nil
@@ -216,44 +262,62 @@ private[sbt] object EvaluateConfigurations {
* Splits a set of lines into (imports, expressions). That is,
* anything on the right of the tuple is a scala expression (definition or setting).
*/
private[sbt] def splitExpressions(file: File, lines: Seq[String]): (Seq[(String, Int)], Seq[(String, LineRange)]) =
{
val split = SbtParser(file, lines)
// TODO - Look at pulling the parsed expression trees from the SbtParser and stitch them back into a different
// scala compiler rather than re-parsing.
(split.imports, split.settings)
}
private[sbt] def splitExpressions(
file: File,
lines: Seq[String]): (Seq[(String, Int)], Seq[(String, LineRange)]) = {
val split = SbtParser(file, lines)
// TODO - Look at pulling the parsed expression trees from the SbtParser and stitch them back into a different
// scala compiler rather than re-parsing.
(split.imports, split.settings)
}
private[this] def splitSettingsDefinitions(lines: Seq[(String, LineRange)]): (Seq[(String, LineRange)], Seq[(String, LineRange)]) =
lines partition { case (line, range) => isDefinition(line) }
private[this] def splitSettingsDefinitions(
lines: Seq[(String, LineRange)]): (Seq[(String, LineRange)], Seq[(String, LineRange)]) =
lines partition { case (line, _) => isDefinition(line) }
private[this] def isDefinition(line: String): Boolean =
{
val trimmed = line.trim
DefinitionKeywords.exists(trimmed startsWith _)
}
private[this] def isDefinition(line: String): Boolean = {
val trimmed = line.trim
DefinitionKeywords.exists(trimmed startsWith _)
}
private[this] def extractedValTypes: Seq[String] =
Seq(classOf[CompositeProject], classOf[InputKey[_]], classOf[TaskKey[_]], classOf[SettingKey[_]]).map(_.getName)
Seq(classOf[CompositeProject],
classOf[InputKey[_]],
classOf[TaskKey[_]],
classOf[SettingKey[_]])
.map(_.getName)
private[this] def evaluateDefinitions(eval: Eval, name: String, imports: Seq[(String, Int)], definitions: Seq[(String, LineRange)], file: Option[File]): compiler.EvalDefinitions =
{
val convertedRanges = definitions.map { case (s, r) => (s, r.start to r.end) }
eval.evalDefinitions(convertedRanges, new EvalImports(imports, name), name, file, extractedValTypes)
}
private[this] def evaluateDefinitions(eval: Eval,
name: String,
imports: Seq[(String, Int)],
definitions: Seq[(String, LineRange)],
file: Option[File]): compiler.EvalDefinitions = {
val convertedRanges = definitions.map { case (s, r) => (s, r.start to r.end) }
eval.evalDefinitions(convertedRanges,
new EvalImports(imports, name),
name,
file,
extractedValTypes)
}
}
object Index {
def taskToKeyMap(data: Settings[Scope]): Map[Task[_], ScopedKey[Task[_]]] =
{
// AttributeEntry + the checked type test 'value: Task[_]' ensures that the cast is correct.
// (scalac couldn't determine that 'key' is of type AttributeKey[Task[_]] on its own and a type match still required the cast)
val pairs = for (scope <- data.scopes; AttributeEntry(key, value: Task[_]) <- data.data(scope).entries) yield (value, ScopedKey(scope, key.asInstanceOf[AttributeKey[Task[_]]])) // unclear why this cast is needed even with a type test in the above filter
pairs.toMap[Task[_], ScopedKey[Task[_]]]
}
def taskToKeyMap(data: Settings[Scope]): Map[Task[_], ScopedKey[Task[_]]] = {
val pairs = data.scopes flatMap (scope =>
data.data(scope).entries collect {
case AttributeEntry(key, value: Task[_]) =>
(value, ScopedKey(scope, key.asInstanceOf[AttributeKey[Task[_]]]))
})
pairs.toMap[Task[_], ScopedKey[Task[_]]]
}
def allKeys(settings: Seq[Setting[_]]): Set[ScopedKey[_]] =
settings.flatMap(s => if (s.key.key.isLocal) Nil else s.key +: s.dependencies).filter(!_.key.isLocal).toSet
settings
.flatMap(s => if (s.key.key.isLocal) Nil else s.key +: s.dependencies)
.filter(!_.key.isLocal)
.toSet
def attributeKeys(settings: Settings[Scope]): Set[AttributeKey[_]] =
settings.data.values.flatMap(_.keys).toSet[AttributeKey[_]]
@@ -261,30 +325,36 @@ object Index {
def stringToKeyMap(settings: Set[AttributeKey[_]]): Map[String, AttributeKey[_]] =
stringToKeyMap0(settings)(_.label)
private[this] def stringToKeyMap0(settings: Set[AttributeKey[_]])(label: AttributeKey[_] => String): Map[String, AttributeKey[_]] =
{
val multiMap = settings.groupBy(label)
val duplicates = multiMap collect { case (k, xs) if xs.size > 1 => (k, xs.map(_.manifest)) } collect { case (k, xs) if xs.size > 1 => (k, xs) }
if (duplicates.isEmpty)
multiMap.collect { case (k, v) if validID(k) => (k, v.head) } toMap
else
sys.error(duplicates map { case (k, tps) => "'" + k + "' (" + tps.mkString(", ") + ")" } mkString ("Some keys were defined with the same name but different types: ", ", ", ""))
private[this] def stringToKeyMap0(settings: Set[AttributeKey[_]])(
label: AttributeKey[_] => String): Map[String, AttributeKey[_]] = {
val multiMap = settings.groupBy(label)
val duplicates = multiMap collect { case (k, xs) if xs.size > 1 => (k, xs.map(_.manifest)) } collect {
case (k, xs) if xs.size > 1 => (k, xs)
}
if (duplicates.isEmpty)
multiMap.collect { case (k, v) if validID(k) => (k, v.head) } toMap
else
sys.error(
duplicates map { case (k, tps) => "'" + k + "' (" + tps.mkString(", ") + ")" } mkString ("Some keys were defined with the same name but different types: ", ", ", ""))
}
private[this]type TriggerMap = collection.mutable.HashMap[Task[_], Seq[Task[_]]]
private[this] type TriggerMap = collection.mutable.HashMap[Task[_], Seq[Task[_]]]
def triggers(ss: Settings[Scope]): Triggers[Task] =
{
val runBefore = new TriggerMap
val triggeredBy = new TriggerMap
for ((_, amap) <- ss.data; AttributeEntry(_, value: Task[_]) <- amap.entries) {
val as = value.info.attributes
update(runBefore, value, as get Keys.runBefore)
update(triggeredBy, value, as get Keys.triggeredBy)
def triggers(ss: Settings[Scope]): Triggers[Task] = {
val runBefore = new TriggerMap
val triggeredBy = new TriggerMap
ss.data.values foreach (
_.entries foreach {
case AttributeEntry(_, value: Task[_]) =>
val as = value.info.attributes
update(runBefore, value, as get Keys.runBefore)
update(triggeredBy, value, as get Keys.triggeredBy)
case _ => ()
}
val onComplete = Keys.onComplete in GlobalScope get ss getOrElse { () => () }
new Triggers[Task](runBefore, triggeredBy, map => { onComplete(); map })
}
)
val onComplete = Keys.onComplete in GlobalScope get ss getOrElse (() => ())
new Triggers[Task](runBefore, triggeredBy, map => { onComplete(); map })
}
private[this] def update(map: TriggerMap, base: Task[_], tasksOpt: Option[Seq[Task[_]]]): Unit =
for (tasks <- tasksOpt; task <- tasks)
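`stringToKeyMap0` above reports keys whose labels clash by grouping on the label and keeping only groups larger than one. The same duplicate-detection pattern in isolation (types simplified to `String`; names are illustrative):

```scala
object DuplicateLabels {
  // Group values by label; any group with more than one member is a clash.
  def duplicates[A](items: Seq[A])(label: A => String): Map[String, Seq[A]] =
    items.groupBy(label).collect { case (k, xs) if xs.size > 1 => (k, xs) }

  def main(args: Array[String]): Unit = {
    val dups = duplicates(Seq("name", "name", "version"))(identity)
    assert(dups.keySet == Set("name")) // "name" appears twice
    assert(!dups.contains("version")) // unique labels are dropped
    println("ok")
  }
}
```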


@@ -94,7 +94,7 @@ object GlobalPlugin {
val nv = nodeView(state, str, roots)
val config = EvaluateTask.extractedTaskConfig(Project.extract(state), structure, state)
val (newS, result) = runTask(t, state, str, structure.index.triggers, config)(nv)
(newS, processResult(result, newS.log))
(newS, processResult2(result))
}
}
val globalPluginSettings = Project.inScope(Scope.GlobalScope in LocalRootProject)(


@@ -37,18 +37,12 @@ private[sbt] object LibraryManagement {
): UpdateReport = {
/* Resolve the module settings from the inputs. */
def resolve(inputs: UpdateInputs): UpdateReport = {
def resolve: UpdateReport = {
import sbt.util.ShowLines._
log.info(s"Updating $label...")
val reportOrUnresolved: Either[UnresolvedWarning, UpdateReport] =
//try {
lm.update(module, updateConfig, uwConfig, log)
// } catch {
// case e: Throwable =>
// e.printStackTrace
// throw e
// }
val report = reportOrUnresolved match {
case Right(report0) => report0
case Left(unresolvedWarning) =>
@@ -96,12 +90,12 @@ private[sbt] object LibraryManagement {
import sbt.librarymanagement.LibraryManagementCodec._
val cachedResolve = Tracked.lastOutput[UpdateInputs, UpdateReport](cache) {
case (_, Some(out)) if upToDate(inChanged, out) => markAsCached(out)
case _ => resolve(updateInputs)
case _ => resolve
}
import scala.util.control.Exception.catching
catching(classOf[NullPointerException], classOf[OutOfMemoryError])
.withApply { t =>
val resolvedAgain = resolve(updateInputs)
val resolvedAgain = resolve
val culprit = t.getClass.getSimpleName
log.warn(s"Update task caching failed due to $culprit.")
log.warn("Report the following output to sbt:")


@@ -61,7 +61,7 @@ private[sbt] object Load {
val globalBase = getGlobalBase(state)
val base = baseDirectory.getCanonicalFile
val rawConfig = defaultPreGlobal(state, base, globalBase, log)
val config0 = defaultWithGlobal(state, base, rawConfig, globalBase, log)
val config0 = defaultWithGlobal(state, base, rawConfig, globalBase)
val config =
if (isPlugin) enableSbtPlugin(config0) else config0.copy(extraBuilds = topLevelExtras)
(base, config)
@@ -109,7 +109,7 @@
javaHome = None,
scalac
)
val evalPluginDef = EvaluateTask.evalPluginDef(log) _
val evalPluginDef: (BuildStructure, State) => PluginData = EvaluateTask.evalPluginDef _
val delegates = defaultDelegates
val pluginMgmt = PluginManagement(loader)
val inject = InjectSettings(injectGlobal(state), Nil, const(Nil))
@@ -145,7 +145,6 @@
base: File,
rawConfig: LoadBuildConfiguration,
globalBase: File,
log: Logger
): LoadBuildConfiguration = {
val globalPluginsDir = getGlobalPluginsDirectory(state, globalBase)
val withGlobal = loadGlobal(state, base, globalPluginsDir, rawConfig)
@@ -208,7 +207,6 @@
project => projectInherit(lb, project),
(project, config) => configInherit(lb, project, config, rootProject),
task => task.extend,
(project, extra) => Nil
)
}


@@ -15,6 +15,7 @@ import Keys.{ logLevel, logManager, persistLogLevel, persistTraceLevel, sLog, tr
import scala.Console.{ BLUE, RESET }
import sbt.internal.util.{
AttributeKey,
ConsoleAppender,
ConsoleOut,
Settings,
SuppressedTraceContext,
@@ -105,7 +106,7 @@ object LogManager {
def backgroundLog(data: Settings[Scope], state: State, task: ScopedKey[_]): ManagedLogger = {
val console = screen(task, state)
LogManager.backgroundLog(data, state, task, console, relay(()), extra(task).toList)
LogManager.backgroundLog(data, state, task, console, relay(()))
}
}
@@ -191,7 +192,6 @@ object LogManager {
console: Appender,
/* TODO: backed: Appender,*/
relay: Appender,
extra: List[Appender]
): ManagedLogger = {
val scope = task.scope
val screenLevel = getOr(logLevel.key, data, scope, state, Level.Info)
@@ -258,7 +258,7 @@
private[this] def slog: Logger =
Option(ref.get) getOrElse sys.error("Settings logger used after project was loaded.")
override val ansiCodesSupported = slog.ansiCodesSupported
override val ansiCodesSupported = ConsoleAppender.formatEnabledInEnv
override def trace(t: => Throwable) = slog.trace(t)
override def success(message: => String) = slog.success(message)
override def log(level: Level.Value, message: => String) = slog.log(level, message)


@@ -47,6 +47,8 @@ object PluginDiscovery {
"sbt.plugins.IvyPlugin" -> sbt.plugins.IvyPlugin,
"sbt.plugins.JvmPlugin" -> sbt.plugins.JvmPlugin,
"sbt.plugins.CorePlugin" -> sbt.plugins.CorePlugin,
"sbt.ScriptedPlugin" -> sbt.ScriptedPlugin,
"sbt.plugins.SbtPlugin" -> sbt.plugins.SbtPlugin,
"sbt.plugins.JUnitXmlReportPlugin" -> sbt.plugins.JUnitXmlReportPlugin,
"sbt.plugins.Giter8TemplatePlugin" -> sbt.plugins.Giter8TemplatePlugin
)
@@ -65,7 +67,7 @@
new DiscoveredNames(discover[AutoPlugin], discover[BuildDef])
}
// TODO: for 0.14.0, consider consolidating into a single file, which would make the classpath search 4x faster
// TODO: consider consolidating into a single file, which would make the classpath search 4x faster
/** Writes discovered module `names` to zero or more files in `dir` as per [[writeDescriptor]] and returns the list of files written. */
def writeDescriptors(names: DiscoveredNames, dir: File): Seq[File] = {
import Paths._


@@ -57,7 +57,7 @@ private[sbt] class PluginsDebug(
if (possible.nonEmpty) {
val explained = possible.map(explainPluginEnable)
val possibleString =
if (explained.size > 1)
if (explained.lengthCompare(1) > 0)
explained.zipWithIndex
.map { case (s, i) => s"$i. $s" }
.mkString(s"Multiple plugins are available that can provide $notFoundKey:\n", "\n", "")
@@ -111,7 +111,7 @@
}
private[this] def multi(strs: Seq[String]): String =
strs.mkString(if (strs.size > 4) "\n\t" else ", ")
strs.mkString(if (strs.lengthCompare(4) > 0) "\n\t" else ", ")
}
private[sbt] object PluginsDebug {
@@ -377,7 +377,7 @@ private[sbt] object PluginsDebug {
def explainPluginEnable(ps: PluginEnable): String =
ps match {
case PluginRequirements(plugin,
context,
_,
blockingExcludes,
enablingPlugins,
extraEnabledPlugins,
@@ -393,9 +393,8 @@
note(willRemove(plugin, toBeRemoved.toList)) ::
Nil
parts.filterNot(_.isEmpty).mkString("\n")
case PluginImpossible(plugin, context, contradictions) =>
pluginImpossible(plugin, contradictions)
case PluginActivated(plugin, context) => s"Plugin ${plugin.label} already activated."
case PluginImpossible(plugin, _, contradictions) => pluginImpossible(plugin, contradictions)
case PluginActivated(plugin, _) => s"Plugin ${plugin.label} already activated."
}
/**


@@ -26,7 +26,7 @@ class RelayAppender(name: String)
val level = ConsoleAppender.toLevel(event.getLevel)
val message = event.getMessage
message match {
case o: ObjectMessage => appendEvent(level, o.getParameter)
case o: ObjectMessage => appendEvent(o.getParameter)
case p: ParameterizedMessage => appendLog(level, p.getFormattedMessage)
case r: RingBufferLogEvent => appendLog(level, r.getFormattedMessage)
case _ => appendLog(level, message.toString)
@@ -35,7 +35,7 @@ class RelayAppender(name: String)
def appendLog(level: Level.Value, message: => String): Unit = {
exchange.publishEventMessage(LogEvent(level.toString, message))
}
def appendEvent(level: Level.Value, event: AnyRef): Unit =
def appendEvent(event: AnyRef): Unit =
event match {
case x: StringEvent => {
import JsonProtocol._


@@ -15,7 +15,7 @@ import sbt.librarymanagement.Configuration
import Project._
import Def.{ ScopedKey, Setting }
import Scope.Global
import Types.{ const, idFun }
import Types.idFun
import complete._
import DefaultParsers._
@@ -64,11 +64,10 @@ private[sbt] object SettingCompletions {
setResult(session, r, redefined)
}
/** Implementation of the `set` command that will reload the current project with `settings` appended to the current settings. */
def setThis(s: State,
extracted: Extracted,
settings: Seq[Def.Setting[_]],
arg: String): SetResult = {
/** Implementation of the `set` command that will reload the current project with `settings`
* appended to the current settings.
*/
def setThis(extracted: Extracted, settings: Seq[Def.Setting[_]], arg: String): SetResult = {
import extracted._
val append =
Load.transformSettings(Load.projectScope(currentRef), currentRef.build, rootProject, settings)
@@ -82,16 +81,19 @@ private[sbt] object SettingCompletions {
private[this] def setResult(
session: SessionSettings,
r: Relation[ScopedKey[_], ScopedKey[_]],
redefined: Seq[Setting[_]])(implicit show: Show[ScopedKey[_]]): SetResult = {
redefined: Seq[Setting[_]],
)(implicit show: Show[ScopedKey[_]]): SetResult = {
val redefinedKeys = redefined.map(_.key).toSet
val affectedKeys = redefinedKeys.flatMap(r.reverse)
def summary(verbose: Boolean): String = setSummary(redefinedKeys, affectedKeys, verbose)
new SetResult(session, summary(true), summary(false))
}
private[this] def setSummary(redefined: Set[ScopedKey[_]],
affected: Set[ScopedKey[_]],
verbose: Boolean)(implicit display: Show[ScopedKey[_]]): String = {
private[this] def setSummary(
redefined: Set[ScopedKey[_]],
affected: Set[ScopedKey[_]],
verbose: Boolean,
)(implicit display: Show[ScopedKey[_]]): String = {
val QuietLimit = 3
def strings(in: Set[ScopedKey[_]]): Seq[String] = in.toSeq.map(sk => display.show(sk)).sorted
def lines(in: Seq[String]): (String, Boolean) =
@@ -129,17 +131,17 @@
* when there are fewer choices or tab is pressed multiple times.
* The last part of the completion will generate a template for the value or function literal that will initialize the setting or task.
*/
def settingParser(settings: Settings[Scope],
rawKeyMap: Map[String, AttributeKey[_]],
context: ResolvedProject): Parser[String] = {
val keyMap
: Map[String, AttributeKey[_]] = rawKeyMap.map { case (k, v) => (keyScalaID(k), v) }.toMap
def inputScopedKey(pred: AttributeKey[_] => Boolean): Parser[ScopedKey[_]] =
scopedKeyParser(keyMap.filter { case (_, k) => pred(k) }, settings, context)
def settingParser(
settings: Settings[Scope],
rawKeyMap: Map[String, AttributeKey[_]],
context: ResolvedProject,
): Parser[String] = {
val keyMap: Map[String, AttributeKey[_]] =
rawKeyMap.map { case (k, v) => (keyScalaID(k), v) }.toMap
val full = for {
defineKey <- scopedKeyParser(keyMap, settings, context)
a <- assign(defineKey)
_ <- valueParser(defineKey, a, inputScopedKey(keyFilter(defineKey.key)))
_ <- valueParser(defineKey, a)
} yield
() // parser is currently only for completion and the parsed data structures are not used
@@ -167,9 +169,7 @@
* Parser for the initialization expression for the assignment method `assign` on the key `sk`.
* `scopedKeyP` is used to parse and complete the input keys for an initialization that depends on other keys.
*/
def valueParser(sk: ScopedKey[_],
assign: Assign.Value,
scopedKeyP: Parser[ScopedKey[_]]): Parser[Seq[ScopedKey[_]]] = {
def valueParser(sk: ScopedKey[_], assign: Assign.Value): Parser[Seq[ScopedKey[_]]] = {
val fullTypeString = keyTypeString(sk.key)
val typeString = if (assignNoAppend(assign)) fullTypeString else "..."
if (assign == Assign.Update) {
@@ -181,14 +181,6 @@
}
}
/**
* For a setting definition `definingKey <<= (..., in, ...) { ... }`,
* `keyFilter(definingKey)(in)` returns true when `in` is an allowed input for `definingKey` based on whether they are settings or not.
* For example, if `definingKey` is for a setting, `in` may only be a setting itself.
*/
def keyFilter(definingKey: AttributeKey[_]): AttributeKey[_] => Boolean =
if (isSetting(definingKey)) isSetting _ else isTaskOrSetting _
/**
* Parser for a Scope for a `key` given the current project `context` and evaluated `settings`.
* The completions are restricted to be more useful. Currently, this parser will suggest
@@ -202,17 +194,20 @@
val definedScopes = data.toSeq flatMap {
case (scope, attrs) => if (attrs contains key) scope :: Nil else Nil
}
scope(key, allScopes, definedScopes, context)
scope(allScopes, definedScopes, context)
}
private[this] def scope(key: AttributeKey[_],
allScopes: Seq[Scope],
definedScopes: Seq[Scope],
context: ResolvedProject): Parser[Scope] = {
def axisParser[T](axis: Scope => ScopeAxis[T],
name: T => String,
description: T => Option[String],
label: String): Parser[ScopeAxis[T]] = {
private[this] def scope(
allScopes: Seq[Scope],
definedScopes: Seq[Scope],
context: ResolvedProject,
): Parser[Scope] = {
def axisParser[T](
axis: Scope => ScopeAxis[T],
name: T => String,
description: T => Option[String],
label: String,
): Parser[ScopeAxis[T]] = {
def getChoice(s: Scope): Seq[(String, T)] = axis(s) match {
case Select(t) => (name(t), t) :: Nil
case _ => Nil
@@ -220,19 +215,23 @@
def getChoices(scopes: Seq[Scope]): Map[String, T] = scopes.flatMap(getChoice).toMap
val definedChoices: Set[String] =
definedScopes.flatMap(s => axis(s).toOption.map(name)).toSet
val fullChoices: Map[String, T] = getChoices(allScopes.toSeq)
val fullChoices: Map[String, T] = getChoices(allScopes)
val completions = fixedCompletions { (seen, level) =>
completeScope(seen, level, definedChoices, fullChoices)(description).toSet
}
Act.optionalAxis(inParser ~> token(Space) ~> token(scalaID(fullChoices, label), completions),
This)
Act.optionalAxis(
inParser ~> token(Space) ~> token(scalaID(fullChoices, label), completions),
This,
)
}
val configurations: Map[String, Configuration] =
context.configurations.map(c => (configScalaID(c.name), c)).toMap
val configParser = axisParser[ConfigKey](_.config,
c => configScalaID(c.name),
ck => configurations.get(ck.name).map(_.description),
"configuration")
val configParser = axisParser[ConfigKey](
_.config,
c => configScalaID(c.name),
ck => configurations.get(ck.name).map(_.description),
"configuration",
)
val taskParser =
axisParser[AttributeKey[_]](_.task, k => keyScalaID(k.label), _.description, "task")
val nonGlobal = (configParser ~ taskParser) map { case (c, t) => Scope(This, c, t, Zero) }
@@ -242,8 +241,8 @@
/** Parser for the assignment method (such as `:=`) for defining `key`. */
def assign(key: ScopedKey[_]): Parser[Assign.Value] = {
val completions = fixedCompletions { (seen, level) =>
completeAssign(seen, level, key).toSet
val completions = fixedCompletions { (seen, _) =>
completeAssign(seen, key).toSet
}
val identifier = Act.filterStrings(Op, Assign.values.map(_.toString), "assignment method") map Assign.withName
token(Space) ~> token(optionallyQuoted(identifier), completions)
@@ -267,7 +266,7 @@
* Completions for an assignment method for `key` given the tab completion `level` and existing partial string `seen`.
* This will filter possible assignment methods based on the underlying type of `key`, so that only `<<=` is shown for input tasks, for example.
*/
def completeAssign(seen: String, level: Int, key: ScopedKey[_]): Seq[Completion] = {
def completeAssign(seen: String, key: ScopedKey[_]): Seq[Completion] = {
val allowed: Iterable[Assign.Value] =
if (appendable(key.key)) Assign.values
else assignNoAppend
@@ -284,7 +283,7 @@
prominentCutoff: Int,
detailLimit: Int): Seq[Completion] =
completeSelectDescribed(seen, level, keys, detailLimit)(_.description) {
case (k, v) => v.rank <= prominentCutoff
case (_, v) => v.rank <= prominentCutoff
}
def completeScope[T](
@ -293,17 +292,17 @@ private[sbt] object SettingCompletions {
definedChoices: Set[String],
allChoices: Map[String, T])(description: T => Option[String]): Seq[Completion] =
completeSelectDescribed(seen, level, allChoices, 10)(description) {
case (k, v) => definedChoices(k)
case (k, _) => definedChoices(k)
}
def completeSelectDescribed[T](seen: String, level: Int, all: Map[String, T], detailLimit: Int)(
description: T => Option[String])(prominent: (String, T) => Boolean): Seq[Completion] = {
val applicable = all.toSeq.filter { case (k, v) => k startsWith seen }
val applicable = all.toSeq.filter { case (k, _) => k startsWith seen }
val prominentOnly = applicable filter { case (k, v) => prominent(k, v) }
val showAll = (level >= 3) || (level == 2 && prominentOnly.size <= detailLimit) || prominentOnly.isEmpty
val showAll = (level >= 3) || (level == 2 && prominentOnly.lengthCompare(detailLimit) <= 0) || prominentOnly.isEmpty
val showKeys = if (showAll) applicable else prominentOnly
val showDescriptions = (level >= 2) || (showKeys.size <= detailLimit)
val showDescriptions = (level >= 2) || showKeys.lengthCompare(detailLimit) <= 0
completeDescribed(seen, showDescriptions, showKeys)(s => description(s).toList.mkString)
}
def completeDescribed[T](seen: String, showDescriptions: Boolean, in: Seq[(String, T)])(
@@ -315,14 +314,11 @@
val withDescriptions = in map { case (id, key) => (id, description(key)) }
val padded = CommandUtil.aligned("", " ", withDescriptions)
(padded, in).zipped.map {
case (line, (id, key)) =>
case (line, (id, _)) =>
Completion.tokenDisplay(append = appendString(id), display = line + "\n")
}
} else
in map {
case (id, key) =>
Completion.tokenDisplay(display = id, append = appendString(id))
}
in map { case (id, _) => Completion.tokenDisplay(display = id, append = appendString(id)) }
}
/**
@ -364,18 +360,6 @@ private[sbt] object SettingCompletions {
keyType(key)(mfToString, mfToString, mfToString)
}
/** True if the `key` represents an input task, false if it represents a task or setting. */
def isInputTask(key: AttributeKey[_]): Boolean =
keyType(key)(const(false), const(false), const(true))
/** True if the `key` represents a setting, false if it represents a task or an input task.*/
def isSetting(key: AttributeKey[_]): Boolean =
keyType(key)(const(true), const(false), const(false))
/** True if the `key` represents a setting or task, false if it is for an input task. */
def isTaskOrSetting(key: AttributeKey[_]): Boolean =
keyType(key)(const(true), const(true), const(false))
/** True if the `key` represents a setting or task that may be appended using an assignment method such as `+=`. */
def appendable(key: AttributeKey[_]): Boolean = {
val underlying = keyUnderlyingType(key).runtimeClass


@@ -99,7 +99,7 @@ object Graph {
val withBar = childLines.zipWithIndex flatMap {
case ((line, withBar), pos) if pos < (cs.size - 1) =>
(line +: withBar) map { insertBar(_, 2 * (level + 1)) }
case ((line, withBar), pos) if withBar.lastOption.getOrElse(line).trim != "" =>
case ((line, withBar), _) if withBar.lastOption.getOrElse(line).trim != "" =>
(line +: withBar) ++ Vector(twoSpaces * (level + 1))
case ((line, withBar), _) => line +: withBar
}


@@ -61,7 +61,7 @@ private[sbt] final class TaskTimings(shutdown: Boolean) extends ExecuteProgress[
}
}
def ready(state: Unit, task: Task[_]) = ()
def workStarting(task: Task[_]) = timings.put(task, System.nanoTime)
def workStarting(task: Task[_]) = { timings.put(task, System.nanoTime); () }
def workFinished[T](task: Task[T], result: Either[Task[T], Result[T]]) = {
timings.put(task, System.nanoTime - timings.get(task))
result.left.foreach { t =>
@@ -81,7 +81,7 @@
println(s"Total time: $total $unit")
import collection.JavaConverters._
def sumTimes(in: Seq[(Task[_], Long)]) = in.map(_._2).sum
val timingsByName = timings.asScala.toSeq.groupBy { case (t, time) => mappedName(t) } mapValues (sumTimes)
val timingsByName = timings.asScala.toSeq.groupBy { case (t, _) => mappedName(t) } mapValues (sumTimes)
val times = timingsByName.toSeq
.sortBy(_._2)
.reverse


@@ -277,7 +277,7 @@ private[sbt] case class SbtParser(file: File, lines: Seq[String]) extends Parsed
modifiedContent: String,
imports: Seq[Tree]
): Seq[(String, Int)] = {
val toLineRange = imports map convertImport(modifiedContent)
val toLineRange = imports map convertImport
val groupedByLineNumber = toLineRange.groupBy { case (_, lineNumber) => lineNumber }
val mergedImports = groupedByLineNumber.map {
case (l, seq) => (l, extractLine(modifiedContent, seq))
@@ -286,12 +286,10 @@
}
/**
*
* @param modifiedContent - modifiedContent
* @param t - tree
* @return ((start,end),lineNumber)
* @return ((start, end), lineNumber)
*/
private def convertImport(modifiedContent: String)(t: Tree): ((Int, Int), Int) = {
private def convertImport(t: Tree): ((Int, Int), Int) = {
val lineNumber = t.pos.line - 1
((t.pos.start, t.pos.end), lineNumber)
}


@@ -57,10 +57,7 @@ private[sbt] object SbtRefactorings {
commands.flatMap {
case (_, command) =>
val map = toTreeStringMap(command)
map.flatMap {
case (name, statement) =>
treesToReplacements(split, name, command)
}
map.flatMap { case (name, _) => treesToReplacements(split, name, command) }
}
private def treesToReplacements(split: SbtParser, name: String, command: Seq[String]) =


@@ -9,38 +9,47 @@ package sbt
package internal
package server
import sbt.io.IO
import sbt.internal.inc.MixedAnalyzingCompiler
import sbt.internal.langserver.ErrorCodes
import sbt.util.Logger
import java.io.File
import java.net.URI
import java.nio.file._
import scala.annotation.tailrec
import scala.collection.JavaConverters._
import scala.concurrent.{ ExecutionContext, Future }
import scala.concurrent.duration.Duration.Inf
import scala.util.matching.Regex.MatchIterator
import java.nio.file.{ Files, Paths }
import sbt.StandardMain
import scala.concurrent.duration.Duration
import scala.reflect.NameTransformer
import scala.tools.reflect.{ ToolBox, ToolBoxError }
import scala.util.matching.Regex
import sjsonnew.JsonFormat
import sjsonnew.shaded.scalajson.ast.unsafe.JValue
import sjsonnew.support.scalajson.unsafe.{ CompactPrinter, Converter }
import scalacache._
import sbt.io.IO
import sbt.internal.inc.{ Analysis, MixedAnalyzingCompiler }
import sbt.internal.inc.JavaInterfaceUtil._
import sbt.internal.protocol.JsonRpcResponseError
import sbt.internal.protocol.codec.JsonRPCProtocol
import sbt.internal.langserver
import sbt.internal.langserver.{ ErrorCodes, Location, Position, Range, TextDocumentPositionParams }
import sbt.util.Logger
import sbt.Keys._
private[sbt] object Definition {
import java.net.URI
import Keys._
import sbt.internal.inc.Analysis
import sbt.internal.inc.JavaInterfaceUtil._
val AnalysesKey = "lsp.definition.analyses.key"
import sjsonnew.JsonFormat
def send[A: JsonFormat](source: CommandSource, execId: String)(params: A): Unit = {
for {
channel <- StandardMain.exchange.channels.collectFirst {
case c if c.name == source.channelName => c
}
} yield {
} {
channel.publishEvent(params, Option(execId))
}
}
object textProcessor {
private val isIdentifier = {
import scala.tools.reflect.{ ToolBox, ToolBoxError }
lazy val tb =
scala.reflect.runtime.universe
.runtimeMirror(this.getClass.getClassLoader)
@@ -58,23 +67,14 @@ private[sbt] object Definition {
private def findInBackticks(line: String, point: Int): Option[String] = {
val (even, odd) = line.zipWithIndex
.collect {
case (char, backtickIndex) if char == '`' =>
backtickIndex
}
.collect { case (char, backtickIndex) if char == '`' => backtickIndex }
.zipWithIndex
.partition { bs =>
val (_, index) = bs
index % 2 == 0
}
.partition { case (_, index) => index % 2 == 0 }
even
.collect {
case (backtickIndex, _) => backtickIndex
}
.collect { case (backtickIndex, _) => backtickIndex }
.zip {
odd.collect {
case (backtickIndex, _) => backtickIndex + 1
}
odd.collect { case (backtickIndex, _) => backtickIndex + 1 }
}
.collectFirst {
case (from, to) if from <= point && point < to => line.slice(from, to)
@@ -83,43 +83,43 @@ private[sbt] object Definition {
def identifier(line: String, point: Int): Option[String] = findInBackticks(line, point).orElse {
val whiteSpaceReg = "(\\s|\\.)+".r
val (zero, end) = fold(Seq.empty)(whiteSpaceReg.findAllIn(line))
.collect {
case (white, ind) => (ind, ind + white.length)
}
.fold((0, line.length)) { (z, e) =>
val (from, to) = e
val (left, right) = z
(if (to > left && to <= point) to else left,
if (from < right && from >= point) from else right)
.fold((0, line.length)) {
case ((left, right), (from, to)) =>
val zero = if (to > left && to <= point) to else left
val end = if (from < right && from >= point) from else right
(zero, end)
}
val ranges = for {
from <- zero to point
to <- point to end
} yield (from -> to)
ranges
.sortBy {
case (from, to) => -(to - from)
}
.foldLeft(Seq.empty[String]) { (z, e) =>
val (from, to) = e
val fragment = line.slice(from, to).trim
z match {
case Nil if fragment.nonEmpty && isIdentifier(fragment) => fragment +: z
case h +: _ if h.length < fragment.length && isIdentifier(fragment) =>
Seq(fragment)
case h +: _ if h.length == fragment.length && isIdentifier(fragment) =>
fragment +: z
case z => z
}
.sortBy { case (from, to) => -(to - from) }
.foldLeft(List.empty[String]) {
case (z, (from, to)) =>
val fragment = line.slice(from, to).trim
if (isIdentifier(fragment))
z match {
case Nil if fragment.nonEmpty => fragment :: z
case h :: _ if h.length < fragment.length => fragment :: Nil
case h :: _ if h.length == fragment.length => fragment :: z
case _ => z
} else z
}
.headOption
}
private def asClassObjectIdentifier(sym: String) =
Seq(s".$sym", s".$sym$$", s"$$$sym", s"$$$sym$$")
def potentialClsOrTraitOrObj(sym: String): PartialFunction[String, String] = {
import scala.reflect.NameTransformer
val encodedSym = NameTransformer.encode(sym.toSeq match {
case '`' +: body :+ '`' => body.mkString
case noBackticked => noBackticked.mkString
@@ -135,17 +135,17 @@
}
@tailrec
private def fold(z: Seq[(String, Int)])(it: MatchIterator): Seq[(String, Int)] = {
private def fold(z: Seq[(String, Int)])(it: Regex.MatchIterator): Seq[(String, Int)] = {
if (!it.hasNext) z
else fold(z :+ (it.next() -> it.start))(it)
}
def classTraitObjectInLine(sym: String)(line: String): Seq[(String, Int)] = {
import scala.util.matching.Regex.quote
val potentials =
Seq(s"object\\s+${quote(sym)}".r,
s"trait\\s+${quote(sym)} *\\[?".r,
s"class\\s+${quote(sym)} *\\[?".r)
val potentials = Seq(
s"object\\s+${Regex quote sym}".r,
s"trait\\s+${Regex quote sym} *\\[?".r,
s"class\\s+${Regex quote sym} *\\[?".r,
)
potentials
.flatMap { reg =>
fold(Seq.empty)(reg.findAllIn(line))
@@ -156,10 +156,7 @@ private[sbt] object Definition {
}
}
import java.io.File
def markPosition(file: File, sym: String): Seq[(File, Long, Long, Long)] = {
import java.nio.file._
import scala.collection.JavaConverters._
val findInLine = classTraitObjectInLine(sym)(_)
Files
.lines(file.toPath)
@@ -179,43 +176,49 @@
}
}
import sbt.internal.langserver.TextDocumentPositionParams
import sjsonnew.shaded.scalajson.ast.unsafe.JValue
private def getDefinition(jsonDefinition: JValue): Option[TextDocumentPositionParams] = {
import sbt.internal.langserver.codec.JsonProtocol._
import sjsonnew.support.scalajson.unsafe.Converter
import langserver.codec.JsonProtocol._
Converter.fromJson[TextDocumentPositionParams](jsonDefinition).toOption
}
import java.io.File
object AnalysesAccess {
private[this] val AnalysesKey = "lsp.definition.analyses.key"
private[server] type Analyses = Set[((String, Boolean), Option[Analysis])]
private[server] def getFrom[F[_]](
cache: Cache[Any]
)(implicit mode: Mode[F], flags: Flags): F[Option[Analyses]] =
mode.M.map(cache.get(AnalysesKey))(_ map (_.asInstanceOf[Analyses]))
private[server] def putIn[F[_]](
cache: Cache[Any],
value: Analyses,
ttl: Option[Duration],
)(implicit mode: Mode[F], flags: Flags): F[Any] =
cache.put(AnalysesKey)(value, ttl)
}
private def storeAnalysis(cacheFile: File, useBinary: Boolean): Option[Analysis] =
MixedAnalyzingCompiler
.staticCachedStore(cacheFile, !useBinary)
.get
.toOption
.collect {
case contents =>
contents.getAnalysis
}
.collect {
case a: Analysis => a
}
.map { _.getAnalysis }
.collect { case a: Analysis => a }
import scalacache._
private[sbt] def updateCache[F[_]](cache: Cache[Any])(cacheFile: String, useBinary: Boolean)(
implicit
mode: Mode[F],
flags: Flags): F[Any] = {
mode.M.flatMap(cache.get(AnalysesKey)) {
mode.M.flatMap(AnalysesAccess.getFrom(cache)) {
case None =>
cache.put(AnalysesKey)(Set(cacheFile -> useBinary -> None), Option(Inf))
AnalysesAccess.putIn(cache, Set(cacheFile -> useBinary -> None), Option(Duration.Inf))
case Some(set) =>
cache.put(AnalysesKey)(
set.asInstanceOf[Set[((String, Boolean), Option[Analysis])]].filterNot {
case ((file, _), _) => file == cacheFile
} + (cacheFile -> useBinary -> None),
Option(Inf))
case _ => mode.M.pure(())
val newSet = set
.filterNot { case ((file, _), _) => file == cacheFile }
.+(cacheFile -> useBinary -> None)
AnalysesAccess.putIn(cache, newSet, Option(Duration.Inf))
}
}
@@ -228,17 +231,16 @@ private[sbt] object Definition {
updateCache(StandardMain.cache)(cacheFile, useBinary)
}
private[sbt] def getAnalyses(log: Logger): Future[Seq[Analysis]] = {
private[sbt] def getAnalyses: Future[Seq[Analysis]] = {
import scalacache.modes.scalaFuture._
import scala.concurrent.ExecutionContext.Implicits.global
StandardMain.cache
.get(AnalysesKey)
.collect {
case Some(a) => a.asInstanceOf[Set[((String, Boolean), Option[Analysis])]]
}
AnalysesAccess
.getFrom(StandardMain.cache)
.collect { case Some(a) => a }
.map { caches =>
val (working, uninitialized) = caches.partition { cacheAnalysis =>
cacheAnalysis._2 != None
val (working, uninitialized) = caches.partition {
case (_, Some(_)) => true
case (_, None) => false
}
val addToCache = uninitialized.collect {
case (title @ (file, useBinary), _) if Files.exists(Paths.get(file)) =>
@ -246,7 +248,7 @@ private[sbt] object Definition {
}
val validCaches = working ++ addToCache
if (addToCache.nonEmpty)
StandardMain.cache.put(AnalysesKey)(validCaches, Option(Inf))
AnalysesAccess.putIn(StandardMain.cache, validCaches, Option(Duration.Inf))
validCaches.toSeq.collect {
case (_, Some(analysis)) =>
analysis
@@ -254,19 +256,19 @@
}
}
def lspDefinition(jsonDefinition: JValue,
requestId: String,
commandSource: CommandSource,
log: Logger)(implicit ec: ExecutionContext): Future[Unit] = Future {
def lspDefinition(
jsonDefinition: JValue,
requestId: String,
commandSource: CommandSource,
log: Logger,
)(implicit ec: ExecutionContext): Future[Unit] = Future {
val LspDefinitionLogHead = "lsp-definition"
import sjsonnew.support.scalajson.unsafe.CompactPrinter
log.debug(s"$LspDefinitionLogHead json request: ${CompactPrinter(jsonDefinition)}")
lazy val analyses = getAnalyses(log)
val definition = getDefinition(jsonDefinition)
definition
val jsonDefinitionString = CompactPrinter(jsonDefinition)
log.debug(s"$LspDefinitionLogHead json request: $jsonDefinitionString")
lazy val analyses = getAnalyses
getDefinition(jsonDefinition)
.flatMap { definition =>
val uri = URI.create(definition.textDocument.uri)
import java.nio.file._
Files
.lines(Paths.get(uri))
.skip(definition.position.line)
@@ -274,11 +276,10 @@
.toOption
.flatMap { line =>
log.debug(s"$LspDefinitionLogHead found line: $line")
textProcessor
.identifier(line, definition.position.character.toInt)
textProcessor.identifier(line, definition.position.character.toInt)
}
}
.map { sym =>
} match {
case Some(sym) =>
log.debug(s"symbol $sym")
analyses
.map { analyses =>
@@ -291,40 +292,39 @@
log.debug(s"$LspDefinitionLogHead potentials: $classes")
classes
.flatMap { className =>
analysis.relations.definesClass(className) ++ analysis.relations
.libraryDefinesClass(className)
analysis.relations.definesClass(className) ++
analysis.relations.libraryDefinesClass(className)
}
.flatMap { classFile =>
textProcessor.markPosition(classFile, sym).collect {
case (file, line, from, to) =>
import sbt.internal.langserver.{ Location, Position, Range }
Location(IO.toURI(file).toString,
Range(Position(line, from), Position(line, to)))
Location(
IO.toURI(file).toString,
Range(Position(line, from), Position(line, to)),
)
}
}
}.seq
log.debug(s"$LspDefinitionLogHead locations ${locations}")
import sbt.internal.langserver.codec.JsonProtocol._
log.debug(s"$LspDefinitionLogHead locations $locations")
import langserver.codec.JsonProtocol._
send(commandSource, requestId)(locations.toArray)
}
.recover {
case anyException @ _ =>
log.warn(
s"Problem with processing analyses $anyException for ${CompactPrinter(jsonDefinition)}")
import sbt.internal.protocol.JsonRpcResponseError
import sbt.internal.protocol.codec.JsonRPCProtocol._
send(commandSource, requestId)(
JsonRpcResponseError(ErrorCodes.InternalError,
"Problem with processing analyses.",
None))
case t =>
log.warn(s"Problem with processing analyses $t for $jsonDefinitionString")
val rsp = JsonRpcResponseError(
ErrorCodes.InternalError,
"Problem with processing analyses.",
None,
)
import JsonRPCProtocol._
send(commandSource, requestId)(rsp)
}
}
.orElse {
log.info(s"Symbol not found in definition request ${CompactPrinter(jsonDefinition)}")
import sbt.internal.langserver.Location
import sbt.internal.langserver.codec.JsonProtocol._
()
case None =>
log.info(s"Symbol not found in definition request $jsonDefinitionString")
import langserver.codec.JsonProtocol._
send(commandSource, requestId)(Array.empty[Location])
None
}
}
}
}


@@ -10,6 +10,7 @@ package internal
package server
import sjsonnew.JsonFormat
import sjsonnew.shaded.scalajson.ast.unsafe.JValue
import sjsonnew.support.scalajson.unsafe.Converter
import sbt.protocol.Serialization
import sbt.protocol.{ SettingQuery => Q }
@@ -19,12 +20,71 @@ import sbt.internal.langserver._
import sbt.internal.util.ObjectEvent
import sbt.util.Logger
private[sbt] case class LangServerError(code: Long, message: String) extends Throwable(message)
private[sbt] final case class LangServerError(code: Long, message: String)
extends Throwable(message)
/**
* Implements Language Server Protocol <https://github.com/Microsoft/language-server-protocol>.
*/
private[sbt] trait LanguageServerProtocol extends CommandChannel {
private[sbt] object LanguageServerProtocol {
lazy val internalJsonProtocol = new InitializeOptionFormats with sjsonnew.BasicJsonProtocol {}
lazy val serverCapabilities: ServerCapabilities = {
ServerCapabilities(textDocumentSync =
TextDocumentSyncOptions(true, 0, false, false, SaveOptions(false)),
hoverProvider = false,
definitionProvider = true)
}
lazy val handler: ServerHandler = ServerHandler({
case callback: ServerCallback =>
import callback._
ServerIntent(
{
import sbt.internal.langserver.codec.JsonProtocol._
import internalJsonProtocol._
def json(r: JsonRpcRequestMessage) =
r.params.getOrElse(
throw LangServerError(ErrorCodes.InvalidParams,
s"param is expected on '${r.method}' method."))
{
case r: JsonRpcRequestMessage if r.method == "initialize" =>
if (authOptions(ServerAuthentication.Token)) {
val param = Converter.fromJson[InitializeParams](json(r)).get
val optionJson = param.initializationOptions.getOrElse(
throw LangServerError(ErrorCodes.InvalidParams,
"initializationOptions is expected on 'initialize' param."))
val opt = Converter.fromJson[InitializeOption](optionJson).get
val token = opt.token.getOrElse(sys.error("'token' is missing."))
if (authenticate(token)) ()
else throw LangServerError(ErrorCodes.InvalidRequest, "invalid token")
} else ()
setInitialized(true)
appendExec(Exec(s"collectAnalyses", Some(r.id), Some(CommandSource(name))))
jsonRpcRespond(InitializeResult(serverCapabilities), Option(r.id))
case r: JsonRpcRequestMessage if r.method == "textDocument/definition" =>
import scala.concurrent.ExecutionContext.Implicits.global
Definition.lspDefinition(json(r), r.id, CommandSource(name), log)
()
case r: JsonRpcRequestMessage if r.method == "sbt/exec" =>
val param = Converter.fromJson[SbtExecParams](json(r)).get
appendExec(Exec(param.commandLine, Some(r.id), Some(CommandSource(name))))
()
case r: JsonRpcRequestMessage if r.method == "sbt/setting" =>
import sbt.protocol.codec.JsonProtocol._
val param = Converter.fromJson[Q](json(r)).get
onSettingQuery(Option(r.id), param)
}
}, {
case n: JsonRpcNotificationMessage if n.method == "textDocument/didSave" =>
appendExec(Exec(";Test/compile; collectAnalyses", None, Some(CommandSource(name))))
()
}
)
})
}
/** Implements Language Server Protocol <https://github.com/Microsoft/language-server-protocol>. */
private[sbt] trait LanguageServerProtocol extends CommandChannel { self =>
lazy val internalJsonProtocol = new InitializeOptionFormats with sjsonnew.BasicJsonProtocol {}
@@ -34,51 +94,24 @@ private[sbt] trait LanguageServerProtocol extends CommandChannel {
protected def log: Logger
protected def onSettingQuery(execId: Option[String], req: Q): Unit
protected def onNotification(notification: JsonRpcNotificationMessage): Unit = {
log.debug(s"onNotification: $notification")
notification.method match {
case "textDocument/didSave" =>
append(Exec(";Test/compile; collectAnalyses", None, Some(CommandSource(name))))
case u => log.debug(s"Unhandled notification received: $u")
}
}
protected lazy val callbackImpl: ServerCallback = new ServerCallback {
def jsonRpcRespond[A: JsonFormat](event: A, execId: Option[String]): Unit =
self.jsonRpcRespond(event, execId)
protected def onRequestMessage(request: JsonRpcRequestMessage): Unit = {
import sbt.internal.langserver.codec.JsonProtocol._
import internalJsonProtocol._
def json =
request.params.getOrElse(
throw LangServerError(ErrorCodes.InvalidParams,
s"param is expected on '${request.method}' method."))
log.debug(s"onRequestMessage: $request")
request.method match {
case "initialize" =>
if (authOptions(ServerAuthentication.Token)) {
val param = Converter.fromJson[InitializeParams](json).get
val optionJson = param.initializationOptions.getOrElse(
throw LangServerError(ErrorCodes.InvalidParams,
"initializationOptions is expected on 'initialize' param."))
val opt = Converter.fromJson[InitializeOption](optionJson).get
val token = opt.token.getOrElse(sys.error("'token' is missing."))
if (authenticate(token)) ()
else throw LangServerError(ErrorCodes.InvalidRequest, "invalid token")
} else ()
setInitialized(true)
append(Exec(s"collectAnalyses", Some(request.id), Some(CommandSource(name))))
langRespond(InitializeResult(serverCapabilities), Option(request.id))
case "textDocument/definition" =>
import scala.concurrent.ExecutionContext.Implicits.global
Definition.lspDefinition(json, request.id, CommandSource(name), log)
case "sbt/exec" =>
val param = Converter.fromJson[SbtExecParams](json).get
append(Exec(param.commandLine, Some(request.id), Some(CommandSource(name))))
case "sbt/setting" => {
import sbt.protocol.codec.JsonProtocol._
val param = Converter.fromJson[Q](json).get
onSettingQuery(Option(request.id), param)
}
case unhandledRequest => log.debug(s"Unhandled request received: $unhandledRequest")
}
def jsonRpcRespondError(execId: Option[String], code: Long, message: String): Unit =
self.jsonRpcRespondError(execId, code, message)
def jsonRpcNotify[A: JsonFormat](method: String, params: A): Unit =
self.jsonRpcNotify(method, params)
def appendExec(exec: Exec): Boolean = self.append(exec)
def log: Logger = self.log
def name: String = self.name
private[sbt] def authOptions: Set[ServerAuthentication] = self.authOptions
private[sbt] def authenticate(token: String): Boolean = self.authenticate(token)
private[sbt] def setInitialized(value: Boolean): Unit = self.setInitialized(value)
private[sbt] def onSettingQuery(execId: Option[String], req: Q): Unit =
self.onSettingQuery(execId, req)
}
/**
@@ -94,7 +127,7 @@ private[sbt] trait LanguageServerProtocol extends CommandChannel {
// LanguageServerReporter sends PublishDiagnosticsParams
case "sbt.internal.langserver.PublishDiagnosticsParams" =>
// val p = event.message.asInstanceOf[PublishDiagnosticsParams]
// langNotify("textDocument/publishDiagnostics", p)
// jsonRpcNotify("textDocument/publishDiagnostics", p)
case "xsbti.Problem" =>
() // ignore
case _ =>
@@ -103,62 +136,53 @@ private[sbt] trait LanguageServerProtocol extends CommandChannel {
}
}
/**
* Respond back to Language Server's client.
*/
private[sbt] def langRespond[A: JsonFormat](event: A, execId: Option[String]): Unit = {
/** Respond back to Language Server's client. */
private[sbt] def jsonRpcRespond[A: JsonFormat](event: A, execId: Option[String]): Unit = {
val m =
JsonRpcResponseMessage("2.0", execId, Option(Converter.toJson[A](event).get), None)
val bytes = Serialization.serializeResponseMessage(m)
publishBytes(bytes)
}
/**
* Respond back to Language Server's client.
*/
private[sbt] def langError(execId: Option[String], code: Long, message: String): Unit = {
val e = JsonRpcResponseError(code, message, None)
/** Respond back to Language Server's client. */
private[sbt] def jsonRpcRespondError(execId: Option[String], code: Long, message: String): Unit =
jsonRpcRespondErrorImpl(execId, code, message, None)
/** Respond back to Language Server's client. */
private[sbt] def jsonRpcRespondError[A: JsonFormat](
execId: Option[String],
code: Long,
message: String,
data: A,
): Unit =
jsonRpcRespondErrorImpl(execId, code, message, Option(Converter.toJson[A](data).get))
private[this] def jsonRpcRespondErrorImpl(
execId: Option[String],
code: Long,
message: String,
data: Option[JValue],
): Unit = {
val e = JsonRpcResponseError(code, message, data)
val m = JsonRpcResponseMessage("2.0", execId, None, Option(e))
val bytes = Serialization.serializeResponseMessage(m)
publishBytes(bytes)
}
/**
* Respond back to Language Server's client.
*/
private[sbt] def langError[A: JsonFormat](execId: Option[String],
code: Long,
message: String,
data: A): Unit = {
val e = JsonRpcResponseError(code, message, Option(Converter.toJson[A](data).get))
val m = JsonRpcResponseMessage("2.0", execId, None, Option(e))
val bytes = Serialization.serializeResponseMessage(m)
publishBytes(bytes)
}
/**
* Notify to Language Server's client.
*/
private[sbt] def langNotify[A: JsonFormat](method: String, params: A): Unit = {
/** Notify to Language Server's client. */
private[sbt] def jsonRpcNotify[A: JsonFormat](method: String, params: A): Unit = {
val m =
JsonRpcNotificationMessage("2.0", method, Option(Converter.toJson[A](params).get))
log.debug(s"langNotify: $m")
log.debug(s"jsonRpcNotify: $m")
val bytes = Serialization.serializeNotificationMessage(m)
publishBytes(bytes)
}
def logMessage(level: String, message: String): Unit = {
import sbt.internal.langserver.codec.JsonProtocol._
langNotify(
jsonRpcNotify(
"window/logMessage",
LogMessageParams(MessageType.fromLevelString(level), message)
)
}
private[sbt] lazy val serverCapabilities: ServerCapabilities = {
ServerCapabilities(textDocumentSync =
TextDocumentSyncOptions(true, 0, false, false, SaveOptions(false)),
hoverProvider = false,
definitionProvider = true)
}
}
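
The `jsonRpcRespond`, `jsonRpcRespondError`, and `jsonRpcNotify` methods in this file all build JSON-RPC 2.0 envelopes that differ only in which fields are populated. A minimal sketch of that envelope shape, using simplified stand-in case classes rather than sbt's actual `JsonRpcResponseMessage` (the JSON payload is kept as a plain `String` here):

```scala
// Stand-in envelope types (not sbt's actual classes) mirroring the
// JSON-RPC 2.0 response shape built by jsonRpcRespond/jsonRpcRespondError.
final case class ResponseError(code: Long, message: String)
final case class ResponseMessage(
    jsonrpc: String,
    id: Option[String],
    result: Option[String],
    error: Option[ResponseError],
)

// Success: result populated, error empty.
def respond(result: String, execId: Option[String]): ResponseMessage =
  ResponseMessage("2.0", execId, Some(result), None)

// Failure: error populated, result empty.
def respondError(execId: Option[String], code: Long, message: String): ResponseMessage =
  ResponseMessage("2.0", execId, None, Some(ResponseError(code, message)))

val ok  = respond("\"pong\"", Some("17"))
val err = respondError(Some("17"), -32602, "param is expected")
assert(ok.error.isEmpty && err.result.isEmpty)
```

Both paths share the same `"2.0"` envelope and request id, which is why the refactoring above can funnel the error variants through a single `jsonRpcRespondErrorImpl`.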


@@ -26,6 +26,7 @@ final class NetworkChannel(val name: String,
structure: BuildStructure,
auth: Set[ServerAuthentication],
instance: ServerInstance,
handlers: Seq[ServerHandler],
val log: Logger)
extends CommandChannel
with LanguageServerProtocol {
@@ -45,18 +46,12 @@ final class NetworkChannel(val name: String,
private val VsCodeOld = "application/vscode-jsonrpc; charset=utf8"
private lazy val jsonFormat = new sjsonnew.BasicJsonProtocol with JValueFormats {}
def setContentType(ct: String): Unit = synchronized {
_contentType = ct
}
def setContentType(ct: String): Unit = synchronized { _contentType = ct }
def contentType: String = _contentType
protected def authenticate(token: String): Boolean = {
instance.authenticate(token)
}
protected def authenticate(token: String): Boolean = instance.authenticate(token)
protected def setInitialized(value: Boolean): Unit = {
initialized = value
}
protected def setInitialized(value: Boolean): Unit = initialized = value
protected def authOptions: Set[ServerAuthentication] = auth
@@ -73,10 +68,8 @@ final class NetworkChannel(val name: String,
var bytesRead = 0
def resetChannelState(): Unit = {
contentLength = 0
// contentType = ""
state = SingleLine
}
def tillEndOfLine: Option[Vector[Byte]] = {
val delimPos = buffer.indexOf(delimiter)
if (delimPos > 0) {
@@ -165,6 +158,21 @@ final class NetworkChannel(val name: String,
}
}
private lazy val intents = {
val cb = callbackImpl
handlers.toVector map { h =>
h.handler(cb)
}
}
lazy val onRequestMessage: PartialFunction[JsonRpcRequestMessage, Unit] =
intents.foldLeft(PartialFunction.empty[JsonRpcRequestMessage, Unit]) {
case (f, i) => f orElse i.onRequest
}
lazy val onNotification: PartialFunction[JsonRpcNotificationMessage, Unit] =
intents.foldLeft(PartialFunction.empty[JsonRpcNotificationMessage, Unit]) {
case (f, i) => f orElse i.onNotification
}
def handleBody(chunk: Vector[Byte]): Unit = {
if (isLanguageServerProtocol) {
Serialization.deserializeJsonMessage(chunk) match {
@@ -174,7 +182,7 @@ final class NetworkChannel(val name: String,
} catch {
case LangServerError(code, message) =>
log.debug(s"sending error: $code: $message")
langError(Option(req.id), code, message)
jsonRpcRespondError(Option(req.id), code, message)
}
case Right(ntf: JsonRpcNotificationMessage) =>
try {
@@ -182,13 +190,13 @@ final class NetworkChannel(val name: String,
} catch {
case LangServerError(code, message) =>
log.debug(s"sending error: $code: $message")
langError(None, code, message) // new id?
jsonRpcRespondError(None, code, message) // new id?
}
case Right(msg) =>
log.debug(s"Unhandled message: $msg")
case Left(errorDesc) =>
val msg = s"Got invalid chunk from client (${new String(chunk.toArray, "UTF-8")}): " + errorDesc
langError(None, ErrorCodes.ParseError, msg)
jsonRpcRespondError(None, ErrorCodes.ParseError, msg)
}
} else {
contentType match {
@@ -230,7 +238,7 @@ final class NetworkChannel(val name: String,
private[sbt] def notifyEvent[A: JsonFormat](method: String, params: A): Unit = {
if (isLanguageServerProtocol) {
langNotify(method, params)
jsonRpcNotify(method, params)
} else {
()
}
@@ -242,11 +250,11 @@ final class NetworkChannel(val name: String,
case entry: StringEvent => logMessage(entry.level, entry.message)
case entry: ExecStatusEvent =>
entry.exitCode match {
case None => langRespond(event, entry.execId)
case Some(0) => langRespond(event, entry.execId)
case Some(exitCode) => langError(entry.execId, exitCode, "")
case None => jsonRpcRespond(event, entry.execId)
case Some(0) => jsonRpcRespond(event, entry.execId)
case Some(exitCode) => jsonRpcRespondError(entry.execId, exitCode, "")
}
case _ => langRespond(event, execId)
case _ => jsonRpcRespond(event, execId)
}
} else {
contentType match {
@@ -258,8 +266,6 @@ final class NetworkChannel(val name: String,
}
}
def publishEvent[A: JsonFormat](event: A): Unit = publishEvent(event, None)
def publishEventMessage(event: EventMessage): Unit = {
if (isLanguageServerProtocol) {
event match {
@@ -337,6 +343,7 @@ final class NetworkChannel(val name: String,
if (initialized) {
append(
Exec(cmd.commandLine, cmd.execId orElse Some(Exec.newExecId), Some(CommandSource(name))))
()
} else {
log.warn(s"ignoring command $cmd before initialization")
}
@@ -346,8 +353,8 @@ final class NetworkChannel(val name: String,
if (initialized) {
import sbt.protocol.codec.JsonProtocol._
SettingQuery.handleSettingQueryEither(req, structure) match {
case Right(x) => langRespond(x, execId)
case Left(s) => langError(execId, ErrorCodes.InvalidParams, s)
case Right(x) => jsonRpcRespond(x, execId)
case Left(s) => jsonRpcRespondError(execId, ErrorCodes.InvalidParams, s)
}
} else {
log.warn(s"ignoring query $req before initialization")
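
The new `intents` wiring in `NetworkChannel` chains each registered `ServerHandler`'s partial functions with `orElse`, so the first handler defined at a given method wins. A toy sketch of that dispatch pattern, with strings standing in for the real `JsonRpcRequestMessage` type:

```scala
// Each "handler" contributes a PartialFunction; folding with orElse
// builds one dispatcher where earlier handlers take precedence.
val h1: PartialFunction[String, String] = { case "initialize" => "init" }
val h2: PartialFunction[String, String] = { case "sbt/exec"   => "exec" }

val onRequest: PartialFunction[String, String] =
  List(h1, h2).foldLeft(PartialFunction.empty[String, String])(_ orElse _)

assert(onRequest("initialize") == "init")
assert(onRequest("sbt/exec") == "exec")
// Methods no handler claims remain undefined, letting the caller
// log or reject them (as handleBody does for unhandled messages).
assert(!onRequest.isDefinedAt("unknown/method"))
```

This is why handlers can be passed in as a `Seq[ServerHandler]`: extending the protocol is a matter of appending another partial function rather than editing a central `match`.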


@@ -21,7 +21,7 @@ import sjsonnew.support.scalajson.unsafe._
object SettingQuery {
import sbt.internal.util.{ AttributeKey, Settings }
import sbt.internal.util.complete.{ DefaultParsers, Parser }, DefaultParsers._
import sbt.Def.{ showBuildRelativeKey, ScopedKey }
import sbt.Def.{ showBuildRelativeKey2, ScopedKey }
// Similar to Act.ParsedAxis / Act.projectRef / Act.resolveProject except you can't omit the project reference
@@ -67,7 +67,7 @@ object SettingQuery {
data: Settings[Scope]
): Parser[ParsedKey] =
scopedKeyFull(index, currentBuild, defaultConfigs, keyMap) flatMap { choices =>
Act.select(choices, data)(showBuildRelativeKey(currentBuild, index.buildURIs.size > 1))
Act.select(choices, data)(showBuildRelativeKey2(currentBuild))
}
def scopedKey(


@@ -0,0 +1,19 @@
/*
* sbt
* Copyright 2011 - 2017, Lightbend, Inc.
* Copyright 2008 - 2010, Mark Harrah
* Licensed under BSD-3-Clause license (see LICENSE)
*/
package sbt
package plugins
import Keys._
object SbtPlugin extends AutoPlugin {
override def requires = ScriptedPlugin
override lazy val projectSettings = Seq(
sbtPlugin := true
)
}


@@ -47,7 +47,7 @@ object Delegates extends Properties("delegates") {
}
}
property("Initial scope present with all combinations of Global axes") = allAxes(
globalCombinations)
(s, ds, _) => globalCombinations(s, ds))
property("initial scope first") = forAll { (keys: Keys) =>
allDelegates(keys) { (scope, ds) =>
@@ -66,6 +66,7 @@ object Delegates extends Properties("delegates") {
all(f(s, ds, _.project), f(s, ds, _.config), f(s, ds, _.task), f(s, ds, _.extra))
}
}
def allDelegates(keys: Keys)(f: (Scope, Seq[Scope]) => Prop): Prop =
all(keys.scopes map { scope =>
val delegates = keys.env.delegates(scope)
@@ -73,16 +74,20 @@ object Delegates extends Properties("delegates") {
("Delegates:\n\t" + delegates.map(scope => Scope.display(scope, "_")).mkString("\n\t")) |:
f(scope, delegates)
}: _*)
def alwaysZero(s: Scope, ds: Seq[Scope], axis: Scope => ScopeAxis[_]): Prop =
(axis(s) != Zero) ||
all(ds map { d =>
(axis(d) == Zero): Prop
}: _*)
def globalCombinations(s: Scope, ds: Seq[Scope], axis: Scope => ScopeAxis[_]): Prop = {
val mods = List[Scope => Scope](_.copy(project = Zero),
_.copy(config = Zero),
_.copy(task = Zero),
_.copy(extra = Zero))
def globalCombinations(s: Scope, ds: Seq[Scope]): Prop = {
val mods = List[Scope => Scope](
_.copy(project = Zero),
_.copy(config = Zero),
_.copy(task = Zero),
_.copy(extra = Zero),
)
val modAndIdent = mods.map(_ :: idFun[Scope] :: Nil)
def loop(cur: Scope, acc: List[Scope], rem: List[Seq[Scope => Scope]]): Seq[Scope] =


@@ -8,166 +8,114 @@
package sbt
import Def.{ displayFull, displayMasked, ScopedKey }
import sbt.internal.{ TestBuild, Resolve }
import TestBuild._
import sbt.internal.util.complete._
import sbt.internal.{ TestBuild, Resolve }, TestBuild._
import sbt.internal.util.complete.Parser
import org.scalacheck._
import Gen._
import Prop._
import Arbitrary.arbBool
import org.scalacheck._, Arbitrary.arbitrary, Gen._, Prop._
/**
* Tests that the scoped key parser in Act can correctly parse a ScopedKey converted by Def.show*Key.
* This includes properly resolving omitted components.
*/
object ParseKey extends Properties("Key parser test") {
final val MaxKeys = 5
final val MaxScopedKeys = 100
implicit val gstructure = genStructure
property("An explicitly specified axis is always parsed to that explicit value") =
forAllNoShrink(structureDefinedKey) { (skm: StructureKeyMask) =>
import skm.{ structure, key, mask => mask0 }
val hasZeroConfig = key.scope.config == Zero
val mask = if (hasZeroConfig) mask0.copy(project = true) else mask0
val expected = resolve(structure, key, mask)
// Note that this explicitly displays the configuration axis set to Zero.
// This is to disambiguate `proj/Zero/name`, which could render potentially
// as `Zero/name`, but could be interpretted as `Zero/Zero/name`.
val s = displayMasked(key, mask, hasZeroConfig)
("Key: " + displayPedantic(key)) |:
parseExpected(structure, s, expected, mask)
}
property("An unspecified project axis resolves to the current project") =
forAllNoShrink(structureDefinedKey) { (skm: StructureKeyMask) =>
import skm.{ structure, key }
val mask = skm.mask.copy(project = false)
val string = displayMasked(key, mask)
// skip when config axis is set to Zero
val hasZeroConfig = key.scope.config == Zero
("Key: " + displayPedantic(key)) |:
("Mask: " + mask) |:
("Current: " + structure.current) |:
parse(structure, string) {
case Left(err) => false
case Right(sk) if hasZeroConfig => true
case Right(sk) => sk.scope.project == Select(structure.current)
}
}
property("An unspecified task axis resolves to Zero") = forAllNoShrink(structureDefinedKey) {
property("An explicitly specified axis is always parsed to that explicit value") = forAll {
(skm: StructureKeyMask) =>
import skm.{ structure, key }
val mask = skm.mask.copy(task = false)
val string = displayMasked(key, mask)
val hasZeroConfig = key.scope.config == Zero
val mask = if (hasZeroConfig) skm.mask.copy(project = true) else skm.mask
// Note that this explicitly displays the configuration axis set to Zero.
// This is to disambiguate `proj/Zero/name`, which could render potentially
// as `Zero/name`, but could be interpreted as `Zero/Zero/name`.
val expected = ScopedKey(
Resolve(structure.extra, Select(structure.current), key.key, mask)(key.scope),
key.key
)
parseCheck(structure, key, mask, hasZeroConfig)(
sk =>
Project.equal(sk, expected, mask)
:| s"$sk.key == $expected.key: ${sk.key == expected.key}"
:| s"${sk.scope} == ${expected.scope}: ${Scope.equal(sk.scope, expected.scope, mask)}"
) :| s"Expected: ${displayFull(expected)}"
}
("Key: " + displayPedantic(key)) |:
("Mask: " + mask) |:
parse(structure, string) {
case Left(err) => false
case Right(sk) => sk.scope.task == Zero
}
property("An unspecified project axis resolves to the current project") = forAll {
(skm: StructureKeyMask) =>
import skm.{ structure, key }
val mask = skm.mask.copy(project = false)
// skip when config axis is set to Zero
val hasZeroConfig = key.scope.config == Zero
parseCheck(structure, key, mask)(
sk =>
(hasZeroConfig || sk.scope.project == Select(structure.current))
:| s"Current: ${structure.current}"
)
}
property("An unspecified task axis resolves to Zero") = forAll { (skm: StructureKeyMask) =>
import skm.{ structure, key }
val mask = skm.mask.copy(task = false)
parseCheck(structure, key, mask)(_.scope.task == Zero)
}
property(
"An unspecified configuration axis resolves to the first configuration directly defining the key or else Zero") =
forAllNoShrink(structureDefinedKey) { (skm: StructureKeyMask) =>
forAll { (skm: StructureKeyMask) =>
import skm.{ structure, key }
val mask = ScopeMask(config = false)
val string = displayMasked(key, mask)
val resolvedConfig = Resolve.resolveConfig(structure.extra, key.key, mask)(key.scope).config
("Key: " + displayPedantic(key)) |:
("Mask: " + mask) |:
("Expected configuration: " + resolvedConfig.map(_.name)) |:
parse(structure, string) {
case Right(sk) => (sk.scope.config == resolvedConfig) || (sk.scope == Scope.GlobalScope)
case Left(err) => false
}
parseCheck(structure, key, mask)(
sk => (sk.scope.config == resolvedConfig) || (sk.scope == Scope.GlobalScope)
) :| s"Expected configuration: ${resolvedConfig map (_.name)}"
}
def displayPedantic(scoped: ScopedKey[_]): String =
Scope.displayPedantic(scoped.scope, scoped.key.label)
lazy val structureDefinedKey: Gen[StructureKeyMask] = structureKeyMask { s =>
for (scope <- TestBuild.scope(s.env); key <- oneOf(s.allAttributeKeys.toSeq))
yield ScopedKey(scope, key)
}
def structureKeyMask(genKey: Structure => Gen[ScopedKey[_]])(
implicit maskGen: Gen[ScopeMask],
structureGen: Gen[Structure]): Gen[StructureKeyMask] =
for (mask <- maskGen; structure <- structureGen; key <- genKey(structure))
yield new StructureKeyMask(structure, key, mask)
final class StructureKeyMask(val structure: Structure, val key: ScopedKey[_], val mask: ScopeMask)
def resolve(structure: Structure, key: ScopedKey[_], mask: ScopeMask): ScopedKey[_] =
ScopedKey(Resolve(structure.extra, Select(structure.current), key.key, mask)(key.scope),
key.key)
def parseExpected(structure: Structure,
s: String,
expected: ScopedKey[_],
mask: ScopeMask): Prop =
("Expected: " + displayFull(expected)) |:
("Mask: " + mask) |:
parse(structure, s) {
case Left(err) => false
case Right(sk) =>
(s"${sk}.key == ${expected}.key: ${sk.key == expected.key}") |:
(s"${sk.scope} == ${expected.scope}: ${Scope.equal(sk.scope, expected.scope, mask)}") |:
Project.equal(sk, expected, mask)
}
def parse(structure: Structure, s: String)(f: Either[String, ScopedKey[_]] => Prop): Prop = {
val parser = makeParser(structure)
val parsed = DefaultParsers.result(parser, s).left.map(_().toString)
val showParsed = parsed.right.map(displayFull)
("Key string: '" + s + "'") |:
("Parsed: " + showParsed) |:
("Structure: " + structure) |:
f(parsed)
}
// Here we're shadowing the in-scope implicit called `mkEnv` for this method
// so that it will use the passed-in `Gen` rather than the one imported
// from TestBuild.
def genStructure(implicit mkEnv: Gen[Env]): Gen[Structure] =
structureGenF { (scopes: Seq[Scope], env: Env, current: ProjectRef) =>
val settings =
for {
scope <- scopes
t <- env.tasks
} yield Def.setting(ScopedKey(scope, t.key), Def.value(""))
TestBuild.structure(env, settings, current)
}
// Here we're shadowing the in-scope implicit called `mkEnv` for this method
// so that it will use the passed-in `Gen` rather than the one imported
// from TestBuild.
def structureGenF(f: (Seq[Scope], Env, ProjectRef) => Structure)(
implicit mkEnv: Gen[Env]): Gen[Structure] =
structureGen((s, e, p) => Gen.const(f(s, e, p)))
// Here we're shadowing the in-scope implicit called `mkEnv` for this method
// so that it will use the passed-in `Gen` rather than the one imported
// from TestBuild.
def structureGen(f: (Seq[Scope], Env, ProjectRef) => Gen[Structure])(
implicit mkEnv: Gen[Env]): Gen[Structure] =
implicit val arbStructure: Arbitrary[Structure] = Arbitrary {
for {
env <- mkEnv
loadFactor <- choose(0.0, 1.0)
scopes <- pickN(loadFactor, env.allFullScopes)
current <- oneOf(env.allProjects.unzip._1)
structure <- f(scopes, env, current)
structure <- {
val settings = for (scope <- scopes; t <- env.tasks)
yield Def.setting(ScopedKey(scope, t.key), Def.value(""))
TestBuild.structure(env, settings, current)
}
} yield structure
}
// pickN is a function that randomly picks load % items from the from sequence.
final class StructureKeyMask(val structure: Structure, val key: ScopedKey[_], val mask: ScopeMask)
implicit val arbStructureKeyMask: Arbitrary[StructureKeyMask] = Arbitrary {
for {
mask <- maskGen
structure <- arbitrary[Structure]
key <- for {
scope <- TestBuild.scope(structure.env)
key <- oneOf(structure.allAttributeKeys.toSeq)
} yield ScopedKey(scope, key)
} yield new StructureKeyMask(structure, key, mask)
}
def parseCheck(
structure: Structure,
key: ScopedKey[_],
mask: ScopeMask,
showZeroConfig: Boolean = false,
)(f: ScopedKey[_] => Prop): Prop = {
val s = displayMasked(key, mask, showZeroConfig)
val parser = makeParser(structure)
val parsed = Parser.result(parser, s).left.map(_().toString)
(
parsed.fold(_ => falsified, f)
:| s"Key: ${Scope.displayPedantic(key.scope, key.key.label)}"
:| s"Mask: $mask"
:| s"Key string: '$s'"
:| s"Parsed: ${parsed.right.map(displayFull)}"
:| s"Structure: $structure"
)
}
// pickN is a function that randomly picks load % items from the "from" sequence.
// The rest of the tests expect at least one item, so I changed it to return 1 in case of 0.
def pickN[T](load: Double, from: Seq[T]): Gen[Seq[T]] =
pick(Math.max((load * from.size).toInt, 1), from)
pick((load * from.size).toInt max 1, from)
}
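
The `pickN` clamp at the end of `ParseKey` can be looked at in isolation: the sample size is `load * size` truncated toward zero, but never less than one, so downstream tests always get at least one scope. A quick sketch of just that sizing rule:

```scala
// The sizing rule behind pickN above: truncate load * size,
// then clamp to a minimum of 1 so a low load factor (or small
// input) never yields an empty sample.
def pickCount(load: Double, size: Int): Int =
  (load * size).toInt max 1

assert(pickCount(0.0, 10) == 1)  // clamped up from 0
assert(pickCount(0.05, 10) == 1) // truncation would give 0; clamped
assert(pickCount(0.5, 10) == 5)
assert(pickCount(1.0, 10) == 10)
```

With the count fixed, ScalaCheck's `Gen.pick(n, from)` does the actual random selection.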


@@ -39,18 +39,18 @@ object PluginsTest extends Specification {
}
"throw an AutoPluginException on conflicting requirements" in {
deducePlugin(S, log) must throwAn[AutoPluginException](
message = """Contradiction in enabled plugins:
- requested: sbt.AI\$S
- enabled: sbt.AI\$S, sbt.AI\$Q, sbt.AI\$R, sbt.AI\$B, sbt.AI\$A
- conflict: sbt.AI\$R is enabled by sbt.AI\$Q; excluded by sbt.AI\$S""")
message = s"""Contradiction in enabled plugins:
- requested: sbt.AI\\$$S
- enabled: sbt.AI\\$$S, sbt.AI\\$$Q, sbt.AI\\$$R, sbt.AI\\$$B, sbt.AI\\$$A
- conflict: sbt.AI\\$$R is enabled by sbt.AI\\$$Q; excluded by sbt.AI\\$$S""")
}
"generates a detailed report on conflicting requirements" in {
deducePlugin(T && U, log) must throwAn[AutoPluginException](message =
"""Contradiction in enabled plugins:
- requested: sbt.AI\$T && sbt.AI\$U
- enabled: sbt.AI\$U, sbt.AI\$T, sbt.AI\$A, sbt.AI\$Q, sbt.AI\$R, sbt.AI\$B
- conflict: sbt.AI\$Q is enabled by sbt.AI\$A && sbt.AI\$B; required by sbt.AI\$T, sbt.AI\$R; excluded by sbt.AI\$U
- conflict: sbt.AI\$R is enabled by sbt.AI\$Q; excluded by sbt.AI\$T""")
deducePlugin(T && U, log) must throwAn[AutoPluginException](
message = s"""Contradiction in enabled plugins:
- requested: sbt.AI\\$$T && sbt.AI\\$$U
- enabled: sbt.AI\\$$U, sbt.AI\\$$T, sbt.AI\\$$A, sbt.AI\\$$Q, sbt.AI\\$$R, sbt.AI\\$$B
- conflict: sbt.AI\\$$Q is enabled by sbt.AI\\$$A && sbt.AI\\$$B; required by sbt.AI\\$$T, sbt.AI\\$$R; excluded by sbt.AI\\$$U
- conflict: sbt.AI\\$$R is enabled by sbt.AI\\$$Q; excluded by sbt.AI\\$$T""")
}
}
}


@@ -142,7 +142,6 @@ abstract class TestBuild {
inheritProject,
inheritConfig,
inheritTask,
(ref, mp) => Nil
)
lazy val allFullScopes: Seq[Scope] =
for {
@@ -213,7 +212,7 @@ abstract class TestBuild {
}
def structure(env: Env, settings: Seq[Setting[_]], current: ProjectRef): Structure = {
implicit val display = Def.showRelativeKey(current, env.allProjects.size > 1)
implicit val display = Def.showRelativeKey2(current)
if (settings.isEmpty) {
try {
sys.error("settings is empty")


@@ -77,8 +77,7 @@ class ErrorSpec extends AbstractSpec {
case exception: MessageOnlyException =>
val error = exception.getMessage
"""(\d+)""".r.findFirstIn(error) match {
case Some(x) =>
true
case Some(_) => true
case None =>
println(s"Number not found in $error")
false


@@ -9,8 +9,6 @@ package sbt
package internal
package server
import sbt.internal.inc.Analysis
class DefinitionTest extends org.specs2.mutable.Specification {
import Definition.textProcessor
@@ -126,9 +124,12 @@ class DefinitionTest extends org.specs2.mutable.Specification {
textProcessor.classTraitObjectInLine("B")("trait A ") must be empty
}
}
"definition" should {
import scalacache.caffeine._
import scalacache.modes.sync._
"cache data in cache" in {
val cache = CaffeineCache[Any]
val cacheFile = "Test.scala"
@@ -136,12 +137,11 @@ class DefinitionTest extends org.specs2.mutable.Specification {
Definition.updateCache(cache)(cacheFile, useBinary)
val actual = cache.get(Definition.AnalysesKey)
val actual = Definition.AnalysesAccess.getFrom(cache)
actual.collect {
case s => s.asInstanceOf[Set[((String, Boolean), Option[Analysis])]]
}.get should contain("Test.scala" -> true -> None)
actual.get should contain("Test.scala" -> true -> None)
}
"replace cache data in cache" in {
val cache = CaffeineCache[Any]
val cacheFile = "Test.scala"
@@ -151,12 +151,11 @@ class DefinitionTest extends org.specs2.mutable.Specification {
Definition.updateCache(cache)(cacheFile, falseUseBinary)
Definition.updateCache(cache)(cacheFile, useBinary)
val actual = cache.get(Definition.AnalysesKey)
val actual = Definition.AnalysesAccess.getFrom(cache)
actual.collect {
case s => s.asInstanceOf[Set[((String, Boolean), Option[Analysis])]]
}.get should contain("Test.scala" -> true -> None)
actual.get should contain("Test.scala" -> true -> None)
}
"cache more data in cache" in {
val cache = CaffeineCache[Any]
val cacheFile = "Test.scala"
@@ -167,11 +166,9 @@ class DefinitionTest extends org.specs2.mutable.Specification {
Definition.updateCache(cache)(otherCacheFile, otherUseBinary)
Definition.updateCache(cache)(cacheFile, useBinary)
val actual = cache.get(Definition.AnalysesKey)
val actual = Definition.AnalysesAccess.getFrom(cache)
actual.collect {
case s => s.asInstanceOf[Set[((String, Boolean), Option[Analysis])]]
}.get should contain("Test.scala" -> true -> None, "OtherTest.scala" -> false -> None)
actual.get should contain("Test.scala" -> true -> None, "OtherTest.scala" -> false -> None)
}
}
}


@@ -122,7 +122,7 @@ object SettingQueryTest extends org.specs2.mutable.Specification {
.put(globalBaseDirectory, globalDirFile)
val config0 = defaultPreGlobal(state, baseFile, globalDirFile, state.log)
val config = defaultWithGlobal(state, baseFile, config0, globalDirFile, state.log)
val config = defaultWithGlobal(state, baseFile, config0, globalDirFile)
val buildUnit: BuildUnit = {
val loadedPlugins: LoadedPlugins =


@@ -17,7 +17,7 @@
### autoStartServer setting
sbt 1.1.1 adds a new global `Boolean` setting called `autoStartServer`, which is set to `true` by default.
When set to `true`, sbt shell will automatically start sbt server. Otherwise, it will not start the server until `startSever` command is issued. This could be used to opt out of server for security reasons.
When set to `true`, sbt shell will automatically start sbt server. Otherwise, it will not start the server until `startServer` command is issued. This could be used to opt out of server for security reasons.
[#3922][3922] by [@swaldman][@swaldman]
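
To opt out of the server as the note describes, a build would set the new key to `false`. A hypothetical `build.sbt` fragment; the `Global /` scoping is an assumption based on the setting being described as global:

```scala
// build.sbt (sbt 1.1.1+) -- disable automatic server startup;
// the server can still be started manually with the startServer command.
Global / autoStartServer := false
```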
