When source generators write into the unmanaged source directory, bad
things can happen. Continuous builds will loop indefinitely, and
compilation will fail because the generated sources get added to the
source list twice, causing the incremental compiler to complain about
compiling classes it has already seen. My two-pronged solution is to
de-duplicate the sources task and to filter out managed source files in
watch sources. The drawback to the latter is that it causes the source
generation task to be executed twice per compile.
In https://github.com/sbt/io/pull/142, I add a new API for watching for
source file events. This commit updates sbt to use the new
EventMonitor-based API. The EventMonitor has an anti-entropy parameter,
so that multiple events on the same file within a short window of time
do not trigger a build. I add a key to tune it.
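As a hedged illustration of tuning that window from build.sbt (the key name `watchAntiEntropy` and its duration type are assumptions here, not taken from this change):
```scala
import scala.concurrent.duration._

// Hypothetical key name and type; adjust to whatever this change actually introduces.
watchAntiEntropy := 500.millis
```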
The implementation of executeContinuously is pretty similar. The main
changes are that shouldTerminate now blocks (EventMonitor spins up a
thread to check the termination condition) and that the
EventMonitor.watch method returns only a Boolean. This is because
the event monitor contains mutable state. It does, however, have a
state() method that returns an immutable snapshot of that state.
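A minimal sketch of the resulting control flow, assuming a simplified EventMonitor interface (the real one lives in sbt/io and differs in detail):
```scala
// Simplified stand-in for the sbt/io EventMonitor described above.
trait EventMonitor {
  def watch(): Boolean // blocks; true => a relevant file event occurred, false => terminate
  def state(): AnyRef  // immutable snapshot of the monitor's internal state
}

// Re-run the build on each event until the termination condition fires.
def watchLoop(monitor: EventMonitor)(runBuild: () => Unit): Unit = {
  @annotation.tailrec
  def loop(): Unit =
    if (monitor.watch()) {
      runBuild()
      loop()
    }
  loop()
}
```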
The existing filter caused SourceModificationWatch.watch to ignore
deleted files, because !file.exists implies !file.isFile. The intention
of the filter was to exclude directories whose names end in
".scala".
Fixes #3849
This brings back the 0.13 logic:
```scala
def setGlobalLogLevel(s: State, level: Level.Value): State = {
  s.globalLogging.full match {
    case a: AbstractLogger => a.setLevel(level)
    case _                 => ()
  }
  s.put(BasicKeys.explicitGlobalLogLevels, true).put(Keys.logLevel.key, level)
}
```
* 1.1.x:
  Update mimaPreviousArtifacts/sbt.version
  Introduce SBT_GLOBAL_SERVER_DIR env var to override too long paths
  Handle very long socket file paths on UNIX

Conflicts:
  project/build.properties
This works around the name conflict between the sbt.test package and sbt.Keys.test.
1. The sbt.test package is renamed to sbt.scriptedtest. This allows 1.0 plugins and builds to use `test` to mean `Keys.test`.
2. To keep binary compatibility for sbt 0.13 scripted, I am adding `sbt.test.ScriptedRunner` and `sbt.test.ScriptedTests` to the `scripted-plugin` artifact.
3. Another affected user is the Giter8 plugin, which uses ScriptedPlugin. Since the interactions are limited to `sbt.ScriptedPlugin.*`, we should be fine here. - https://github.com/foundweekends/giter8/blob/v0.11.0-M2/plugin/src/main/scala-sbt-1.0/giter8/SBTCompat.scala
Fixes #3538
This brings `sbt.ScriptedPlugin` into the sbt mothership as `sbt.plugins.ScriptedPlugin`.
In addition, `sbt.plugins.SbtPlugin` is added, which enables the scripted plugin and sets `sbtPlugin := true`.
This allows plugin authors to bring in the scripted plugin by writing:
```scala
lazy val root = (project in file("."))
.enablePlugins(SbtPlugin)
```
Fixes #3841
This fixes the console task, which internally uses JLine. When `console` is started from batch mode, the tab character is printed as-is because JLine has not yet been initialized.
Calling `usingTerminal` initializes the terminal and restores it afterwards.
There are just too many instances in which sbt's code relies on
the `lastModified`/`setLastModified` semantics, so instead of moving
to `get`/`setModifiedTime`, we use new IO calls that offer the new
timestamp precision, but retain the old semantics.
Previously I was seeing the error upon the first scripted test. I thought it was because Main was somehow not loaded early enough; it might just be because scripted technically runs as part of the build.
Ref sbt/io#110
Fixes #3823
When you launch a second instance of sbt on a build, prior to this change it displayed `java.io.IOException: sbt server is already running` on every command. This makes it a bit less aggressive and just displays a warning once:
```
[warn] Is another instance of sbt is running on this build?
[warn] Running multiple instances is unsupported
```
Even with `publishArtifact := false`, the user is still forced to define a (dummy) resolver that is never used, e.g. `publishTo := { Some("publishMeNot" at "https://publish/me/not") }`.
Otherwise the following error is thrown:
```
publish
[error] java.lang.RuntimeException: Repository for publishing is not specified.
[error] at scala.sys.package$.error(package.scala:27)
[error] at sbt.Classpaths$.$anonfun$getPublishTo$1(Defaults.scala:2436)
[error] at scala.Option.getOrElse(Option.scala:121)
[error] at sbt.Classpaths$.getPublishTo(Defaults.scala:2436)
[error] at sbt.Classpaths$.$anonfun$ivyBaseSettings$48(Defaults.scala:1917)
```
This is to avoid it initialising Log4J2 (via SLF4J), which we initialise
ourselves programmatically in LogExchange. Also, there is no need to
call removeAll in initialState.
Fixes #3787
Ref https://github.com/sbt/io/pull/96
Under RFC 8089, both u1 (the minimal `file:/path` form) and u3 (the traditional `file:///path` form) are legal, but many of the other platforms expect traditional u3.
This will increase the compatibility and usability of sbt server, for example when integrating with Vim.
Although in theory the fix in #3776 should be preferable to
synchronizing templateStats() manually, it turns out that we
still get errors in some tests. So we are reverting to a
synchronized section while we investigate.
This reverts commit ee90917cc4.
Fixes #3786
To configure the log level of the server, this introduces a new task key named `serverLog`. The idea is to set the level using `Global / serverLog / logLevel`. It will also check the global log level and, if all else fails, fall back to Warn.
```
lazy val level: Level.Value = (s get serverLogLevel) orElse (s get logLevel) match {
  case Some(x) => x
  case None    => Level.Warn
}
```
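For example, in build.sbt (the chosen level here is purely illustrative):
```scala
// Raise the sbt server log level as described above.
Global / serverLog / logLevel := Level.Debug
```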
`NGUnixDomainSocket` throws `java.io.IOException` instead of `SocketException`, probably because `SocketException` does not expose a constructor with a `Throwable` parameter.
To allow clients to disconnect, we need to catch `IOException`.
This is an implementation of the `textDocument/definition` request.
It supports types only, and only when the type is found in the Zinc Analysis. When source(s) are found, the editor opens the potential source(s).
This simple implementation does not use semantic data.
During the processing of `textDocument/didSave`, we start collecting the locations of Analysis files via `lspCollectAnalyses`.
Later on, when the user asks for `textDocument/definition`, sbt server invokes a Future call to lspDefinition, which directly reads the files to locate the definition of a class.
In addition to TCP, this adds sbt server support for IPC (interprocess communication) using Unix domain sockets and Windows named pipes.
The use of Unix domain sockets has performance and security benefits.
The creation of a backgroundLog was always using Debug as the
logging level for the console and the backing log. This commit sets the
levels to those used by the caller. Fixes #3655
This adds a new option `dev` to the `reboot` command, which deletes only the current sbt artifacts from the boot directory. `reboot dev` reads actively from `build.properties` instead of using the current state, since `reboot` can restart into another sbt version.
In general, `reboot dev` is intended for the local development of sbt.
Fixes #3590
Using a recursive Source meant that ~ looked into target. If you have
any source generators and use ~ with anything that invokes them, like
~compile, then the act of generating sources triggers ~ to
re-execute compile (perhaps only on macOS, where the NIO WatchService
just polls, after an initial delay).
Requires sbt/io#78
Fixes #3501
This adds an sbt.watch.mode system property that, if set to 'polling', will
use PollingWatchService instead of WatchServiceAdapter (nio).
On macOS this will default to 'polling' and on all other platforms to 'nio'.
This is a temporary workaround for users affected by #3527.
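For example, forcing the polling implementation when launching sbt:
```
$ sbt -Dsbt.watch.mode=polling
```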
In ca71b4b902 I went about fixing the
inexhaustive matching in Scope's resolveProjectBuild and
resolveProjectRef. Looking back, the change was wrong.
For resolveProjectBuild the new implementation is less wrong, but still
not great, seeing as it doesn't actually do any build resolving.
For resolveProjectRef the new implementation now blows up instead of
lying, which means it's less lenient, more "fail-fast".
isProjectThis is unused; it is a remnant of the pre-AutoPlugin days when build
settings were defined in Plugin.settings.
toString added for REPL testing:
```
scala> Zero / Zero / Zero / name
res0: sbt.SlashSyntax.ScopeAndKey[sbt.SettingKey[String]] = Zero / Zero / Zero / name
```
Prior to this change `Zero / Zero / Zero / name` broke as follows:
```
scala> Zero / Zero / Zero / name
Zero / Zero / Zero / name
<console>:18: error: inferred type arguments [sbt.Zero.type] do not conform to method /'s type parameter bounds [K <: sbt.SlashSyntax.Key[K]]
Zero / Zero / Zero / name
^
```
This is the first cut of the Language Server Protocol on top of the sbt server; it is still a work in progress.
With this change, sbt is able to invoke the `compile` task on saving files in VS Code.
```
Provided by:
ProjectRef(uri("...."), "root") / Test / test
Dependencies:
Test / executeTests
Test / test / streams
Test / state
Test / test / testResultLogger
```
Fixes sbt/sbt#1812
This adds a unified slash syntax for both the sbt shell and the build.sbt DSL.
Instead of the current `<project-id>/config:intask::key`,
this adds `<project-id>/<config-ident>/intask/key`, where <config-ident> is the Scala identifier notation for configurations like `Compile` and `Test`.
This also adds a series of implicits called `SlashSyntax` that adds `/` operators to project references, configurations, and keys, so that the same syntax works in build.sbt.
These examples work both from the shell and in build.sbt:
Global / cancelable
ThisBuild / scalaVersion
Test / test
root / Compile / compile / scalacOptions
ProjectRef(uri("file:/xxx/helloworld/"),"root")/Compile/scalacOptions
Zero / Zero / name
The inspect command now outputs something that can be copy-pasted:
```
> inspect compile
[info] Task: sbt.inc.Analysis
[info] Description:
[info]   Compiles sources.
[info] Provided by:
[info]   ProjectRef(uri("file:/xxx/helloworld/"),"root")/Compile/compile
[info] Defined at:
[info]   (sbt.Defaults) Defaults.scala:326
[info] Dependencies:
[info]   Compile/manipulateBytecode
[info]   Compile/incCompileSetup
[info] Reverse dependencies:
[info]   Compile/printWarnings
[info]   Compile/products
[info]   Compile/discoveredSbtPlugins
[info]   Compile/discoveredMainClasses
[info] Delegates:
[info]   Compile/compile
[info]   compile
[info]   ThisBuild/Compile/compile
[info]   ThisBuild/compile
[info]   Zero/Compile/compile
[info]   Global/compile
[info] Related:
[info]   Test/compile
```
This implements a JSON-based port file. Throughout the lifetime of the sbt server there will be `cwd / "project" / "target" / "active.json"`, which contains a `url` field.
Using this `url`, potential clients such as IDEs can find out which port to connect to.
Ref #3508
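A hypothetical example of the port file's content (the exact shape and URL scheme are assumptions, not taken from this change):
```
{ "url": "tcp://127.0.0.1:5841" }
```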
Adds JVM flag `sbt.server.autostart` to enable/disable the automatic starting of sbt server with the sbt shell.
This also adds a new command `startServer` to manually start the server.
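Illustrative usage of both pieces: disable autostart at launch, then start the server manually from the shell.
```
$ sbt -Dsbt.server.autostart=false
> startServer
```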
If the read buffer contains more than 2 messages, we need to consume them all before blocking on the socket read again. For that we have to loop until the buffer no longer contains the message delimiter character.
The same problem exists in the client ServerConnection code.
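A minimal sketch of that framing loop, under assumed names (not the actual sbt server code):
```scala
// Split off and handle every complete (delimiter-terminated) message in the buffer,
// leaving any trailing partial message for the next read.
def consumeAll(buffer: StringBuilder, delimiter: Char)(handle: String => Unit): Unit = {
  var idx = buffer.indexOf(delimiter.toString)
  while (idx >= 0) {
    handle(buffer.substring(0, idx))
    buffer.delete(0, idx + 1)
    idx = buffer.indexOf(delimiter.toString)
  }
}
```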
Fixes #2776
This allows cross-building commands. When issuing a command, it detects
whether there is likely to be any Scala version incompatibility, by
checking whether all projects have the same Scala cross version
configuration; if not, it outputs a big fat warning.
This no longer injects scalaVersion at the project level, which was interfering with crossScalaVersions delegation to ThisBuild scope.
Fixes sbt/sbt#3353
The default settings and ^^ together set the correct scalaVersion based on `sbtVersion in pluginCrossBuild`, but frequently people set `scalaVersion` on an sbt plugin's subproject, which disables the feature.
This change appends the scalaVersionSetting on ^^ so that scalaVersion gets switched to 2.12.2 on `^^ 1.0.0-RC2`, etc.
Fixes #3205
Ref #3282
We used to wrap InputStream so that it would inject Thread.sleep, which then allows the thread to be cancelled, emulating a non-blocking readLine. This trick doesn't seem to work on Windows.
For non-Cygwin, just removing the wrapping actually does the job, but I couldn't get it to work for Cygwin.
To test, run some command via the network, and then type `show name` into the terminal. On Cygwin, it will not respond.
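For reference, a hedged sketch of the wrapping trick being removed here (not the actual sbt code):
```scala
import java.io.InputStream

// Poll available() and sleep between polls so the reading thread remains
// interruptible, emulating a non-blocking readLine on top of a blocking stream.
final class InterruptibleInputStream(in: InputStream) extends InputStream {
  override def read(): Int = {
    while (in.available() == 0) Thread.sleep(10) // InterruptedException cancels the read
    in.read()
  }
}
```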
This commit adapts `Watched` so that it supports the new `WatchService`
infrastructure introduced in sbt/io. The goal of this infrastructure is
to provide an API for, and several implementations of, services that
monitor changes to the file system.
The service used to monitor the file system can be configured with the
key `watchService`.
This undeprecates the syntax, but at the same time moves it out of
implicit scope, therefore requiring an `import TupleSyntax._` to opt in
to the old syntax.
Before, we were not preserving the value of `insideXXX`. This commit makes
sure that we handle much more complex scenarios and report them
successfully. Have a look at the tests.
This ports sbt-cross-building's cross (`^`) and switch (`^^`) commands.
Instead of making it a plugin, the default settings are now changed
to use `sbtVersion in pluginCrossBuild` for the sbt dependency.
In sbt 0.13.15, in addition to notifying the user about the existence of
sbt's shell, a feature was added to allow the user to switch to sbt's
shell, a more proactive approach than just displaying a message.
Unfortunately sbt is often unintentionally invoked in shell scripts in
"interactive mode" when no interaction is expected, by, for example,
invoking `sbt package` instead of `sbt package < /dev/null`. In that
case hitting [ENTER] would silently trigger sbt to run its shell,
easily wrecking the script. In addition to that, I was unhappy with the
implementation as it created a tight coupling between sbt's command
processing abstraction and sbt's shell command.
If you want to stay in sbt's shell after running a task like `package`
then invoke sbt like so:
sbt package shell
Fixes #3091
This is a change in strategy.
The motivation is the need to find a good balance between:
+ informing the uninformed that would benefit from this information, &
+ not spamming the already informed
Making it dependent on "compile" being present in remainingCommands will
probably make it trigger for, for example, Maven users who are used to
running "mvn compile" and always run "sbt compile", and who therefore
are unnecessarily suffering terribly slow compile speeds by starting up
the JVM and sbt every time.
Fixes #3091, fixes #3097
This commit does the following things:
* Removes the boolean from the instance context passed to the linter.
* Prohibits the use of `value` inside anonymous functions.
* Improves the previous check of `value` inside if expressions.
The improvements came thanks to the fix of an oversight in the
traverser. As a result, several implementations of tasks have been
rewritten because of new compilation failures from both checks.
Note that the new check that prohibits the use of value inside anonymous
functions ignores all the functions whose parameters have been
synthesized by scalac (that can happen in a number of different
scenarios, like for comprehensions). Other scripted tests have also been
fixed.
Running `.value` inside an anonymous function yields the following
error:
```
[error] /data/rw/code/scala/sbt/main-settings/src/test/scala/sbt/std/TaskPosSpec.scala:50:24: The evaluation of `foo` inside an anonymous function is prohibited.
[error]
[error] Problem: Task invocations inside anonymous functions are evaluated independently of whether the anonymous function is invoked or not.
[error]
[error] Solution:
[error] 1. Make `foo` evaluation explicit outside of the function body if you don't care about its evaluation.
[error] 2. Use a dynamic task to evaluate `foo` and pass that value as a parameter to an anonymous function.
[error]
[error] val anon = () => foo.value + " "
[error] ^
```
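A hedged build.sbt-style sketch of the rejected pattern and its fix (`foo` here is a hypothetical key, not one of sbt's):
```scala
import sbt._

val foo = taskKey[String]("example task")

// Rejected by the new check: `.value` inside an anonymous function body.
// val anon = Def.task { () => foo.value + " " }

// Accepted: evaluate the task first, then close over the plain value.
val anon = Def.task {
  val v = foo.value
  () => v + " "
}
```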
`.value` inside the if expression of a regular task is unsafe. The wrapping task
will always evaluate the value, no matter what the if predicate yields.
This commit adds the infrastructure to lint code for every sbt DSL
macro. It also adds examples of neg tests that check that the DSL checks
are in place.
The sbt checks yield an error for this specific case because we may want to
explore changing this behaviour in the future. The solutions are
straightforward and explained in the error message, which looks like
this:
```
EXPECTED: The evaluation of `fooNeg` happens always inside a regular task.
PROBLEM: `fooNeg` is inside the if expression of a regular task.
Regular tasks always evaluate task inside the bodies of if expressions.
SOLUTION:
1. If you only want to evaluate it when the if predicate is true, use a dynamic task.
2. Otherwise, make the static evaluation explicit by evaluating `fooNeg` outside the if expression.
```
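A hedged sketch (with hypothetical keys) of the flagged pattern and the dynamic-task alternative suggested in that message:
```scala
import sbt._

val fooNeg = taskKey[String]("example task")
val cond   = settingKey[Boolean]("example predicate")

// Flagged by the check: in a regular task, fooNeg is evaluated
// regardless of what the if predicate yields.
// val eager = Def.task { if (cond.value) fooNeg.value else "" }

// Dynamic task: fooNeg is only evaluated when the predicate is true.
val onlyWhenTrue = Def.taskDyn {
  if (cond.value) Def.task(fooNeg.value) else Def.task("")
}
```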
Aside from those solutions, this commit also adds a way to disable any
DSL check by using the new `sbt.unchecked` annotation. This annotation,
similar to `scala.annotation.unchecked`, disables compiler output. In our
case, it will disable any task DSL check, making it silent.
Examples of positive checks have also been added.
There have been only two places in `Defaults.scala` where this check has
made compilation fail.
The first one is inside `allDependencies`. To ensure that we still have
static dependencies for `allDependencies`, I have hoisted up the value
invocation outside the if expression. We may want to explore adding a
dynamic task in the future, though. We are doing unnecessary work there.
The second one is inside `update` and is not important because it's not
exposed to the user. We use a `taskDyn`.
This change is necessary in the cases where we have global
initialization issues that have no position, like:
```
[info] [error] scala.reflect.internal.MissingRequirementError: object scala in compiler mirror not found.
```
Before, it was failing with a `sys.error` exception. Now we will report
these issues with a console reporter that is not meant to be
thread-safe.
Fixes #3178
While working on the Scopes and Scope Delegation document, I noticed that the term Global in sbt is used with two different meanings:
1. the universal fallback scope component `*`
2. an alias for GlobalScope
This disambiguates the two by renaming the ScopeAxis instance to Zero.
Since this is mostly internal to sbt code, the impact on users should be minimal.
The `cachedUpdate` implementation does not need to be in `Defaults`
since it's not using any of the tasks/settings defined there; that's
`updateTask`'s job.
This commit moves the utilities required by `updateTask` to the
`sbt.internal.librarymanagement` namespace.
This commit makes some changes to the implementation with the purpose of
making this code more readable. I found this rewrite necessary
as I was implementing the dependency lock file.
This commit has two goals:
* Simplify the `load` API endpoints, removing the unused ones to reduce
the surface of the API.
* Add documentation to the main `load` methods.
sbt has a feature to show timed logs for every operation at startup.
However, its output is cluttered, and users cannot tell how much time
individual methods consume, nor whether they call other methods.
This commit improves the status quo by adding indentation.
This commit reduces the complexity around `loadPluginDefinition` et al.
`pluginDefinitionLoader` is not used anywhere in sbt, so the extra
definitions are removed.
Both the implementation of `loadPluginDefinition` and
`pluginDefinitionLoader` are reduced to a bare minimum where the
components at hand (definition classpath, dependency classpath) are
properly defined.
Documentation for the three methods has been added.
It mainly does three things:
* Clean up the implementation, removing unused values like
`globalPluginDefs`.
* Make the implementation more readable.
* And the big one: Remove the creation of a classloader that we were
instantiating but whose value we were throwing away. This has an impact
on performance, but I have yet to benchmark how large it is.
The previous commit used `synchronized` to ensure that the global reporter
was not reporting errors from other parsing sessions. Theoretically,
though, sbt could invoke parsing in parallel, so it's better to
remove the `synchronized` block, which could also be preventing some
JVM optimizations.
This commit solves the issue by introducing a reporter id.
A reporter id is a unique identifier that is mapped to a reporter. Every
parsing session gets its own identifier, which is then reused for
recursive parsing. Error reports between recursive parses cannot collide
because the reporter is cleaned in `parse`.
The previous implementation was instantiating a toolbox every
time it parsed an sbt file (and even recursively!!!).
This is inefficient and translates to instantiating a `ReflectGlobal`
every time we want to parse something.
This commit takes another approach:
1. It removes the dependency on `ReflectGlobal`.
2. It reuses the same `Global` and `Run` instances for parsing.
This is as efficient as it can get without doing a whole overhaul of it.
I think that in the future we may want to reimplement it to avoid the
recursive parsing used to work around Scalac's bug.
This change was proposed by Jason in case the new parsing mechanism
implemented later on has to be reverted. This change provides a good
baseline, but it's far from ideal with regard to the readability of the
parser and its performance.
The previous implementation was using the Scala runtime universe to
check whether or not a plugin had an `autoImport` member. This is a bad
idea for the following reasons:
* The first time you use it, you class-load the whole Scala compiler
universe. Not efficient; measurements say this costs about a second.
* There is a small overhead to going through the reflection API.
There is a better approach, which consists of checking whether `autoImport`
exists with plain Java reflection. Since the class is already class
loaded, we check for:
* A class file named after the plugin FQN that includes `autoImport$` at
the end, which means that an object named `autoImport` exists.
* A field in the plugin class that is named `autoImport`.
This complies with the sbt plugin specification:
http://www.scala-sbt.org/1.0/docs/Plugins.html#Controlling+the+import+with+autoImport
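A hedged sketch (not sbt's actual code) of that plain-Java-reflection check; the exact class-file naming convention is assumed:
```scala
// Returns true if the plugin defines an `autoImport` object or field.
def hasAutoImport(loader: ClassLoader, pluginClassName: String): Boolean = {
  val objectExists =
    try { Class.forName(pluginClassName + "$autoImport$", false, loader); true }
    catch { case _: ClassNotFoundException => false }
  def fieldExists =
    try { Class.forName(pluginClassName, false, loader).getDeclaredField("autoImport"); true }
    catch { case _: ReflectiveOperationException => false }
  objectExists || fieldExists
}
```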
We need to communicate error states from the thread, so I added a `Future[Unit]` called `ready`.
If something goes wrong during startup, for example if the port is already taken, this can be used to communicate back to the main thread and display the error accordingly.
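A hedged sketch of that `ready` handshake (names and details are assumed, not taken from the actual code):
```scala
import scala.concurrent.{ Future, Promise }

class ServerThread {
  private val readyPromise = Promise[Unit]()
  val ready: Future[Unit] = readyPromise.future

  def run(): Unit =
    try {
      bind()                     // e.g. open the server socket
      readyPromise.success(())   // startup succeeded; unblock anyone awaiting `ready`
    } catch {
      case e: java.io.IOException => readyPromise.failure(e) // e.g. port already taken
    }

  private def bind(): Unit = ()  // placeholder for the real startup work
}
```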
+ Don't notify ScriptMain users by moving the logic to xMain
+ Only trigger shell if shell is a defined command
+ Use existing Shell/BootCommand strings instead of new ones
The sbt/sbt-launcher-package doesn't invoke sbt with the "shell"
command. sbt has a mechanism for handling this in its "boot" command
that adds an "iflast shell" to the commands. Handle this when displaying
the "Executing in batch mode" warning.
Fixes #3004
Notify & enable users to stay in sbt's shell on the warm JVM by hitting
[ENTER] while sbt is running.
Looks like this; first I run 'sbt about', then I hit [ENTER]:
```
$ sbt about
[info] !!! Executing in batch mode !!! For better performance, hit [ENTER] to remain in the sbt shell
[info] Loading global plugins from /Users/dnw/.dotfiles/.sbt/0.13/plugins
[info] Loading project definition from /s/t/project
[info] Set current project to t (in build file:/s/t/)
[info] This is sbt 0.13.14-SNAPSHOT
[info] The current project is {file:/s/t/}t 0.1.0-SNAPSHOT
[info] The current project is built against Scala 2.12.1
[info] Available Plugins: sbt.plugins.IvyPlugin, sbt.plugins.JvmPlugin, sbt.plugins.CorePlugin, sbt.plugins.JUnitXmlReportPlugin, sbt.plugins.Giter8TemplatePlugin
[info] sbt, sbt plugins, and build definitions are using Scala 2.10.6
>
>
```
Fixes #2987
Having sbt.version set in project/build.properties is a best practice
because it makes the build more deterministic and reproducible.
With this change, sbt, after ensuring that the base directory is probably
an sbt project, writes out sbt.version in project/build.properties if it
is missing.
Fixes #754
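For illustration, the kind of file this writes out (the version shown is just an example; presumably sbt writes its own running version):
```
sbt.version=1.1.0
```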
Before this commit, using dotty in your sbt project required adding:
scalaCompilerBridgeSource := ("ch.epfl.lamp" % "dotty-sbt-bridge" %
scalaVersion.value % "component").sources()
to your build.sbt. We might as well do this automatically; this reduces
the boilerplate for using dotty in your project to:
scalaOrganization := "ch.epfl.lamp"
scalaVersion := "0.1.1-SNAPSHOT"
scalaBinaryVersion := "2.11" // dotty itself is only published as a
// 2.11 artefact currently
java.lang.Class#newInstance deprecated since Java 9
http://download.java.net/java/jdk9/docs/api/java/lang/Class.html#newInstance--
```
Deprecated. This method propagates any exception thrown by the nullary constructor, including a checked exception. Use of this method effectively bypasses the compile-time exception checking that would otherwise be performed by the compiler. The Constructor.newInstance method avoids this problem by wrapping any exception thrown by the constructor in a (checked) InvocationTargetException.
The call
clazz.newInstance()
can be replaced by
clazz.getDeclaredConstructor().newInstance()
The latter sequence of calls is inferred to be able to throw the additional exception types InvocationTargetException and NoSuchMethodException. Both of these exception types are subclasses of ReflectiveOperationException.
Creates a new instance of the class represented by this Class object. The class is instantiated as if by a new expression with an empty argument list. The class is initialized if it has not already been initialized.
```
This adds a macro-level hack to support the += op for sourceGenerators and resourceGenerators with an RHS of Initialize[Task[Seq[File]]].
When the types match up, the macro now calls `.taskValue` automatically.
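A hedged build.sbt sketch of what this enables (the generator body is a trivial placeholder):
```scala
// Before this change, an explicit .taskValue was required:
// Compile / sourceGenerators += Def.task { Seq.empty[File] }.taskValue

// With the macro-level hack, the conversion is applied automatically:
Compile / sourceGenerators += Def.task { Seq.empty[File] }
```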
This setting controls the maximum width of the ASCII graphs printed
by commands like `inspect tree`. The default value corresponds to the
previously hardcoded value of 40 characters.
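A hypothetical build.sbt usage (the setting name `asciiGraphWidth` is an assumption here, not taken from this change):
```scala
// Widen the ASCII graphs printed by `inspect tree` (the value is illustrative).
Global / asciiGraphWidth := 80
```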
Copies products to the working directory, and the rest to the serviceTempDir of this service, both wrapped in a SHA-1 hash of the file contents. This is intended to minimize file copying and the accumulation of unused JAR files. Since the working directory is wiped out when the background job ends, the product JAR is deleted too. Meanwhile, the rest of the dependencies are cached for the duration of this service.
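A hedged sketch (not the actual implementation) of what such content-addressed copying can look like, assuming the SHA-1 names an intermediate directory:
```scala
import java.math.BigInteger
import java.nio.file.{ Files, Path }
import java.security.MessageDigest

// Place each file under a directory named after the SHA-1 of its contents,
// so an unchanged file is reused instead of being copied and accumulated again.
def copyHashed(file: Path, cacheDir: Path): Path = {
  val digest = MessageDigest.getInstance("SHA-1").digest(Files.readAllBytes(file))
  val sha1   = String.format("%040x", new BigInteger(1, digest))
  val target = cacheDir.resolve(sha1).resolve(file.getFileName.toString)
  if (!Files.exists(target)) {
    Files.createDirectories(target.getParent)
    Files.copy(file, target)
  }
  target
}
```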