java.lang.Class#newInstance deprecated since Java 9
http://download.java.net/java/jdk9/docs/api/java/lang/Class.html#newInstance--
```
Deprecated. This method propagates any exception thrown by the nullary constructor, including a checked exception. Use of this method effectively bypasses the compile-time exception checking that would otherwise be performed by the compiler. The Constructor.newInstance method avoids this problem by wrapping any exception thrown by the constructor in a (checked) InvocationTargetException.
The call
clazz.newInstance()
can be replaced by
clazz.getDeclaredConstructor().newInstance()
The latter sequence of calls is inferred to be able to throw the additional exception types InvocationTargetException and NoSuchMethodException. Both of these exception types are subclasses of ReflectiveOperationException.
Creates a new instance of the class represented by this Class object. The class is instantiated as if by a new expression with an empty argument list. The class is initialized if it has not already been initialized.
```
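A minimal sketch of the migration in Scala, with `com.example.Foo` standing in as a placeholder for whatever class is instantiated reflectively:

```
import java.lang.reflect.InvocationTargetException

// Placeholder class name; substitute the class actually being instantiated.
val clazz: Class[_] = Class.forName("com.example.Foo")

// Deprecated since Java 9: propagates any exception thrown by the nullary
// constructor, including checked exceptions, bypassing compile-time checking.
// val instance = clazz.newInstance()

// Replacement: constructor exceptions arrive wrapped in InvocationTargetException.
val instance: Any =
  try clazz.getDeclaredConstructor().newInstance()
  catch {
    case e: InvocationTargetException => throw e.getCause
  }
```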
This adds a macro-level hack to support the `+=` operator for sourceGenerators and resourceGenerators with an RHS of type `Initialize[Task[Seq[File]]]`.
When the types match up, the macro now calls `.taskValue` automatically.
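A sketch of what this enables in a build.sbt; the generator task name and the generated file are illustrative:

```
// Hypothetical generator task; the name and generated content are illustrative.
lazy val makeVersionFile = taskKey[Seq[File]]("Generates a Version.scala source file")

makeVersionFile := {
  val file = (sourceManaged in Compile).value / "Version.scala"
  IO.write(file, """object Version { val current = "0.1.0" }""")
  Seq(file)
}

// With this change the task itself can appear on the RHS; the += macro
// inserts `.taskValue` when the types line up.
sourceGenerators in Compile += makeVersionFile

// Previously this had to be written explicitly as:
// sourceGenerators in Compile += makeVersionFile.taskValue
```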
This setting controls the maximum width of the ASCII graphs printed
by commands like `inspect tree`. The default value corresponds to the
previously hardcoded width of 40 characters.
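Assuming the setting is exposed as `asciiGraphWidth` (the key name here is an assumption, not stated above), widening the output could look like this in build.sbt:

```
// Key name assumed; adjust to whatever the setting is actually called.
asciiGraphWidth in Global := 80
```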
Copies products to the working directory, and the rest to the serviceTempDir of this service, both wrapped in the SHA-1 hash of the file contents. This is intended to minimize file copying and the accumulation of unused JAR files. Since the working directory is wiped out when the background job ends, the product JAR is deleted too. Meanwhile, the rest of the dependencies are cached for the duration of this service.
Fixes #2460, fixes #2851
Ref #2707, #2708, #2469
Unlike the previous attempts at fixing the handling of build-level
keys, this change does not change the main parsing logic, which uses
`getKey` to retrieve the key from the key map.
The fact that shell worked pre-0.13.11 means that the parsing was ok.
What this changes is just the "example" keys supplied to the parser so
the tab completion works.
Fixes #2761
With sbt 0.13.13-RC1 we rediscovered that the dependency pulled in by
Giter8 was affecting plugins. To avoid this, this change splits the
template resolver implementation out into another module called
sbt-giter8-resolver, which will be downloaded using Ivy into
`~/.sbt/0.13/templates/` and then launched reflectively through a Java
interface.
This adds a `new` command, which helps create a new build definition. The
`new` command is extensible via a mechanism called the template resolver,
which evaluates the arbitrary arguments passed to the command to find
and run a template.
As a reference implementation, [Giter8][g8] is provided as follows:
sbt new eed3si9n/hello.g8
This will run eed3si9n/hello.g8 using Giter8.
[g8]: http://www.foundweekends.org/giter8/
The compiler interface subclasses `scala.tools.nsc.Global`,
and loading this new subclass before each `compile` task forces
HotSpot JIT to deoptimize large swathes of compiled code. It's
a bit like SBT has rigged the dice to always descend the longest
ladder in a game of Snakes and Ladders.
The slowdown seems to be larger with Scala 2.12. There are a number
of variables at play, but I think the main factor here is that
we now rely on JIT to devirtualize calls to final methods in traits
whereas we used to emit static calls. JIT does a good job at this,
so long as classloading doesn't undo that good work.
This commit extends the existing `ClassLoaderCache` to encompass
the classloader that includes the compiler interface JAR. I've
resorted to adding a var to `AnalyzingCompiler` to inject the
dependency to get the cache to the spot I need it without binary
incompatible changes to the intervening method signatures.
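A rough sketch of the caching idea only, not sbt's actual `ClassLoaderCache` API: keep one classloader per set of compiler-interface JARs so repeated `compile` invocations reuse it and HotSpot's optimized code survives.

```
import java.io.File
import java.net.URLClassLoader
import scala.collection.mutable

// Illustrative cache only; sbt's ClassLoaderCache has a different shape.
object InterfaceLoaderCache {
  private val cache = mutable.Map.empty[List[File], ClassLoader]

  def apply(jars: List[File], parent: ClassLoader): ClassLoader = synchronized {
    // Reusing the loader keeps the compiler-interface Global subclass (and the
    // JIT state built around it) stable across compile tasks.
    cache.getOrElseUpdate(jars, new URLClassLoader(jars.map(_.toURI.toURL).toArray, parent))
  }
}
```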
Ref #2634
updateSbtClassifiers uses an artificially created dependency graph set
in classifiersModule. The problem is that the ivyScala instance is reused
from the outer scope, which has the user project's scalaVersion, as
demonstrated below:
scala> val is = (ivyScala in updateSbtClassifiers).eval
is: Option[sbt.IvyScala] =
Some(IvyScala(2.9.3,2.9.3,List(),true,false,true,org.scala-lang))
This change fixes #2686 by redefining ivyScala with scalaVersion and
scalaBinaryVersion scoped to updateSbtClassifiers task. The existing
scripted test was modified to reproduce the bug.
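A hedged sketch of the shape of the fix; the `IvyScala` field names and exact wiring are assumptions, not taken from the actual change:

```
// Field names on IvyScala are assumed here; the point is that the Scala version
// used for updateSbtClassifiers comes from the task scope, not the project.
ivyScala in updateSbtClassifiers := {
  ivyScala.value map { is =>
    is.copy(
      scalaFullVersion = (scalaVersion in updateSbtClassifiers).value,
      scalaBinaryVersion = (scalaBinaryVersion in updateSbtClassifiers).value
    )
  }
}
```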
Def.make could take 10099ms for 100 subprojects. This slowdown would
probably be noticeable for builds with more than 10 subprojects, which
might pause for a few seconds during load.
LogManager implementation is modified to use ManagedLogger, which can swap out backing Appenders without re-creating the log instance.
The State was also changed to track `currentCommand: Option[Exec]`. `Exec` knows the origin of the command invocation, and using that we can now send the network-originated events only to the network clients.
Combined, these changes implement log splitting between the sbt clients (channels).
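A simplified sketch of the routing idea; the types below are stand-ins, not sbt's actual `Exec` or channel API. Events from a network-originated command are delivered only to the channel that issued it:

```
// Simplified stand-ins for illustration; not sbt's actual types.
final case class Exec(commandLine: String, source: Option[String])
final case class Channel(name: String, send: String => Unit)

def routeLogEvent(event: String, current: Option[Exec], channels: Seq[Channel]): Unit =
  current.flatMap(_.source) match {
    case Some(origin) => channels.filter(_.name == origin).foreach(_.send(event)) // only the originator
    case None         => channels.foreach(_.send(event))                          // console command: broadcast
  }
```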
This is the beginning of a lightweight client, which talks to the
server over a Contraband-generated JSON API. Given that the server is
started on port 5173:
```
$ cd /tmp/bogus
$ sbt client localhost:5173
> compile
StatusEvent(Processing, Vector(compile, server))
StatusEvent(Ready, Vector())
StatusEvent(Processing, Vector(, server))
StatusEvent(Ready, Vector())
```
This adds support for generating synthetic subprojects from an auto plugin.
In addition, a method called `projectOrigin` is added to distinguish
Organic, BuildExtra, ProjectExtra, and GenericRoot.
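A hedged sketch of what this can look like; the plugin and project names are illustrative, and `extraProjects` is assumed to be the auto plugin hook for contributing synthetic subprojects:

```
import sbt._
import Keys._

// Names are illustrative; `extraProjects` is assumed to be the hook.
object DocsSupportPlugin extends AutoPlugin {
  override def trigger = allRequirements

  // Synthetic subprojects contributed by the plugin itself.
  override def extraProjects: Seq[Project] =
    Seq(
      Project("docs", file("docs-synthetic"))
        .settings(publish := {})
    )
}
```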
Forward-port of #2717 and #2738
PR #2469 added build keys to tab completion, with the side effect of
considering them as available candidates in key selection, thus making
sbt think that some inputs were ambiguous (e.g. `baseDirectory`): should
it apply to the current project or to the build-level key?
This commit fixes this issue by improving the key selection (sketched below):
- If there's no candidate, we return the default key.
- If there's a single possible project-level key, and zero or more
build-level keys, then we select the project-level key.
- If there are zero project-level keys, and a single build-level key,
then we select the build-level key.
- If there are multiple candidates, sbt says that the input is
ambiguous.
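A minimal sketch of that selection rule; the types are simplified and this is not the actual parser code:

```
// Simplified illustration of the selection rule above.
def selectKey[K](default: K, projectLevel: Seq[K], buildLevel: Seq[K]): Either[String, K] =
  (projectLevel, buildLevel) match {
    case (Seq(), Seq())  => Right(default)          // no candidate: fall back to the default key
    case (Seq(p), _)     => Right(p)                // one project-level key wins over build-level ones
    case (Seq(), Seq(b)) => Right(b)                // only a single build-level key
    case _               => Left("Ambiguous input") // anything else is ambiguous
  }
```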
Fixes #2707
sbt's shell provided completion only for keys that were relative to a
defined project, but didn't provide completion for keys that belong to
the build definition only.
This commit fixes this issue by defining a new kind of `Parser` (from
which completions are generated) which runs its input simultaneously on
distinct parsers. We now define a parser for project-level keys and
another parser for build-level keys. These two parsers are eventually
combined, and we get the completions of both parsers.
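A hedged sketch of the idea using `sbt.complete` parsers; the example keys are stand-ins, and the real implementation combines the actual project-level and build-level key parsers:

```
import sbt.complete.DefaultParsers._
import sbt.complete.Parser

// Stand-in key parsers; the real ones are built from the key maps.
val projectLevelKeys: Parser[String] = literal("compile") | literal("test") | literal("baseDirectory")
val buildLevelKeys: Parser[String]   = literal("baseDirectory") | literal("version")

// Running the input against both alternatives means tab completion offers
// candidates from both parsers.
val keyParser: Parser[String] = projectLevelKeys | buildLevelKeys
```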
Fixes sbt/sbt#2460
The definition of `DefinesClass` has changed from being a function
`File => String => Boolean` to just a function `String => Boolean`. The
changes in this commit reflect that fact.
Also, this commit implements a newly introduced PerClasspathEntryLookup.
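Illustrative only, showing the shape of the type change rather than the actual interface definitions:

```
import java.io.File

object DefinesClassShape {
  // before: given a classpath entry, produce a lookup from class name to
  // "does this entry define that class?"
  type Old = File => String => Boolean

  // after: the classpath entry is fixed up front by PerClasspathEntryLookup,
  // so only the per-class-name lookup remains
  type New = String => Boolean
}
```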
When running an sbt script, this change lets users on UNIX and
Windows platforms use the native file extensions, i.e. none/.sh or
.bat/.cmd respectively. The code copies the file to the sbt
boot/hash/src_managed directory with a .scala extension.
Adds `trackInternalDependencies` and `exportToInternal` settings. These
can be used to control whether to trigger compilation of dependent
subprojects when you call `compile`. Both keys will take one of three
values: `TrackLevel.NoTracking`, `TrackLevel.TrackIfMissing`, and
`TrackLevel.TrackAlways`. By default they are both set to
`TrackLevel.TrackAlways`.
When `trackInternalDependencies` is set to `TrackLevel.TrackIfMissing`,
sbt will no longer try to compile internal (inter-project) dependencies
automatically, unless there are no `*.class` files (or JAR file when
`exportJars` is `true`) in the output directory. When the setting is
set to `TrackLevel.NoTracking`, the compilation of internal
dependencies will be skipped. Note that the classpath will still be
appended, and dependency graph will still show them as dependencies.
The motivation is to save the I/O overhead of checking for the changes
on a build with many subprojects during development. Here's how to set
all subprojects to `TrackIfMissing`.
    lazy val root = (project in file(".")).
      aggregate(....).
      settings(
        inThisBuild(Seq(
          trackInternalDependencies := TrackLevel.TrackIfMissing,
          exportJars := true
        ))
      )
The `exportToInternal` setting allows the dependee subprojects to opt
out of the internal tracking, which might be useful if you want to
track most subprojects except for a few. The intersection of the
`trackInternalDependencies` and `exportToInternal` settings will be
used to determine the actual track level. Here's an example to opt-out
one project:
    lazy val dontTrackMe = (project in file("dontTrackMe")).
      settings(
        exportToInternal := TrackLevel.NoTracking
      )