**Problem**
I notice that the synthetic root project ends up conflicting with
the projectMatrix on Scala 3, when the name of the matrix
matches the directory name, which is fairly common.
**Solution**
Append `-root` to the synthetic root project when multiple subprojects are found.
**Problem**
Coursier has been the default library management engine for a while now,
and we don't need to support two engines.
**Solution**
This removes the `useCoursier` setting.
**Problem**
We want a more flexible way of aggregating subprojects.
**Solution**
This implements subproject filtering as a replacement for
the subproject axis in the `act` command.
**Problem**
Since Scala cross building works better than plugin cross building (`^^`),
it was common for plugin authors to encode plugin cross building as Scala cross building,
given that there is usually zero or one sbt release per Scala version.
**Solution**
This brings the setting into SbtPlugin so plugin authors can cross build
using sbt 2.x.
**Problem**
The new common settings feature doesn't work when the root isn't created by the user.
**Solution**
This fixes common settings by calling `expandCommonSettingsPerBase(...)` on
the synthetic root's base first.
**Problem**
There have been previous attempts, like #1878, to skip publishing of the root,
but it seems the behavior has regressed at some point.
**Solution**
This skips publishing on the synthetic aggregate root project.
**Problem**
`run` currently blocks all other commands, such as BSP commands.
**Solution**
`run` no longer blocks the command execution loop.
Instead it pauses the prompt on the current command channel.
**Problem**
testQuick currently does not invalidate on argument changes.
**Solution**
This includes the test arguments in the digest, so changing them invalidates the tests.
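For example (a hedged sketch; the framework arguments are just placeholders), changing the arguments passed to the test framework would now cause testQuick to consider the affected tests out of date:

```scala
// Hypothetical example: these arguments are now part of the digest,
// so editing them invalidates previously passed tests under testQuick.
Test / testOptions += Tests.Argument(TestFrameworks.ScalaTest, "-oD")
```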
---------
Co-authored-by: adpi2 <adrien.piquerez@gmail.com>
**Problem**
Currently scripted does version checking that blocks sbt 2.x plugins
from being cross-published from sbt 1.x.
**Solution**
Remove the sbt version matching.
**Problem**
During sbt 1.x we created scripted-sbt-redux to accommodate a package name change.
**Solution**
We can move the module back to scripted-sbt, which simplifies things.
**Problem**
The current implementation of testQuick depends on timestamps,
which probably won't work well with the new consistent analysis store or
the idea of remote caching.
**Solution**
This is a step towards cached testing by making the incrementality hermetic
(it no longer depends on timestamps). Instead it calculates an aggregated
SHA-256 of the class files involved in the test.
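A minimal sketch of the idea, not sbt's actual implementation:

```scala
import java.nio.file.{ Files, Path }
import java.security.MessageDigest

// Aggregate the SHA-256 of the class files into a single digest so that the
// result depends only on file contents and ordering, never on timestamps.
def aggregatedDigest(classFiles: Seq[Path]): String = {
  val md = MessageDigest.getInstance("SHA-256")
  classFiles.sortBy(_.toString).foreach(p => md.update(Files.readAllBytes(p)))
  md.digest().map("%02x".format(_)).mkString
}
```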
The BSP server didn't reset old diagnostic messages sent to BSP clients under
certain circumstances. This commit mitigates this edge case and ensures that
diagnostics for files that previously had compilation problems are properly
reset when fresh diagnostics messages are sent.
The culprit was a mismatch of map keys: files with problems were sometimes recorded
under an absolute path, but later looked up by virtual path.
`clean` should delete the packed dir. If it does not,
the next `compileIncremental`, which is a cache hit, will see that
the packed dir is already there and will not unpack it.
This follows on from #7470, to include all sources, not just managed and
unmanaged, in the source jar, along with all resources (previously only
unmanaged resources were included).
This means that if, for whatever crazy reason, someone does modify the
`sources` task to include additional sources or filter out sources, rather than
using the managed or unmanaged source mechanisms, their changes will still be
reflected in the source jar.
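For example (a hypothetical build.sbt tweak, assuming the `Seq[File]` shape of `sources`), a file appended directly to the `sources` task now also ends up in the source jar:

```scala
// Added straight to sources, bypassing managedSources/unmanagedSources;
// packageSrc now picks it up as well.
Compile / sources += baseDirectory.value / "extra" / "Extra.scala"
```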
See also https://github.com/sbt/zinc/pull/1326
This adds a new setting `enableConsistentCompileAnalysis`,
which enables the new "Consistent" Analysis format,
which is faster and more repeatable than the status quo.
This is initialized to `true` by default.
You can opt out either via the setting or by using
`-Dsbt.analysis2024=false`.
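A minimal build.sbt sketch of the opt-out; whether it needs `ThisBuild` or per-project scoping is an assumption here:

```scala
// Opt out of the new Consistent Analysis format for the whole build.
ThisBuild / enableConsistentCompileAnalysis := false
```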
When loading a scripted test, sbt creates a jar file and loads it.
The path of the jar file is the same for all the batched tests.
We must prevent the JDK from caching this jar file to force a reload after each test.
Otherwise sbt tries to load the auto-plugins of a previous test and fails.
When a macro was compiled against a new scala-library (say 2.13.15),
the runtime classpath of the Scala compiler should not contain an older
scala-library. Otherwise the macro can cause a NoSuchMethodException
during expansion.
This commit updates scala-library in the scalaInstance to the version
on the project's dependency classpath.
The `csrSameVersions` setting can be used to keep dependencies at the
same version.
By default it's used for Scala artifacts. They need to be kept at the
same version because the compiler / reflect are built with
cross-artifact inlining enabled.
`csrSameVersions := Seq(Set(scala-library, scala-reflect, scala-compiler, scalap))`
Users can make use of the new setting in the following way:
- `csrSameVersions += Set[InclExclRule]("com.corp" % "lib", "com.corp" % "lub")`
- `csrSameVersions += Set[InclExclRule]("com.corp" % "lib-family-*")`
When expanding a macro compiled against a new Scala library, the
runtime classpath of the compiler should not contain an older library.
Otherwise a NoSuchMethodException can occur.
A similar issue is present when running the Scala repl through sbt.
An input line compiled against a new library could fail to run if
the repl's runtime classpath is on an old version.
There are a couple of settings / configs that affect this, summary
below. The change in this PR seems to be the most narrow.
`scalaModuleInfo.value.overrideScalaVersion` in sbt
- affects how sbt sets coursier's `forceScalaVersion` (see below)
- used by librarymanagement.ivy. If true, adds an OverrideScalaMediator.
See sbt/sbt#2634. Probably not relevant when using coursier.
`autoScalaLibrary` setting in sbt
- automatically adds `scala-library` (or `scala3-library`) as a project
dependency
- also used for `forceScalaVersion` (see below)
`CoursierConfiguration.autoScalaLibrary`
- if `true` then Coursier `ResolutionParams.forceScalaVersion` is set
  to `true`
- initialized by sbt to
`autoScalaLibrary.value &&
!ScalaArtifacts.isScala3(sv) &&
!Classpaths.isScala213(sv) && // added in this commit
scalaModuleInfo.forall(_.overrideScalaVersion)`
coursier `ResolutionParams.forceScalaVersion`
- if true, `scala-library` / `scala-reflect` / `scala-compiler` /
`scalap` are forced to the scala version, not actually resolved
- for Scala 3, the `scala3-library` and `scala3-compiler` versions
are forced
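As a hedged illustration of how these knobs interact (assuming the `ScalaModuleInfo` API from librarymanagement), a build that wants coursier to actually resolve the Scala artifacts rather than force them could in principle do:

```scala
// With overrideScalaVersion off, the expression above for
// CoursierConfiguration.autoScalaLibrary evaluates to false, so coursier's
// ResolutionParams.forceScalaVersion is not enabled.
scalaModuleInfo := scalaModuleInfo.value.map(_.withOverrideScalaVersion(false))
```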
For the details about this PR, please see the blog post https://eed3si9n.com/sbt-remote-cache/.
* Add cache basics
* Refactor Attributed to use StringAttributeMap, which is Map[StringAttributeKey, String]
* Implement disk cache
* Rename Package to Pkg
* Virtualize packageBin
* Use HashedVirtualFileRef for packageBin
* Virtualize compile task
Previously, in sbt 2, the Scripted plugin was not resolving any module
that would provide it with the `ScriptedTests` object, and could
therefore not run scripted tests.
With this patch, `scripted-sbt-redux` will be resolved and added to the
classloader that's used to load `sbt.scriptedtest.ScriptedTests`.
**Problem**
Plugins are topologically sorted, but plugins with equal weight could
modify the same key via `~=` etc., resulting in different builds
depending on the artifact.
**Solution**
This attempts to fix that by first sorting the selected plugins
by class name before sorting them topologically.
Problem
-------
Starting with Scala 2.13.12, Scala 2 has in-sourced the compiler bridge
implementation, which will hopefully be kept more up to date than the
one in Zinc.
Solution
--------
This switches to using the pre-compiled compiler bridge for Scala >= 2.13.12.
In addition to ExecuteProgress, this new interface allows builds and plugins to receive events when commands start and finish, including the State before and after each command. It also makes cancellation visible to clients by making the Cancelled type top-level.
Fixes https://github.com/sbt/sbt/issues/7327
**Problem**
In builds with mixed Scala patch versions (like scalameta),
it's possible for a core subproject to be set to the latest 2.12.x,
while the compiler plugin component is cross published against 2.12.0 etc.
`++ 2.12.0` in this case does not work, since sbt 1.7.x onwards requires
the queried Scala version to be listed in `crossScalaVersions`.
**Solution**
This implements an sbt 1.6.x-like fallback mechanism,
but instead of using the queried version (e.g. 2.12.0) it sets
the Scala version to one of the listed versions that is binary compatible.
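A hedged illustration of the fallback (version numbers are placeholders):

```scala
// build.sbt sketch: only the latest 2.12.x patch version is listed.
crossScalaVersions := Seq("2.12.18", "2.13.12")
```

With the fallback, `++ 2.12.0` no longer fails: sbt picks the binary-compatible 2.12.18 from the list instead of the queried 2.12.0.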
- remove unused type params
- use `withFilter` if possible
- use `collectFirst` instead of `collect` and `headOption`
- use `length` instead of `size` if `Array` or `String`
- use `foreach` instead of `map`
Problem
-------
`sbt new` (`sbt init`) hardcodes the templates, which I think is ok,
but without changing much, we can make it extensible.
Solution
--------
This adds two new keys, `templateDescriptions` and `templateRunLocal`,
which can be used to customize the behavior for in-house usage etc.
Problem
-------
sbt init menu doesn't pick the right template in the latter half.
Solution
--------
This fixes the mapping between the position and the letter.
If the terminal supports ANSI control sequence,
this displays the template list in an interactive way.
The focused template is rendered in reverse video,
and the arrow keys can be used to move the focus up and down.
The sbt-reproducible-build plugin fails if two artifacts point to the same file.
When packaging the artifacts of an sbt plugin,
we copy each file to avoid this issue.
**Problem**
You want to get started with sbt, but you don't know
which template to start with.
**Solution**
This implements an interactive menu for the `sbt new` command:
when invoked without an argument, it lists template candidates.
The first option is `scala/toolkit.local`, which locally creates
an sbt build without calling out to Giter8 (GitHub).
Expose what the incremental compiler is doing behind the scenes. The RunProfiler interface has been part of Zinc for a while, but this allows the build itself, or an sbt plugin, to hook in their own implementation.
We expose a list of such listeners to avoid plugins stepping on each other and replacing an existing listener.
This key was added in 4061dabf4d but it is only available to sbt itself. Since ExternalHooks is a Java interface, defined in Zinc for a while and fairly stable, I think this should be safe to do.
My main motivation is to allow installing an InvalidationProfiler from an sbt plugin, thus gaining access to Zinc's recompilation decisions. See related PR https://github.com/sbt/zinc/pull/1181
Artifactory rejects the legacy artifacts of sbt plugins.
It is now possible to publish to Artifactory
by turning `sbtPluginPublishLegacyMavenStyle` off.
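A minimal sketch, assuming this is a plain Boolean setting in the plugin's build.sbt:

```scala
// Skip the legacy (Ivy-style) sbt plugin artifacts, e.g. when publishing
// to Artifactory.
sbtPluginPublishLegacyMavenStyle := false
```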
Problem
-------
In sbt 1, platform cross building is implemented in user-land
using the `%%%` operator, which cleverly handles both Scala cross building
and appending a platform suffix like sjs1.
However, the symbolic `%%%` is confusing in general, and hard to explain.
Solution
--------
In sbt 2, we should subsume the idea of platform cross building,
so `%%` can act as the current `%%%` operator.
This adds a new setting called `platform`, which defaults to
`Platform.jvm`.
When a subproject sets it to `Platform.sjs1`, `ModuleID`s defined using
the `%%` operator will inject the platform suffix `_sjs1` **in addition**
to the Scala binary suffix `_2.13` etc.
Note: Explicit JVM dependencies will now require `.platform(Platform.jvm)`.
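A minimal sketch of what this looks like in a build (the library names and versions are placeholders):

```scala
// Scala.js subproject: with platform set, %% also appends the _sjs1 suffix.
platform := Platform.sjs1
libraryDependencies += "org.scala-js" %% "scalajs-dom" % "2.8.0"

// A JVM-only dependency has to opt out of the platform suffix explicitly.
libraryDependencies += ("com.example" %% "jvm-only-lib" % "1.0.0").platform(Platform.jvm)
```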
See https://eed3si9n.com/simplifying-sbt-with-common-settings/
Problem
-------
The behavior of bare settings is confusing in a multi-project build.
This is partly due to the fact that to use `ThisBuild` scoping
the build user needs to be aware of the task implementation,
and know if the task is already defined at project level.
Solution
--------
This changes the interpretation of bare settings to common
settings, which work similarly to the way `ThisBuild` behaves in sbt 1.x,
but since this is a simple append at the project level, it should
work for any task or setting.
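For illustration (a hypothetical multi-project build.sbt; the version is a placeholder), a bare setting at the top of the file is now appended to every subproject:

```scala
// Bare setting: under the new interpretation this is appended to both
// core and app, without any ThisBuild scoping.
scalaVersion := "3.3.1"

lazy val core = project
lazy val app  = project.dependsOn(core)
```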
For an sbt plugin, we publish two POM files, the legacy one, and the
new Maven compatible one. The name of the new POM file contains the sbt
cross-version _2.12_1.0. The format of the new POM file is also slightly
different, because we append the sbt cross-version to all artifactIds of
sbt plugins. Hence Maven can resolve the new sbt plugin POM and its
dependencies.
When resolving an sbt plugin, we first try to resolve the new Maven POM
and, if that fails, we fall back to the legacy one. When parsing the new POM
format, we remove the sbt cross-version from all artifact IDs so that
there is no mismatch between the old and new formats of dependencies.
Java diagnostics don't have a pointer, but we should report them anyway.
We copied the implementation from Bloop to translate the position of an
xsbti.Problem to a BSP range.
Problem
-------
Given a user-defined `val root`, currently both `root` and the synthetic root get loaded.
This might be caused by build.sbt being virtualized, and no longer
matching the build root directory.
Solution
--------
For now, comparing the canonical paths seems to fix the issue.
Fixes https://github.com/sbt/sbt/issues/7118
Problem
-------
sbtn 1.8.1 was built using ubuntu-latest, which meant picking up a newer glibc.
Solution
--------
This downgrades the Ubuntu machine used to build sbtn.
I believe it was just an oversight that this is not marked as true, since
sbt can handle `debugSession/start`. This change just ensures that
during the initialization process sbt says it's a `debugProvider` for
the same languages as `runProvider` and `testProvider`. It also
correctly marks the build targets as `canDebug`, unless they are sbt
targets.
Problem
-------
Fixes https://github.com/sbt/sbt/issues/7013
In a shared environment, multiple users will try to create
`/tmp/.sbt/` directory, and fail.
Solution
--------
Append a deterministic hash like `/tmp/.sbt1234ABCD` based
on the user home, so the same directory is used for the given
user, but each user would have a unique runtime directory.
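A minimal sketch of the idea, not sbt's exact code:

```scala
import java.security.MessageDigest

// Derive a short, deterministic suffix from the user's home directory so each
// user gets a stable but distinct /tmp/.sbt<suffix> runtime directory.
def sbtRuntimeDir(userHome: String): String = {
  val digest = MessageDigest.getInstance("SHA-1").digest(userHome.getBytes("UTF-8"))
  val suffix = digest.take(4).map("%02X".format(_)).mkString
  s"/tmp/.sbt$suffix"
}
```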
Before: `remote cache not found for 0.0.0-7c40144bd1c774e6`
After: `remote cache artifact not found for org.gontard:sbt-test:0.0.0-7c40144bd1c774e6 Some(cached-compile)`
This change is a continuation of the work that was done in
https://github.com/sbt/sbt/pull/6874 to allow the Scala 3 compiler to
forward the `diagnosticCode` of an error as well as the other normal
things. This change in dotty was merged in
https://github.com/lampepfl/dotty/pull/15565 and also backported so it
will be in the 3.2.x series release. This change captures the
`diagnosticCode` and forwards it on via BSP so that tools like Metals
can use this code.
You can now see this in the BSP trace for a diagnostic.
For example, with some code that contains the following:
```scala
val x: Int = "hi"
```
You'll see the code in the BSP diagnostic:
``` "diagnostics": [
{
"range": {
"start": {
"line": 9,
"character": 15
},
"end": {
"line": 9,
"character": 19
}
},
"severity": 1,
"code": "7",
"source": "sbt",
"message": "Found: (\u001b[32m\"hi\"\u001b[0m : String)\nRequired: Int\n\nThe following import might make progress towards fixing the problem:\n\n import sourcecode.Text.generate\n\n"
}
]
```
Co-authored-by: Adrien Piquerez <adrien.piquerez@gmail.com>
Refs: https://github.com/lampepfl/dotty/issues/14904
If we use the ProxyTerminal in the background jobs, the logs
would be spread across different terminals, switching from active
client to active client. We want the logs to stick
to the client that started the job.
A new context is created and closed for each state of the MainLoop.
But the context of the backgroundJob must stay alive.
So we use a context that is owned by the BackgroundJobService.
It creates a new logger for each background job and cleans it when
the job stops.
Problem
-------
Since the sbt-doge merger, `++ <sv> <command1>` has used binary compatibility
as a test to select subprojects, but this causes surprising situations like
sbt/sbt#6915, and it blurs the responsibility between the YAML file and the build
file, as the version specified in the YAML file can override the Scala
version tested on a local laptop.
Solution
--------
This removes the compatibility check (backward-only or otherwise),
and requires that `<sv>` match one of `crossScalaVersions` using the new
semantic version selector pattern.
Suggested by @SethTisue in https://github.com/sbt/sbt/pull/6894#issuecomment-1168042209
Will show multiple lines when different versions are selected for different subprojects.
When no subprojects have a matching Scala version you will get a
'Switch failed' exception anyway, so in that case there is no
change in behavior.
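As a hedged sketch of the new matching (the selector syntax and versions here are assumptions, not taken from this change):

```scala
// build.sbt sketch: each subproject lists the Scala versions it supports.
lazy val core = project.settings(crossScalaVersions := Seq("2.12.18", "2.13.12"))
lazy val util = project.settings(crossScalaVersions := Seq("2.13.12"))
```

A selector such as `++ 2.13.x compile` would then switch both subprojects to 2.13.12, while a selector that matches nothing fails with the 'Switch failed' exception mentioned above.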
Problem
-------
There's a bug in ipcsocket cleanup logic that effectively
tries to wipe out /tmp.
Workaround
----------
We should fix the underlying bug, but we can start by
explicitly configuring the temp directories ipcsocket uses.
Ref https://github.com/sbt/sbt/issues/6931
This removes the OkHttp dependency.
This also removes an experimental feature to customize HTTP for Ivy called
CustomHttp, which we added in sbt 1.3.0.
Since LM got switched to Coursier in 1.3.0, I don't think we advertised
CustomHttp.