Before, we were not preserving the value of `insideXXX`. This commit makes
sure that we handle much more complex scenarios and report them
correctly. Have a look at the tests.
The missing `.value` analysis is intentionally simple because a thorough one
would be expensive. Detecting all valid use cases of idents whose type is an
sbt key is difficult and dangerous because we may miss some corner cases.
Instead, we report only the easiest cases, in which we are certain that the
user does not want a stale key reference: idents on the rhs of val
definitions named `_`, and idents in statement position inside blocks.
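For illustration, here is a minimal build.sbt-style sketch of the two patterns the analysis reports (the key `foo` and the task `bar` are made-up names):
```
val foo = taskKey[String]("an example task key")

val bar = Def.task {
  val _ = foo   // reported: a key bound to `_`, almost certainly meant `foo.value`
  foo           // reported: a key ident in statement position, its result is discarded
  foo.value     // fine: the task's result is actually used
}
```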
In the future, we can cover all val definitions no matter what their
name is. Unfortunately, doing so will come at the cost of speed: we would
have to run the unused name analysis in `TypeDiagnostics` and expose it from
the power context in `ContextUtil`.
This is good enough for now. If users express interest in having a
smarter analysis, we can always consider trying the unused name
analysis. I am not sure how slow it will be -- hopefully it won't be
that much.
This ports sbt-cross-building's cross (`^`) and switch (`^^`) commands.
Instead of making it a plugin, the default settings are now changed
to use `sbtVersion in pluginCrossBuild` for the sbt dependency.
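As a rough sketch (the plugin name and versions below are made up), a cross-built plugin build now looks something like this; `^^ 0.13.16` switches `sbtVersion in pluginCrossBuild`, and `^ publishLocal` runs the command once per version in `crossSbtVersions`:
```
lazy val myPlugin = (project in file("."))
  .settings(
    sbtPlugin := true,
    name := "my-sbt-plugin",
    crossSbtVersions := Seq("0.13.16", "1.0.0")
  )
```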
In sbt 0.13.15, in addition to notifying the user about the existence of
sbt's shell, a feature was added to allow the user to switch to sbt's
shell - a more pro-active approach than just displaying a message.
Unfortunately, sbt is often unintentionally invoked in shell scripts in
"interactive mode" when no interaction is expected, for example by
invoking `sbt package` instead of `sbt package < /dev/null`. In that
case hitting [ENTER] would silently trigger sbt to run its shell,
easily wrecking the script. In addition to that, I was unhappy with the
implementation, as it created a tight coupling between sbt's command
processing abstraction and sbt's shell command.
If you want to stay in sbt's shell after running a task like `package`
then invoke sbt like so:
sbt package shell
Fixes #3091
This is a change in strategy.
The motivation is the need to find a good balance between:
+ informing the uninformed who would benefit from this information, &
+ not spamming the already informed
Making it dependent on "compile" being present in remainingCommands will
probably make it trigger for, for example, Maven users who are used to
running "mvn compile" and always run "sbt compile", and who therefore
unnecessarily suffer terribly slow compile speeds by starting up
the JVM and sbt every time.
Fixes #3091, fixes #3097
This commit does the following things:
* Removes the boolean from the instance context passed to the linter.
* Prohibits the use of `.value` inside anonymous functions.
* Improves the previous check of `.value` inside if expressions.
The improvements were made possible by the fix of an oversight in the
traverser. As a result, several implementations of tasks have been
rewritten because of new compilation failures from both checks.
Note that the new check that prohibits the use of `.value` inside anonymous
functions ignores all functions whose parameters have been
synthesized by scalac (which can happen in a number of different
scenarios, like for comprehensions). Other scripted tests have also been
fixed.
Running `.value` inside an anonymous function yields the following
error:
```
[error] /data/rw/code/scala/sbt/main-settings/src/test/scala/sbt/std/TaskPosSpec.scala:50:24: The evaluation of `foo` inside an anonymous function is prohibited.
[error]
[error] Problem: Task invocations inside anonymous functions are evaluated independently of whether the anonymous function is invoked or not.
[error]
[error] Solution:
[error] 1. Make `foo` evaluation explicit outside of the function body if you don't care about its evaluation.
[error] 2. Use a dynamic task to evaluate `foo` and pass that value as a parameter to an anonymous function.
[error]
[error] val anon = () => foo.value + " "
[error] ^
```
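As an illustration of the check and the workaround it suggests (the key `foo` and the vals are made up):
```
val foo = taskKey[String]("example task")

// Rejected by the check: `foo.value` sits inside the function body, yet it is
// evaluated whether or not the function is ever applied.
// val anonNeg = Def.task { () => foo.value + " " }

// Accepted: evaluate the task outside the function and close over the result.
val anonPos = Def.task {
  val v = foo.value
  () => v + " "
}
```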
`.value` inside the if of a regular task is unsafe. The wrapping task
will always evaluate the value, no matter what the if predicate yields.
This commit adds the infrastructure to lint code for every sbt DSL
macro. It also adds examples of neg tests that check that the DSL checks
are in place.
The sbt checks yield an error for this specific case because we may want to
explore changing this behaviour in the future. The solutions to this are
straightforward and explained in the error message, which looks like
this:
```
EXPECTED: The evaluation of `fooNeg` happens always inside a regular task.
PROBLEM: `fooNeg` is inside the if expression of a regular task.
Regular tasks always evaluate task inside the bodies of if expressions.
SOLUTION:
1. If you only want to evaluate it when the if predicate is true, use a dynamic task.
2. Otherwise, make the static evaluation explicit by evaluating `fooNeg` outside the if expression.
```
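As a rough sketch of the first solution (all keys below are made up), a dynamic task defers the evaluation to the branch that is actually taken:
```
val fooNeg = taskKey[String]("example task")
val cond = settingKey[Boolean]("example predicate")
val result = taskKey[String]("wrapping task")

// `fooNeg` is only evaluated when the predicate is true.
result := Def.taskDyn {
  if (cond.value) Def.task(fooNeg.value)
  else Def.task("default")
}.value
```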
Aside from those solutions, this commit also adds a way to disable any
DSL check by using the new `sbt.unchecked` annotation. This annotation,
similar to `scala.annotation.unchecked`, disables compiler output. In our
case, it will disable any task DSL check, making it silent.
Examples of positive checks have also been added.
There have been only two places in `Defaults.scala` where this check has
made compilation fail.
The first one is inside `allDependencies`. To ensure that we still have
static dependencies for `allDependencies`, I have hoisted the `.value`
invocation out of the if expression. We may want to explore adding a
dynamic task in the future, though, since we are doing unnecessary work there.
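Roughly, the hoist looks like this (the keys below are made-up stand-ins for the real ones in `Defaults.scala`):
```
val extraDeps = taskKey[Seq[ModuleID]]("extra dependencies")
val allDeps = taskKey[Seq[ModuleID]]("all dependencies")
val includeExtras = settingKey[Boolean]("whether to include the extras")

allDeps := {
  val extras = extraDeps.value   // hoisted: still evaluated unconditionally, but explicitly so
  val include = includeExtras.value
  if (include) extras else Nil
}
```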
The second one is inside `update` and is not important because it's not
exposed to the user. We use a `taskDyn`.
This commit adds the first version of the checker that will tell users
if they are doing something wrong.
This first version warns any user who writes `.value` inside an if
expression.
As the test integration is not yet working correctly and messages are
swallowed, we have to use println to get info from the test.
The previous scalafmt plugin had two problems:
* Caching with Coursier did not work correctly
* It failed after the upgrade to 1.0 on my computer (from a clean fork):
```
[error] (testingProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] (runProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] (taskProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] (stdTaskProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] (actionsProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] (protocolProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] (commandProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] (mainSettingsProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] (mainProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] (sbtProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] (scriptedPluginProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] (scriptedSbtProj/compile:scalafmtInc) java.lang.NoClassDefFoundError: Could not initialize class scala.meta.package$
[error] Total time: 19 s, completed May 24, 2017 10:44:56 AM
```
This commit replaces the previous scalafmt integration with the one
created by Lucidsoftware, big shoutout! (/cc @pauldraper)
https://github.com/lucidsoftware/neo-sbt-scalafmt
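For reference, enabling the new integration should look roughly like the following in project/plugins.sbt; the exact coordinates and version are assumptions, so check the project's README for the current ones:
```
addSbtPlugin("com.lucidchart" % "sbt-scalafmt" % "1.10")
```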
This commit moves everything behind the old util-appmacro over to
sbt/sbt.
appmacro has been renamed to core-macros. It is indeed the project
that defines the main logic to manipulate macros, and it defines interfaces
that are then implemented by main-settings. Because of this strong
relationship between the two modules, I find that they have to stay
together and be modifiable in one commit. I realised this
while hacking on the macro changes for the usability of sbt.
The reason for this change is that the logic of this module is tightly
tied to the implementation of sbt and, I would even say, essentially
defines the sbt DSL.