Merge remote-tracking branch 'remotesbt/0.13' into 0.13

This commit is contained in:
Matej Urbas 2014-04-18 13:48:19 +01:00
commit 186c4c0db4
57 changed files with 932 additions and 221 deletions

View File

@ -1,36 +1,99 @@
[Setup]: http://www.scala-sbt.org/release/docs/Getting-Started/Setup
[sbt-dev]: https://groups.google.com/d/forum/sbt-dev
[StackOverflow]: http://stackoverflow.com/tags/sbt
[Setup]: http://www.scala-sbt.org/release/docs/Getting-Started/Setup
[Issues]: https://github.com/sbt/sbt/issues
[sbt-dev]: https://groups.google.com/d/forum/sbt-dev
[subscriptions]: http://typesafe.com/how/subscription
[327]: https://github.com/sbt/sbt/issues/327
# Issues and Pull Requests
Issues and Pull Requests
========================
## New issues
When you find a bug in sbt we want to hear about it. Your bug reports play an important part in making sbt more reliable.
Please use the issue tracker to report confirmed bugs.
Do not use it to ask questions.
If you are uncertain whether something is a bug, please ask on StackOverflow or the sbt-dev mailing list first.
Effective bug reports are more likely to be fixed. These guidelines explain how to write such reports and pull requests.
When opening a new issue,
Preliminaries
--------------
* Please state the problem clearly and provide enough context.
+ Code examples and build transcripts are often useful when appropriately edited.
+ Show error messages and stack traces if appropriate.
* Minimize the problem to reduce non-essential factors, such as dependencies or special environments.
* Include all relevant information needed to reproduce, such as the version of sbt and Scala being used.
- Make sure your sbt version is up to date.
- Search [StackOverflow] and [Issues] to see whether your bug has already been reported.
- Open one case for each problem.
- Proceed to the next steps for details.
Where to get help and/or file a bug report
------------------------------------------
The sbt project uses GitHub issues as a publicly visible todo list. Please open a GitHub issue only when asked to do so.
- If you need help with sbt, please ask on [StackOverflow] with the tag "sbt" and the name of the sbt plugin if any.
- If you run into an issue, have an enhancement idea, or want to start a general discussion, bring it up on the [sbt-dev] Google Group first.
- If you need faster response time, consider one of the [Typesafe subscriptions][subscriptions].
What to report
--------------
The developers need three things from you: **steps**, **problems**, and **expectations**.
### Steps
The most important thing to remember about bug reporting is to clearly distinguish facts from opinions. What we need first is **the exact steps to reproduce your problems on our computers**. This is called *reproduction steps*, often shortened to "repro steps" or "steps." Describe your method of running sbt. Provide the `build.sbt` that caused the problem and the versions of sbt and Scala that were used. Provide sample Scala code if the problem relates to incremental compilation. If possible, minimize the problem to reduce non-essential factors.
Repro steps are the most important part of a bug report. If we cannot reproduce the problem one way or another, it can't be fixed. Reporting only the error messages is not enough.
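For illustration only, a hypothetical minimal reproduction can be as small as a one-setting `build.sbt`, plus the exact sbt version and the command that was run:
```scala
// build.sbt of a hypothetical minimal reproduction
name := "repro"

scalaVersion := "2.10.4"

// Also state the sbt version (project/build.properties) and the exact
// command that triggers the problem, for example `sbt compile`.
```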
### Problems
Next, describe the problems, or what *you think* is the problem. It might be "obvious" to you that it's a problem, but it could actually be intentional behavior, kept for backward compatibility for example. For compilation errors, include the stack trace. The more raw information, the better.
### Expectations
As with the problems, describe what *you think* should have happened.
### Notes
Add an optional notes section to describe your analysis.
### Subject
The subject of the bug report doesn't matter. A more descriptive subject is certainly better, but a good subject really depends on the analysis of the problem, so don't worry too much about it. "StackOverflowError while name hashing is enabled" is good enough.
### Formatting
If possible, please format code or console outputs.
On GitHub, it's:
```scala
name := "foo"
```
On StackOverflow, it's:
```
<!-- language: lang-scala -->
name := "foo"
```
Here's a simple sample case: [#327][327].
Finally, thank you for taking the time to report a problem.
## Pull Requests
Pull Requests
-------------
Whether implementing a new feature, fixing a bug, or modifying documentation, please work against the latest development branch (currently, 0.13).
Binary compatible changes will be backported to a previous series (currently, 0.12.x) at the time of the next stable release.
See below for instructions on building sbt from source.
## Documentation
Documentation
-------------
Documentation fixes and contributions are welcome.
They are made via pull requests, as described in the previous section.
See below for details on getting sbt sources and modifying the documentation.
# Build from source
Build from source
=================
1. Install the current stable binary release of sbt (see [Setup]), which will be used to build sbt from source.
2. Get the source code.
@ -73,7 +136,8 @@ See below for details on getting sbt sources and modifying the documentation.
4. If a project has `project/build.properties` defined, either delete the file or change `sbt.version` to `0.13.2-SNAPSHOT`.
## Building Documentation
Building Documentation
----------------------
The scala-sbt.org site documentation is built using Sphinx and requires some external packages to be manually installed first:

View File

@ -1,4 +1,4 @@
Copyright (c) 2008, 2009, 2010 Steven Blundy, Josh Cough, Mark Harrah, Stuart Roebuck, Tony Sloane, Vesa Vilhonen, Jason Zaugg
Copyright (c) 2008-2014 Typesafe Inc, Mark Harrah, Grzegorz Kossakowski, Josh Suereth, Indrajit Raychaudhuri, Eugene Yokota, and other contributors.
All rights reserved.
Redistribution and use in source and binary forms, with or without

NOTICE
View File

@ -1,5 +1,5 @@
Simple Build Tool
Copyright 2008, 2009, 2010 Mark Harrah, Jason Zaugg
sbt
Copyright (c) 2008-2014 Typesafe Inc, Mark Harrah, Grzegorz Kossakowski, Josh Suereth, Indrajit Raychaudhuri, Eugene Yokota, and other contributors.
Licensed under BSD-style license (see LICENSE)
Portions based on code from the Scala compiler. Portions of the Scala

View File

@ -1,13 +1,33 @@
[Google Code]: http://code.google.com/p/simple-build-tool
[CONTRIBUTING]: https://github.com/sbt/sbt/blob/0.13/CONTRIBUTING.md
[Setup]: http://www.scala-sbt.org/release/docs/Getting-Started/Setup
[FAQ]: http://www.scala-sbt.org/release/docs/faq
[Google Code]: http://code.google.com/p/simple-build-tool
[CONTRIBUTING]: CONTRIBUTING.md
[Setup]: http://www.scala-sbt.org/release/docs/Getting-Started/Setup
[FAQ]: http://www.scala-sbt.org/release/docs/faq
[sbt-dev]: https://groups.google.com/d/forum/sbt-dev
[StackOverflow]: http://stackoverflow.com/tags/sbt
[LICENSE]: LICENSE
# sbt 0.13
sbt
===
sbt is a build tool for Scala, Java, and more.
For general documentation, see http://www.scala-sbt.org/.
Issues and Pull Requests
------------------------
Please read [CONTRIBUTING] carefully before opening a GitHub Issue.
The short version: try [StackOverflow] and [sbt-dev]. Don't open an Issue.
sbt 0.13
--------
This is the 0.13.x series of sbt.
* [Setup]: Describes getting started with the latest binary release.
* See [CONTRIBUTING] for how to build from source, open an issue, fix or add documentation, or submit a pull request.
* [FAQ]: Explains how to get help and more.
* [Google Code]: hosts sbt 0.7.7 and earlier versions
* [Google Code]: hosts sbt 0.7.7 and earlier versions
license
-------
See [LICENSE].

View File

@ -1,4 +1,4 @@
package sbt.inc
package xsbt.api
import xsbti.api.SourceAPI
import xsbti.api.Definition
@ -6,7 +6,10 @@ import xsbti.api.DefinitionType
import xsbti.api.ClassLike
import xsbti.api._internalOnly_NameHash
import xsbti.api._internalOnly_NameHashes
import xsbt.api.Visit
import xsbti.api.DefinitionType.ClassDef
import xsbti.api.DefinitionType.Module
import xsbti.api.DefinitionType.PackageModule
import xsbti.api.DefinitionType.Trait
/**
* A class that computes hashes for each group of definitions grouped by a simple name.

View File

@ -1,8 +1,7 @@
package sbt.inc
package xsbt.api
import org.junit.runner.RunWith
import xsbti.api._
import xsbt.api.HashAPI
import org.specs2.mutable.Specification
import org.specs2.runner.JUnitRunner

View File

@ -30,7 +30,18 @@ trait Analysis
/** Mappings between sources, classes, and binaries. */
val relations: Relations
val infos: SourceInfos
/** Information about compiler runs accumulated since `clean` command has been run. */
/**
* Information about compiler runs accumulated since the `clean` command was last run.
*
* The main use case for the `compilations` field is to determine how
* many iterations it took to compile the given code. The `Compilation` objects
* are also stored in `Source` objects, so there's an indirect way to recover
* information about the files being recompiled in every iteration.
*
* The incremental compilation algorithm doesn't use the information stored in
* `compilations`. It's safe to prune the contents of that field without breaking
* the internal consistency of the entire Analysis object.
*/
val compilations: Compilations
/** Concatenates Analysis objects naively, i.e., doesn't internalize external deps on added files. See `Analysis.merge`. */
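As a rough sketch of the use case described in the `compilations` documentation above (assuming `Compilations` exposes its entries via an `allCompilations` sequence), counting iterations from a loaded `Analysis` could look like:
```scala
// Sketch: count how many compiler iterations an Analysis recorded.
// Assumes Compilations exposes its entries via `allCompilations`.
def iterationCount(analysis: sbt.inc.Analysis): Int =
  analysis.compilations.allCompilations.size
```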

View File

@ -13,7 +13,7 @@ trait ClassfileManager
* Any empty ancestor directories of deleted files must not exist either.*/
def delete(classes: Iterable[File]): Unit
/** Called once per compilation step with the class files generated during that step.*/
/** Called once per compilation step with the class files generated during that step.*/
def generated(classes: Iterable[File]): Unit
/** Called once at the end of the whole compilation run, with `success` indicating whether compilation succeeded (true) or not (false).*/
@ -29,29 +29,45 @@ object ClassfileManager
def generated(classes: Iterable[File]) {}
def complete(success: Boolean) {}
}
@deprecated("Use overloaded variant that takes additional logger argument, instead.", "0.13.5")
def transactional(tempDir0: File): () => ClassfileManager =
transactional(tempDir0, sbt.Logger.Null)
/** When compilation fails, this ClassfileManager restores class files to the way they were before compilation.*/
def transactional(tempDir0: File): () => ClassfileManager = () => new ClassfileManager
def transactional(tempDir0: File, logger: sbt.Logger): () => ClassfileManager = () => new ClassfileManager
{
val tempDir = tempDir0.getCanonicalFile
IO.delete(tempDir)
IO.createDirectory(tempDir)
logger.debug(s"Created transactional ClassfileManager with tempDir = $tempDir")
private[this] val generatedClasses = new mutable.HashSet[File]
private[this] val movedClasses = new mutable.HashMap[File, File]
private def showFiles(files: Iterable[File]): String = files.map(f => s"\t$f").mkString("\n")
def delete(classes: Iterable[File])
{
for(c <- classes) if(c.exists && !movedClasses.contains(c) && !generatedClasses(c))
logger.debug(s"About to delete class files:\n${showFiles(classes)}")
val toBeBackedUp = classes.filter(c => c.exists && !movedClasses.contains(c) && !generatedClasses(c))
logger.debug(s"Backing up class files:\n${showFiles(toBeBackedUp)}")
for(c <- toBeBackedUp) {
movedClasses.put(c, move(c))
}
IO.deleteFilesEmptyDirs(classes)
}
def generated(classes: Iterable[File]): Unit = generatedClasses ++= classes
def generated(classes: Iterable[File]): Unit = {
logger.debug(s"Registering generated classes:\n${showFiles(classes)}")
generatedClasses ++= classes
}
def complete(success: Boolean)
{
if(!success) {
logger.debug("Rolling back changes to class files.")
logger.debug(s"Removing generated classes:\n${showFiles(generatedClasses)}")
IO.deleteFilesEmptyDirs(generatedClasses)
logger.debug(s"Restoring class files: \n${showFiles(movedClasses.map(_._1))}")
for( (orig, tmp) <- movedClasses ) IO.move(tmp, orig)
}
logger.debug(s"Removing the temporary directory used for backing up class files: $tempDir")
IO.delete(tempDir)
}
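The deprecation messages above point at the intended wiring; a minimal sketch of installing the transactional manager through `IncOptions` (the backup directory below is an illustrative choice) is:
```scala
// Sketch: wire the transactional ClassfileManager into IncOptions.
import sbt.inc.{ ClassfileManager, IncOptions }

val incOpts: IncOptions =
  IncOptions.Default.withNewClassfileManager(
    ClassfileManager.transactional(new java.io.File("target/classes.bak"), sbt.Logger.Null))
```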

View File

@ -154,7 +154,7 @@ private final class AnalysisCallback(internalMap: File => Option[File], external
if (APIUtil.isScalaSourceName(sourceFile.getName) && APIUtil.hasMacro(source)) macroSources += sourceFile
publicNameHashes(sourceFile) = {
if (nameHashing)
(new NameHashing).nameHashes(source)
(new xsbt.api.NameHashing).nameHashes(source)
else
emptyNameHashes
}

View File

@ -234,9 +234,12 @@ object IncOptions extends Serializable {
private def readResolve(): Object = IncOptions
//- EXPANDED CASE CLASS METHOD END -//
def defaultTransactional(tempDir: File): IncOptions = setTransactional(Default, tempDir)
@deprecated("Use IncOptions.Default.withNewClassfileManager(ClassfileManager.transactional(tempDir)), instead.", "0.13.5")
def defaultTransactional(tempDir: File): IncOptions =
setTransactional(Default, tempDir)
@deprecated("Use opts.withNewClassfileManager(ClassfileManager.transactional(tempDir)), instead.", "0.13.5")
def setTransactional(opts: IncOptions, tempDir: File): IncOptions =
opts.copy(newClassfileManager = ClassfileManager.transactional(tempDir))
opts.withNewClassfileManager(ClassfileManager.transactional(tempDir, sbt.Logger.Null))
private val transitiveStepKey = "transitiveStep"
private val recompileAllFractionKey = "recompileAllFraction"

View File

@ -146,7 +146,13 @@ final class Dependency(val global: CallbackGlobal) extends LocateClassFile
deps.foreach(addDependency)
case Template(parents, self, body) =>
traverseTrees(body)
case MacroExpansionOf(original) =>
/*
* Some macros appear to contain themselves as their original tree.
* In this case, we don't need to inspect the original tree because
* we already inspected its expansion, which is equal.
* See https://issues.scala-lang.org/browse/SI-8486
*/
case MacroExpansionOf(original) if original != tree =>
this.traverse(original)
case other => ()
}

View File

@ -55,7 +55,14 @@ class ExtractUsedNames[GlobalType <: CallbackGlobal](val global: GlobalType) ext
}
def handleTreeNode(node: Tree): Unit = {
def handleMacroExpansion(original: Tree): Unit = original.foreach(handleTreeNode)
def handleMacroExpansion(original: Tree): Unit = {
// Some macros seem to have themselves registered as their original tree.
// In this case, we only need to handle the children of the original tree,
// because we already handled the expanded tree.
// See https://issues.scala-lang.org/browse/SI-8486
if(original == node) original.children.foreach(handleTreeNode)
else original.foreach(handleTreeNode)
}
def handleClassicTreeNode(node: Tree): Unit = node match {
case _: DefTree | _: Template => ()

View File

@ -10,13 +10,91 @@ import core.module.id.ModuleRevisionId
import core.module.descriptor.DependencyDescriptor
import core.resolve.ResolveData
import core.settings.IvySettings
import plugins.resolver.{BasicResolver, DependencyResolver, IBiblioResolver}
import plugins.resolver.{BasicResolver, DependencyResolver, IBiblioResolver, RepositoryResolver}
import plugins.resolver.{AbstractPatternsBasedResolver, AbstractSshBasedResolver, FileSystemResolver, SFTPResolver, SshResolver, URLResolver}
import plugins.repository.url.{URLRepository => URLRepo}
import plugins.repository.file.{FileRepository => FileRepo, FileResource}
import java.io.File
import org.apache.ivy.util.ChecksumHelper
import org.apache.ivy.core.module.descriptor.{Artifact=>IArtifact}
private object ConvertResolver
{
/** This class contains all the reflective lookups used in the
* checksum-friendly URL publishing shim.
*/
private object ChecksumFriendlyURLResolver {
// TODO - When we dump JDK6 support we can remove this hackery
// import java.lang.reflect.AccessibleObject
type AccessibleObject = {
def setAccessible(value: Boolean): Unit
}
private def reflectiveLookup[A <: AccessibleObject](f: Class[_] => A): Option[A] =
try {
val cls = classOf[RepositoryResolver]
val thing = f(cls)
import scala.language.reflectiveCalls
thing.setAccessible(true)
Some(thing)
} catch {
case (_: java.lang.NoSuchFieldException) |
(_: java.lang.SecurityException) |
(_: java.lang.NoSuchMethodException) => None
}
private val signerNameField: Option[java.lang.reflect.Field] =
reflectiveLookup(_.getDeclaredField("signerName"))
private val putChecksumMethod: Option[java.lang.reflect.Method] =
reflectiveLookup(_.getDeclaredMethod("putChecksum",
classOf[IArtifact], classOf[File], classOf[String],
classOf[Boolean], classOf[String]))
private val putSignatureMethod: Option[java.lang.reflect.Method] =
reflectiveLookup(_.getDeclaredMethod("putSignature",
classOf[IArtifact], classOf[File], classOf[String],
classOf[Boolean]))
}
/**
* The default behavior of ivy's overwrite flags ignores the fact that a lot of repositories
* will autogenerate checksums *for* an artifact if it doesn't already exist. Therefore
* if we succeed in publishing an artifact, we need to just blast the checksums in place.
* This acts as a "shim" on RepositoryResolvers so that we can hook our methods into
* both the IBiblioResolver + URLResolver without having to duplicate the code in two
* places. However, this does mean our use of reflection is awesome.
*
* TODO - See about contributing back to ivy.
*/
private trait ChecksumFriendlyURLResolver extends RepositoryResolver {
import ChecksumFriendlyURLResolver._
private def signerName: String = signerNameField match {
case Some(field) => field.get(this).asInstanceOf[String]
case None => null
}
override protected def put(artifact: IArtifact, src: File, dest: String, overwrite: Boolean): Unit = {
// verify the checksum algorithms before uploading artifacts!
val checksums = getChecksumAlgorithms()
val repository = getRepository()
for {
checksum <- checksums
if !ChecksumHelper.isKnownAlgorithm(checksum)
} throw new IllegalArgumentException("Unknown checksum algorithm: " + checksum)
repository.put(artifact, src, dest, overwrite);
// Fix for sbt#1156 - Artifactory will auto-generate MD5/sha1 files, so
// we need to overwrite what it has.
for (checksum <- checksums) {
putChecksumMethod match {
case Some(method) => method.invoke(this, artifact, src, dest, true: java.lang.Boolean, checksum)
case None => // TODO - issue warning?
}
}
if (signerName != null) {
putSignatureMethod match {
case None => ()
case Some(method) => method.invoke(this, artifact, src, dest, true: java.lang.Boolean)
}
}
}
}
/** Converts the given sbt resolver into an Ivy resolver.*/
def apply(r: Resolver, settings: IvySettings, log: Logger) =
{
@ -25,7 +103,7 @@ private object ConvertResolver
case repo: MavenRepository =>
{
val pattern = Collections.singletonList(Resolver.resolvePattern(repo.root, Resolver.mavenStyleBasePattern))
final class PluginCapableResolver extends IBiblioResolver with DescriptorRequired {
final class PluginCapableResolver extends IBiblioResolver with ChecksumFriendlyURLResolver with DescriptorRequired {
def setPatterns() { // done this way for access to protected methods.
setArtifactPatterns(pattern)
setIvyPatterns(pattern)
@ -77,7 +155,7 @@ private object ConvertResolver
}
case repo: URLRepository =>
{
val resolver = new URLResolver with DescriptorRequired
val resolver = new URLResolver with ChecksumFriendlyURLResolver with DescriptorRequired
resolver.setName(repo.name)
initializePatterns(resolver, repo.patterns, settings)
resolver

View File

@ -7,7 +7,6 @@ import Resolver.PluginPattern
import java.io.File
import java.net.URI
import java.text.ParseException
import java.util.concurrent.Callable
import java.util.{Collection, Collections => CS}
import CS.singleton
@ -24,9 +23,7 @@ import core.settings.IvySettings
import plugins.latest.LatestRevisionStrategy
import plugins.matcher.PatternMatcher
import plugins.parser.m2.PomModuleDescriptorParser
import plugins.repository.ResourceDownloader
import plugins.resolver.{ChainResolver, DependencyResolver}
import plugins.resolver.util.ResolvedResource
import util.{Message, MessageLogger}
import util.extendable.ExtendableItem
@ -358,41 +355,8 @@ private object IvySbt
case pr: ProjectResolver => true
case _ => false
}
/** This is overridden to delete outofdate artifacts of changing modules that are not listed in the metadata.
* This occurs for artifacts with classifiers, for example. */
@throws(classOf[ParseException])
override def cacheModuleDescriptor(resolver: DependencyResolver, mdRef: ResolvedResource, dd: DependencyDescriptor, moduleArtifact: IArtifact, downloader: ResourceDownloader, options: CacheMetadataOptions): ResolvedModuleRevision =
{
val rmrRaw = super.cacheModuleDescriptor(null, mdRef, dd, moduleArtifact, downloader, options)
val rmr = resetArtifactResolver(rmrRaw)
val mrid = moduleArtifact.getModuleRevisionId
def shouldClear(): Boolean = rmr != null &&
( (rmr.getReport != null && rmr.getReport.isSearched && isChanging(dd, mrid)) ||
isProjectResolver(rmr.getResolver) )
// only handle changing modules whose metadata actually changed.
// Typically, the publication date in the metadata has to change to get here.
if(shouldClear()) {
// this is the locally cached metadata as originally retrieved (e.g. the pom)
val original = rmr.getReport.getOriginalLocalFile
if(original != null) {
// delete all files in subdirectories that are older than the original metadata file's publication date
// The publication date is used because the metadata will be redownloaded for changing files,
// so the last modified time changes, but the publication date doesn't
val pubDate = rmrRaw.getPublicationDate
val lm = if(pubDate eq null) original.lastModified else pubDate.getTime
val indirectFiles = PathFinder(original.getParentFile).*(DirectoryFilter).**(-DirectoryFilter).get.toList
val older = indirectFiles.filter(f => f.lastModified < lm).toList
Message.verbose("Deleting additional old artifacts from cache for changed module " + mrid + older.mkString(":\n\t", "\n\t", ""))
IO.delete(older)
}
}
rmr
}
// ignore the original resolver wherever possible to avoid issues like #704
override def saveResolvers(descriptor: ModuleDescriptor, metadataResolverName: String, artifactResolverName: String) {}
def isChanging(dd: DependencyDescriptor, requestedRevisionId: ModuleRevisionId): Boolean =
!localOnly && (dd.isChanging || requestedRevisionId.getRevision.contains("-SNAPSHOT"))
}
manager.setArtifactPattern(PluginPattern + manager.getArtifactPattern)
manager.setDataFilePattern(PluginPattern + manager.getDataFilePattern)

View File

@ -31,7 +31,7 @@ private final class IvyLoggerInterface(logger: Logger) extends MessageLogger
def warn(msg: String) = logger.warn(msg)
def error(msg: String) = if(SbtIvyLogger.acceptError(msg)) logger.error(msg)
private def emptyList = java.util.Collections.emptyList[T forSome { type T}]
private def emptyList = java.util.Collections.emptyList[String]
def getProblems = emptyList
def getWarns = emptyList
def getErrors = emptyList

View File

@ -1,7 +1,9 @@
package xsbti;
/** A launched application returns an instance of this class in order to communicate to the launcher
* that the application is completely finished and the launcher should exit with the given exit code.*/
/**
* A launched application returns an instance of this class in order to communicate to the launcher
* that the application finished and the launcher should exit with the given exit code.
*/
public interface Exit extends MainResult
{
public int code();

View File

@ -1,8 +1,12 @@
package xsbti;
/** A launched application should return an instance of this from its 'run' method
* to communicate to the launcher what should be done now that the application
* has competed. This interface should be treated as 'sealed', with Exit and Reboot the only
* direct subtypes.
*/
/**
* A launched application should return an instance of this from its 'run' method
* to communicate to the launcher what should be done now that the application
* has completed. This interface should be treated as 'sealed', with Exit and Reboot the only
* direct subtypes.
*
* @see xsbti.Exit
* @see xsbti.Reboot
*/
public interface MainResult {}

View File

@ -2,9 +2,11 @@ package xsbti;
import java.io.File;
/** A launched application returns an instance of this class in order to communicate to the launcher
* that the application should be restarted. Different versions of the application and Scala can be used.
* The application can be given different arguments and a new working directory as well.*/
/**
* A launched application returns an instance of this class in order to communicate to the launcher
* that the application should be restarted. Different versions of the application and Scala can be used.
* The application can be given different arguments as well as a new working directory.
*/
public interface Reboot extends MainResult
{
public String[] arguments();

View File

@ -150,7 +150,12 @@ object ServerLauncher {
} finally {
errorDumper.close()
stdout.close()
stderr.close()
// Note: Closing this causes Windows to block waiting for the server
// to close, but it never will, as it is obstinate, and not designed
// to close immediately, unlike this process.
// We leave it open because this JVM should be shut down soon anyway,
// and that will clean up all this memory.
//stderr.close()
}
}

View File

@ -0,0 +1,159 @@
package sbt
import sbt.Tests.{Output, Summary}
/**
* Logs information about tests after they finish.
*
* Log output can be customised by providing a specialised instance of this
* trait via the `testResultLogger` setting.
*
* @since 0.13.5
*/
trait TestResultLogger {
/**
* Perform logging.
*
* @param log The target logger to write output to.
* @param results The test results about which to log.
* @param taskName The task about which we are logging, e.g. "my-module-b/test:test".
*/
def run(log: Logger, results: Output, taskName: String): Unit
/** Only allow invocation if certain criteria are met, else use another `TestResultLogger` (defaulting to nothing). */
final def onlyIf(f: (Output, String) => Boolean, otherwise: TestResultLogger = TestResultLogger.Null) =
TestResultLogger.choose(f, this, otherwise)
/** Allow invocation unless a certain predicate passes, in which case use another `TestResultLogger` (defaulting to nothing). */
final def unless(f: (Output, String) => Boolean, otherwise: TestResultLogger = TestResultLogger.Null) =
TestResultLogger.choose(f, otherwise, this)
}
object TestResultLogger {
/** A `TestResultLogger` that does nothing. */
val Null = const(_ => ())
/** SBT's default `TestResultLogger`. Use `copy()` to change selective portions. */
val Default = Defaults.Main()
/** Twist on the default which is completely silent when the subject module doesn't contain any tests. */
def SilentWhenNoTests = silenceWhenNoTests(Default)
/** Creates a `TestResultLogger` using a given function. */
def apply(f: (Logger, Output, String) => Unit): TestResultLogger =
new TestResultLogger {
override def run(log: Logger, results: Output, taskName: String) =
f(log, results, taskName)
}
/** Creates a `TestResultLogger` that ignores its input and always performs the same logging. */
def const(f: Logger => Unit) = apply((l,_,_) => f(l))
/**
* Selects a `TestResultLogger` based on a given predicate.
*
* @param t The `TestResultLogger` to choose if the predicate passes.
* @param f The `TestResultLogger` to choose if the predicate fails.
*/
def choose(cond: (Output, String) => Boolean, t: TestResultLogger, f: TestResultLogger) =
TestResultLogger((log, results, taskName) =>
(if (cond(results, taskName)) t else f).run(log, results, taskName))
/** Transforms the input to be completely silent when the subject module doesn't contain any tests. */
def silenceWhenNoTests(d: Defaults.Main) =
d.copy(
printStandard = d.printStandard.unless((results, _) => results.events.isEmpty),
printNoTests = Null
)
object Defaults {
/** SBT's default `TestResultLogger`. Use `copy()` to change selective portions. */
case class Main(
printStandard_? : Output => Boolean = Defaults.printStandard_?,
printSummary : TestResultLogger = Defaults.printSummary,
printStandard : TestResultLogger = Defaults.printStandard,
printFailures : TestResultLogger = Defaults.printFailures,
printNoTests : TestResultLogger = Defaults.printNoTests
) extends TestResultLogger {
override def run(log: Logger, results: Output, taskName: String): Unit = {
def run(r: TestResultLogger): Unit = r.run(log, results, taskName)
run(printSummary)
if (printStandard_?(results))
run(printStandard)
if (results.events.isEmpty)
run(printNoTests)
else
run(printFailures)
results.overall match {
case TestResult.Error | TestResult.Failed => throw new TestsFailedException
case TestResult.Passed =>
}
}
}
val printSummary = TestResultLogger((log, results, _) => {
val multipleFrameworks = results.summaries.size > 1
for (Summary(name, message) <- results.summaries)
if(message.isEmpty)
log.debug("Summary for " + name + " not available.")
else {
if(multipleFrameworks) log.info(name)
log.info(message)
}
})
val printStandard_? : Output => Boolean =
results =>
// Print the standard one-liner statistic if no framework summary is defined, or when > 1 framework is in use.
results.summaries.size > 1 || results.summaries.headOption.forall(_.summaryText.size == 0)
val printStandard = TestResultLogger((log, results, _) => {
val (skippedCount, errorsCount, passedCount, failuresCount, ignoredCount, canceledCount, pendingCount) =
results.events.foldLeft((0, 0, 0, 0, 0, 0, 0)) { case ((skippedAcc, errorAcc, passedAcc, failureAcc, ignoredAcc, canceledAcc, pendingAcc), (name, testEvent)) =>
(skippedAcc + testEvent.skippedCount, errorAcc + testEvent.errorCount, passedAcc + testEvent.passedCount, failureAcc + testEvent.failureCount,
ignoredAcc + testEvent.ignoredCount, canceledAcc + testEvent.canceledCount, pendingAcc + testEvent.pendingCount)
}
val totalCount = failuresCount + errorsCount + skippedCount + passedCount
val base = s"Total $totalCount, Failed $failuresCount, Errors $errorsCount, Passed $passedCount"
val otherCounts = Seq("Skipped" -> skippedCount, "Ignored" -> ignoredCount, "Canceled" -> canceledCount, "Pending" -> pendingCount)
val extra = otherCounts.filter(_._2 > 0).map{case(label,count) => s", $label $count" }
val postfix = base + extra.mkString
results.overall match {
case TestResult.Error => log.error("Error: " + postfix)
case TestResult.Passed => log.info("Passed: " + postfix)
case TestResult.Failed => log.error("Failed: " + postfix)
}
})
val printFailures = TestResultLogger((log, results, _) => {
def select(resultTpe: TestResult.Value) = results.events collect {
case (name, tpe) if tpe.result == resultTpe =>
scala.reflect.NameTransformer.decode(name)
}
def show(label: String, level: Level.Value, tests: Iterable[String]): Unit =
if (!tests.isEmpty) {
log.log(level, label)
log.log(level, tests.mkString("\t", "\n\t", ""))
}
show("Passed tests:", Level.Debug, select(TestResult.Passed))
show("Failed tests:", Level.Error, select(TestResult.Failed))
show("Error during tests:", Level.Error, select(TestResult.Error))
})
val printNoTests = TestResultLogger((log, results, taskName) =>
log.info("No tests to run for " + taskName)
)
}
}
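Builds that want different output can override the `testResultLogger` setting introduced by this change; a minimal `build.sbt` sketch (the custom message is illustrative):
```scala
// Either silence the "No tests to run" message, mirroring SilentWhenNoTests above...
testResultLogger in (Test, test) := TestResultLogger.SilentWhenNoTests

// ...or supply a custom logger; the message below is illustrative.
testResultLogger in (Test, test) := TestResultLogger { (log, results, taskName) =>
  log.info(s"Finished $taskName with overall result ${results.overall}")
}
```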

View File

@ -264,78 +264,10 @@ object Tests
(tests, mains.toSet)
}
@deprecated("Tests.showResults() has been superseded with TestResultLogger and setting 'testResultLogger'.", "0.13.5")
def showResults(log: Logger, results: Output, noTestsMessage: =>String): Unit =
{
val multipleFrameworks = results.summaries.size > 1
def printSummary(name: String, message: String)
{
if(message.isEmpty)
log.debug("Summary for " + name + " not available.")
else
{
if(multipleFrameworks) log.info(name)
log.info(message)
}
}
for (Summary(name, messages) <- results.summaries)
printSummary(name, messages)
val noSummary = results.summaries.headOption.forall(_.summaryText.size == 0)
val printStandard = multipleFrameworks || noSummary
// Print the standard one-liner statistic if no framework summary is defined, or when > 1 framework is in used.
if (printStandard)
{
val (skippedCount, errorsCount, passedCount, failuresCount, ignoredCount, canceledCount, pendingCount) =
results.events.foldLeft((0, 0, 0, 0, 0, 0, 0)) { case ((skippedAcc, errorAcc, passedAcc, failureAcc, ignoredAcc, canceledAcc, pendingAcc), (name, testEvent)) =>
(skippedAcc + testEvent.skippedCount, errorAcc + testEvent.errorCount, passedAcc + testEvent.passedCount, failureAcc + testEvent.failureCount,
ignoredAcc + testEvent.ignoredCount, canceledAcc + testEvent.canceledCount, pendingAcc + testEvent.pendingCount)
}
val totalCount = failuresCount + errorsCount + skippedCount + passedCount
val base = s"Total $totalCount, Failed $failuresCount, Errors $errorsCount, Passed $passedCount"
val otherCounts = Seq("Skipped" -> skippedCount, "Ignored" -> ignoredCount, "Canceled" -> canceledCount, "Pending" -> pendingCount)
val extra = otherCounts.filter(_._2 > 0).map{case(label,count) => s", $label $count" }
val postfix = base + extra.mkString
results.overall match {
case TestResult.Error => log.error("Error: " + postfix)
case TestResult.Passed => log.info("Passed: " + postfix)
case TestResult.Failed => log.error("Failed: " + postfix)
}
}
// Let's always print out Failed tests for now
if (results.events.isEmpty)
log.info(noTestsMessage)
else {
import TestResult.{Error, Failed, Passed}
import scala.reflect.NameTransformer.decode
def select(resultTpe: TestResult.Value) = results.events collect {
case (name, tpe) if tpe.result == resultTpe =>
decode(name)
}
val failures = select(Failed)
val errors = select(Error)
val passed = select(Passed)
def show(label: String, level: Level.Value, tests: Iterable[String]): Unit =
if(!tests.isEmpty)
{
log.log(level, label)
log.log(level, tests.mkString("\t", "\n\t", ""))
}
show("Passed tests:", Level.Debug, passed )
show("Failed tests:", Level.Error, failures)
show("Error during tests:", Level.Error, errors)
}
results.overall match {
case TestResult.Error | TestResult.Failed => throw new TestsFailedException
case TestResult.Passed =>
}
}
TestResultLogger.Default.copy(printNoTests = TestResultLogger.const(_ info noTestsMessage))
.run(log, results, "")
}
final class TestsFailedException extends RuntimeException("Tests unsuccessful") with FeedbackProvidedException

View File

@ -19,7 +19,7 @@ object BasicCommandStrings
/** The command name to terminate the program.*/
val TerminateAction: String = Exit
def helpBrief = (HelpCommand, "Displays this help message or prints detailed help on requested commands (run 'help <command>').")
def helpBrief = (HelpCommand, s"Displays this help message or prints detailed help on requested commands (run '$HelpCommand <command>').")
def helpDetailed = HelpCommand + """
Prints a help summary.

View File

@ -59,7 +59,7 @@ final object Aggregation
import extracted.structure
val toRun = ts map { case KeyValue(k,t) => t.map(v => KeyValue(k,v)) } join;
val roots = ts map { case KeyValue(k,_) => k }
val config = extractedConfig(extracted, structure, s)
val config = extractedTaskConfig(extracted, structure, s)
val start = System.currentTimeMillis
val (newS, result) = withStreams(structure, s){ str =>

View File

@ -38,7 +38,7 @@ s"""$multiTaskSyntax
def multiTaskBrief = """Executes all of the specified tasks concurrently."""
def showHelp = Help(ShowCommand, (ShowCommand + " <key>", actBrief), actDetailed)
def showHelp = Help(ShowCommand, (s"$ShowCommand <key>", showBrief), showDetailed)
def showBrief = "Displays the result of evaluating the setting or task associated with 'key'."
def showDetailed =
s"""$ShowCommand <setting>

View File

@ -120,6 +120,10 @@ object Defaults extends BuildCommon
trapExit :== true,
connectInput :== false,
cancelable :== false,
taskCancelStrategy := { state: State =>
if(cancelable.value) TaskCancellationStrategy.Signal
else TaskCancellationStrategy.Null
},
envVars :== Map.empty,
sbtVersion := appConfiguration.value.provider.id.version,
sbtBinaryVersion := binarySbtVersion(sbtVersion.value),
@ -211,7 +215,8 @@ object Defaults extends BuildCommon
)
def compileBase = inTask(console)(compilersSetting :: Nil) ++ compileBaseGlobal ++ Seq(
incOptions := IncOptions.setTransactional(incOptions.value, crossTarget.value / "classes.bak"),
incOptions := incOptions.value.withNewClassfileManager(
sbt.inc.ClassfileManager.transactional(crossTarget.value / "classes.bak", sbt.Logger.Null)),
scalaInstance <<= scalaInstanceTask,
crossVersion := (if(crossPaths.value) CrossVersion.binary else CrossVersion.Disabled),
crossTarget := makeCrossTarget(target.value, scalaBinaryVersion.value, sbtBinaryVersion.value, sbtPlugin.value, crossPaths.value)
@ -367,6 +372,7 @@ object Defaults extends BuildCommon
},
testListeners :== Nil,
testOptions :== Nil,
testResultLogger :== TestResultLogger.Default,
testFilter in testOnly :== (selectedFilter _)
))
lazy val testTasks: Seq[Setting[_]] = testTaskOptions(test) ++ testTaskOptions(testOnly) ++ testTaskOptions(testQuick) ++ testDefaults ++ Seq(
@ -376,16 +382,15 @@ object Defaults extends BuildCommon
definedTestNames <<= definedTests map ( _.map(_.name).distinct) storeAs definedTestNames triggeredBy compile,
testFilter in testQuick <<= testQuickFilter,
executeTests <<= (streams in test, loadedTestFrameworks, testLoader, testGrouping in test, testExecution in test, fullClasspath in test, javaHome in test, testForkedParallel) flatMap allTestGroupsTask,
testResultLogger in (Test, test) :== TestResultLogger.SilentWhenNoTests, // https://github.com/sbt/sbt/issues/1185
test := {
implicit val display = Project.showContextKey(state.value)
Tests.showResults(streams.value.log, executeTests.value, noTestsMessage(resolvedScoped.value))
val trl = (testResultLogger in (Test, test)).value
val taskName = Project.showContextKey(state.value)(resolvedScoped.value)
trl.run(streams.value.log, executeTests.value, taskName)
},
testOnly <<= inputTests(testOnly),
testQuick <<= inputTests(testQuick)
)
private[this] def noTestsMessage(scoped: ScopedKey[_])(implicit display: Show[ScopedKey[_]]): String =
"No tests to run for " + display(scoped)
lazy val TaskGlobal: Scope = ThisScope.copy(task = Global)
lazy val ConfigGlobal: Scope = ThisScope.copy(config = Global)
def testTaskOptions(key: Scoped): Seq[Setting[_]] = inTask(key)( Seq(
@ -484,9 +489,9 @@ object Defaults extends BuildCommon
val modifiedOpts = Tests.Filters(filter(selected)) +: Tests.Argument(frameworkOptions : _*) +: config.options
val newConfig = config.copy(options = modifiedOpts)
val output = allTestGroupsTask(s, loadedTestFrameworks.value, testLoader.value, testGrouping.value, newConfig, fullClasspath.value, javaHome.value, testForkedParallel.value)
val processed =
for(out <- output) yield
Tests.showResults(s.log, out, noTestsMessage(resolvedScoped.value))
val taskName = display(resolvedScoped.value)
val trl = testResultLogger.value
val processed = output.map(out => trl.run(s.log, out, taskName))
Def.value(processed)
}
}

View File

@ -11,21 +11,51 @@ package sbt
import Scope.GlobalScope
import scala.annotation.tailrec
/**
* This file is responsible for compiling the .sbt files used to configure sbt builds.
*
* Compilation is done in three phases:
*
* 1. Parsing high-level constructs (definitions, settings, imports)
* 2. Compiling scala code into local .class files
* 3. Evaluating the expressions and obtaining in-memory objects of the results (Setting[_] instances, or val references).
*
*
*/
object EvaluateConfigurations
{
/**
* This represents the parsed expressions in an sbt build file, as well as where they were defined.
*/
private[this] final class ParsedFile(val imports: Seq[(String,Int)], val definitions: Seq[(String,LineRange)], val settings: Seq[(String,LineRange)])
/** The keywords we look for when classifying a string as a definition. */
private[this] val DefinitionKeywords = Seq("lazy val ", "def ", "val ")
/** Using an evaluating instance of the scala compiler, a sequence of files and
* the default imports to use, this method will take a ClassLoader of sbt-classes and
* return a parsed, compiled + evaluated [[LoadedSbtFile]]. The result has
* raw sbt-types that can be accessed and used.
*/
def apply(eval: Eval, srcs: Seq[File], imports: Seq[String]): ClassLoader => LoadedSbtFile =
{
val loadFiles = srcs.sortBy(_.getName) map { src => evaluateSbtFile(eval, src, IO.readLines(src), imports, 0) }
loader => (LoadedSbtFile.empty /: loadFiles) { (loaded, load) => loaded merge load(loader) }
}
/**
* Reads a given .sbt file and evaluates it into a sequence of setting values.
*/
def evaluateConfiguration(eval: Eval, src: File, imports: Seq[String]): ClassLoader => Seq[Setting[_]] =
evaluateConfiguration(eval, src, IO.readLines(src), imports, 0)
/**
* Parses a sequence of build.sbt lines into a [[ParsedFile]]. The result contains
* a fragmentation of all imports, settings and definitions.
*
* @param builtinImports The set of import statements to add to those parsed in the .sbt file.
*/
private[this] def parseConfiguration(lines: Seq[String], builtinImports: Seq[String], offset: Int): ParsedFile =
{
val (importStatements, settingsAndDefinitions) = splitExpressions(lines)
@ -34,12 +64,33 @@ object EvaluateConfigurations
new ParsedFile(allImports, definitions, settings)
}
/**
* Evaluates a parsed sbt configuration file.
*
* @param eval The evaluating scala compiler instance we use to handle evaluating scala configuration.
* @param file The file we've parsed
* @param imports The default imports to use in this .sbt configuration
* @param lines The lines of the configuration we'd like to evaluate.
*
* @return Just the Setting[_] instances defined in the .sbt file.
*/
def evaluateConfiguration(eval: Eval, file: File, lines: Seq[String], imports: Seq[String], offset: Int): ClassLoader => Seq[Setting[_]] =
{
val l = evaluateSbtFile(eval, file, lines, imports, offset)
loader => l(loader).settings
}
/**
* Evaluates a parsed sbt configuration file.
*
* @param eval The evaluating scala compiler instance we use to handle evaluating scala configuration.
* @param file The file we've parsed
* @param lines The lines of the configuration we'd like to evaluate.
* @param imports The default imports to use in this .sbt configuration.
*
* @return A function which can take an sbt classloader and return the raw types/configuration
* which was compiled/parsed for the given file.
*/
private[sbt] def evaluateSbtFile(eval: Eval, file: File, lines: Seq[String], imports: Seq[String], offset: Int): ClassLoader => LoadedSbtFile =
{
val name = file.getPath
@ -58,6 +109,7 @@ object EvaluateConfigurations
val loadSettings = flatten(settings)
loader => new LoadedSbtFile(loadSettings(loader), projects(loader), importDefs)
}
/** move a project to be relative to this file after we've evaluated it. */
private[this] def resolveBase(f: File, p: Project) = p.copy(base = IO.resolve(f, p.base))
def flatten(mksettings: Seq[ClassLoader => Seq[Setting[_]]]): ClassLoader => Seq[Setting[_]] =
loader => mksettings.flatMap(_ apply loader)
@ -66,10 +118,26 @@ object EvaluateConfigurations
def addOffsetToRange(offset: Int, ranges: Seq[(String,LineRange)]): Seq[(String,LineRange)] =
ranges.map { case (s, r) => (s, r shift offset) }
/**
* The name of the class we cast DSL "setting" (vs. definition) lines to.
*/
val SettingsDefinitionName = {
val _ = classOf[sbt.Def.SettingsDefinition] // this line exists to try to provide a compile-time error when the following line needs to be changed
"sbt.Def.SettingsDefinition"
}
/**
* This actually compiles a scala expression which represents a Seq[Setting[_]], although the
* expression may be just a single setting.
*
* @param eval The mechanism to compile and evaluate Scala expressions.
* @param name The name for the thing we're compiling
* @param imports The scala imports to have in place when we compile the expression
* @param expression The scala expression we're compiling
* @param range The original position in source of the expression, for error messages.
*
* @return A method that given an sbt classloader, can return the actual Seq[Setting[_]] defined by
* the expression.
*/
def evaluateSetting(eval: Eval, name: String, imports: Seq[(String,Int)], expression: String, range: LineRange): ClassLoader => Seq[Setting[_]] =
{
val result = try {
@ -86,6 +154,10 @@ object EvaluateConfigurations
private[this] def fstS(f: String => Boolean): ((String,Int)) => Boolean = { case (s,i) => f(s) }
private[this] def firstNonSpaceIs(lit: String) = (_: String).view.dropWhile(isSpace).startsWith(lit)
private[this] def or[A](a: A => Boolean, b: A => Boolean): A => Boolean = in => a(in) || b(in)
/**
* Splits a set of lines into (imports, expressions). That is,
* anything on the right of the tuple is a scala expression (definition or setting).
*/
def splitExpressions(lines: Seq[String]): (Seq[(String,Int)], Seq[(String,LineRange)]) =
{
val blank = (_: String).forall(isSpace)
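As a rough sketch of the split described above (the sample lines are illustrative), the helper can be exercised directly:
```scala
// Sketch: classify illustrative .sbt lines into imports and expressions.
val sampleLines = Seq(
  "import sbt.Keys._",
  "",
  "name := \"example\""
)
val (imports, expressions) = EvaluateConfigurations.splitExpressions(sampleLines)
// imports: import statements paired with their line numbers
// expressions: remaining expressions paired with their LineRange
```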

View File

@ -14,7 +14,105 @@ package sbt
import std.Transform.{DummyTaskMap,TaskAndValue}
import TaskName._
@deprecated("Use EvaluateTaskConfig instead.", "0.13.5")
final case class EvaluateConfig(cancelable: Boolean, restrictions: Seq[Tags.Rule], checkCycles: Boolean = false, progress: ExecuteProgress[Task] = EvaluateTask.defaultProgress)
/** An API that allows you to cancel executing tasks upon some signal.
*
* For example, this is implemented by the task engine; invoking `cancelAndShutdown()` allows you
* to cancel the current task execution. A `RunningTaskEngine` is passed to the
* [[TaskCancellationStrategy]], which is responsible for calling `cancelAndShutdown()` when
* appropriate.
*/
trait RunningTaskEngine {
/** Attempts to kill and shutdown the running task engine.*/
def cancelAndShutdown(): Unit
}
/**
* A strategy for being able to cancel tasks.
*
* Implementations of this trait determine what will trigger cancellation of
* the task engine, wiring it up in the `onTaskEngineStart` method.
*
* All methods on this API are expected to be called from the same thread.
*/
trait TaskCancellationStrategy {
/** The state used by this task. */
type State
/** Called when task evaluation starts.
*
* @param canceller An object that can cancel the current task evaluation session.
* @return Whatever state you need to clean up in your finish method.
*/
def onTaskEngineStart(canceller: RunningTaskEngine): State
/** Called when task evaluation completes, either in success or failure. */
def onTaskEngineFinish(state: State): Unit
}
object TaskCancellationStrategy {
/** An empty handler that does not cancel tasks. */
object Null extends TaskCancellationStrategy {
type State = Unit
def onTaskEngineStart(canceller: RunningTaskEngine): Unit = ()
def onTaskEngineFinish(state: Unit): Unit = ()
}
/** Cancel handler which registers for SIGINT and cancels tasks when it is received. */
object Signal extends TaskCancellationStrategy {
type State = Signals.Registration
def onTaskEngineStart(canceller: RunningTaskEngine): Signals.Registration = {
Signals.register(() => canceller.cancelAndShutdown())
}
def onTaskEngineFinish(registration: Signals.Registration): Unit =
registration.remove()
}
}
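As a rough sketch of this extension point (the external trigger is illustrative), a custom strategy only needs to hand the running engine to whatever should be able to cancel it, and can then be installed through the `taskCancelStrategy` key wired up in Defaults:
```scala
// Sketch: a custom cancellation strategy; the external trigger is illustrative.
object CancelOnExternalTrigger extends TaskCancellationStrategy {
  type State = RunningTaskEngine
  def onTaskEngineStart(canceller: RunningTaskEngine): RunningTaskEngine = {
    // Hand `canceller` to whatever external mechanism should be able to
    // invoke canceller.cancelAndShutdown().
    canceller
  }
  def onTaskEngineFinish(state: RunningTaskEngine): Unit = ()
}

// Installed via the experimental setting, e.g.:
//   taskCancelStrategy := { (_: sbt.State) => CancelOnExternalTrigger }
```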
/** The new API for running tasks.
*
* This represents all the hooks possible when running the task engine.
* We expose this trait so that we can, in a binary compatible way, modify what is used
* inside this configuration and how to construct it.
*/
sealed trait EvaluateTaskConfig {
def restrictions: Seq[Tags.Rule]
def checkCycles: Boolean
def progressReporter: ExecuteProgress[Task]
def cancelStrategy: TaskCancellationStrategy
}
final object EvaluateTaskConfig {
/** Pulls in the old configuration format. */
def apply(old: EvaluateConfig): EvaluateTaskConfig = {
object AdaptedTaskConfig extends EvaluateTaskConfig {
def restrictions: Seq[Tags.Rule] = old.restrictions
def checkCycles: Boolean = old.checkCycles
def progressReporter: ExecuteProgress[Task] = old.progress
def cancelStrategy: TaskCancellationStrategy =
if(old.cancelable) TaskCancellationStrategy.Signal
else TaskCancellationStrategy.Null
}
AdaptedTaskConfig
}
/** Raw constructor for EvaluateTaskConfig. */
def apply(restrictions: Seq[Tags.Rule],
checkCycles: Boolean,
progressReporter: ExecuteProgress[Task],
cancelStrategy: TaskCancellationStrategy): EvaluateTaskConfig = {
val r = restrictions
val check = checkCycles
val cs = cancelStrategy
val pr = progressReporter
object SimpleEvaluateTaskConfig extends EvaluateTaskConfig {
def restrictions = r
def checkCycles = check
def progressReporter = pr
def cancelStrategy = cs
}
SimpleEvaluateTaskConfig
}
}
final case class PluginData(dependencyClasspath: Seq[Attributed[File]], definitionClasspath: Seq[Attributed[File]], resolvers: Option[Seq[Resolver]], report: Option[UpdateReport], scalacOptions: Seq[String])
{
val classpath: Seq[Attributed[File]] = definitionClasspath ++ dependencyClasspath
@ -40,24 +138,25 @@ object EvaluateTask
val SystemProcessors = Runtime.getRuntime.availableProcessors
@deprecated("Use extractedConfig.", "0.13.0")
@deprecated("Use extractedTaskConfig.", "0.13.0")
def defaultConfig(state: State): EvaluateConfig =
{
val extracted = Project.extract(state)
extractedConfig(extracted, extracted.structure, state)
}
@deprecated("Use extractedConfig.", "0.13.0")
@deprecated("Use extractedTaskConfig.", "0.13.0")
def defaultConfig(extracted: Extracted, structure: BuildStructure) =
EvaluateConfig(false, restrictions(extracted, structure), progress = defaultProgress)
@deprecated("Use other extractedConfig", "0.13.2")
@deprecated("Use other extractedTaskConfig", "0.13.2")
def extractedConfig(extracted: Extracted, structure: BuildStructure): EvaluateConfig =
{
val workers = restrictions(extracted, structure)
val canCancel = cancelable(extracted, structure)
EvaluateConfig(cancelable = canCancel, restrictions = workers, progress = defaultProgress)
}
@deprecated("Use other extractedTaskConfig", "0.13.5")
def extractedConfig(extracted: Extracted, structure: BuildStructure, state: State): EvaluateConfig =
{
val workers = restrictions(extracted, structure)
@ -65,6 +164,13 @@ object EvaluateTask
val progress = executeProgress(extracted, structure, state)
EvaluateConfig(cancelable = canCancel, restrictions = workers, progress = progress)
}
def extractedTaskConfig(extracted: Extracted, structure: BuildStructure, state: State): EvaluateTaskConfig =
{
val rs = restrictions(extracted, structure)
val canceller = cancelStrategy(extracted, structure, state)
val progress = executeProgress(extracted, structure, state)
EvaluateTaskConfig(rs, false, progress, canceller)
}
def defaultRestrictions(maxWorkers: Int) = Tags.limitAll(maxWorkers) :: Nil
def defaultRestrictions(extracted: Extracted, structure: BuildStructure): Seq[Tags.Rule] =
@ -84,11 +190,13 @@ object EvaluateTask
1
def cancelable(extracted: Extracted, structure: BuildStructure): Boolean =
getSetting(Keys.cancelable, false, extracted, structure)
def cancelStrategy(extracted: Extracted, structure: BuildStructure, state: State): TaskCancellationStrategy =
getSetting(Keys.taskCancelStrategy, {(_: State) => TaskCancellationStrategy.Null}, extracted, structure)(state)
private[sbt] def executeProgress(extracted: Extracted, structure: BuildStructure, state: State): ExecuteProgress[Task] = {
import Types.const
val maker: State => Keys.TaskProgress = getSetting(Keys.executeProgress, const(new Keys.TaskProgress(defaultProgress)), extracted, structure)
maker(state).progress
val maker: State => Keys.TaskProgress = getSetting(Keys.executeProgress, const(new Keys.TaskProgress(defaultProgress)), extracted, structure)
maker(state).progress
}
def getSetting[T](key: SettingKey[T], default: T, extracted: Extracted, structure: BuildStructure): T =
@ -119,16 +227,20 @@ object EvaluateTask
* If the task is not defined, None is returned. The provided task key is resolved against the current project `ref`.
* Task execution is configured according to settings defined in the loaded project.*/
def apply[T](structure: BuildStructure, taskKey: ScopedKey[Task[T]], state: State, ref: ProjectRef): Option[(State, Result[T])] =
apply[T](structure, taskKey, state, ref, extractedConfig(Project.extract(state), structure))
apply[T](structure, taskKey, state, ref, extractedTaskConfig(Project.extract(state), structure, state))
/** Evaluates `taskKey` and returns the new State and the result of the task wrapped in Some.
* If the task is not defined, None is returned. The provided task key is resolved against the current project `ref`.
* `config` configures concurrency and canceling of task execution. */
@deprecated("Use EvaluateTaskConfig option instead.", "0.13.5")
def apply[T](structure: BuildStructure, taskKey: ScopedKey[Task[T]], state: State, ref: ProjectRef, config: EvaluateConfig): Option[(State, Result[T])] =
apply(structure, taskKey, state, ref, EvaluateTaskConfig(config))
def apply[T](structure: BuildStructure, taskKey: ScopedKey[Task[T]], state: State, ref: ProjectRef, config: EvaluateTaskConfig): Option[(State, Result[T])] = {
withStreams(structure, state) { str =>
for( (task, toNode) <- getTask(structure, taskKey, state, str, ref) ) yield
runTask(task, state, str, structure.index.triggers, config)(toNode)
}
}
def logIncResult(result: Result[_], state: State, streams: Streams) = result match { case Inc(i) => logIncomplete(i, state, streams); case _ => () }
def logIncomplete(result: Incomplete, state: State, streams: Streams)
{
@ -176,12 +288,18 @@ object EvaluateTask
def nodeView[HL <: HList](state: State, streams: Streams, roots: Seq[ScopedKey[_]], dummies: DummyTaskMap = DummyTaskMap(Nil)): NodeView[Task] =
Transform((dummyRoots, roots) :: (dummyStreamsManager, streams) :: (dummyState, state) :: dummies )
def runTask[T](root: Task[T], state: State, streams: Streams, triggers: Triggers[Task], config: EvaluateConfig)(implicit taskToNode: NodeView[Task]): (State, Result[T]) =
@deprecated("Use new EvaluateTaskConfig option to runTask", "0.13.5")
def runTask[T](root: Task[T], state: State, streams: Streams, triggers: Triggers[Task], config: EvaluateConfig)(implicit taskToNode: NodeView[Task]): (State, Result[T]) =
{
val newConfig = EvaluateTaskConfig(config)
runTask(root, state, streams, triggers, newConfig)(taskToNode)
}
def runTask[T](root: Task[T], state: State, streams: Streams, triggers: Triggers[Task], config: EvaluateTaskConfig)(implicit taskToNode: NodeView[Task]): (State, Result[T]) =
{
import ConcurrentRestrictions.{completionService, TagMap, Tag, tagged, tagsKey}
import ConcurrentRestrictions.{completionService, TagMap, Tag, tagged, tagsKey}
val log = state.log
log.debug("Running task... Cancelable: " + config.cancelable + ", check cycles: " + config.checkCycles)
log.debug("Running task... Cancel: " + config.cancelStrategy + ", check cycles: " + config.checkCycles)
val tags = tagged[Task[_]](_.info get tagsKey getOrElse Map.empty, Tags.predicate(config.restrictions))
val (service, shutdown) = completionService[Task[_], Completed](tags, (s: String) => log.warn(s))
@ -191,7 +309,7 @@ object EvaluateTask
case _ => true
}
def run() = {
val x = new Execute[Task]( Execute.config(config.checkCycles, overwriteNode), triggers, config.progress)(taskToNode)
val x = new Execute[Task]( Execute.config(config.checkCycles, overwriteNode), triggers, config.progressReporter)(taskToNode)
val (newState, result) =
try {
val results = x.runKeep(root)(service)
@ -203,15 +321,18 @@ object EvaluateTask
logIncResult(replaced, state, streams)
(newState, replaced)
}
val cancel = () => {
println("")
log.warn("Canceling execution...")
shutdown()
object runningEngine extends RunningTaskEngine {
def cancelAndShutdown(): Unit = {
println("")
log.warn("Canceling execution...")
shutdown()
}
}
if(config.cancelable)
Signals.withHandler(cancel) { run }
else
run()
// Register with our cancel handler that we're about to start.
val strat = config.cancelStrategy
val cancelState = strat.onTaskEngineStart(runningEngine)
try run()
finally strat.onTaskEngineFinish(cancelState)
}
private[this] def storeValuesForPrevious(results: RMap[Task, Result], state: State, streams: Streams): Unit =

View File

@ -57,10 +57,10 @@ object GlobalPlugin
}
def evaluate[T](state: State, structure: BuildStructure, t: Task[T], roots: Seq[ScopedKey[_]]): (State, T) =
{
import EvaluateTask._
import EvaluateTask._
withStreams(structure, state) { str =>
val nv = nodeView(state, str, roots)
val config = EvaluateTask.defaultConfig(Project.extract(state), structure)
val config = EvaluateTask.extractedTaskConfig(Project.extract(state), structure, state)
val (newS, result) = runTask(t, state, str, structure.index.triggers, config)(nv)
(newS, processResult(result, newS.log))
}

View File

@ -197,6 +197,7 @@ object Keys
val testForkedParallel = SettingKey[Boolean]("test-forked-parallel", "Whether forked tests should be executed in parallel", CTask)
val testExecution = TaskKey[Tests.Execution]("test-execution", "Settings controlling test execution", DTask)
val testFilter = TaskKey[Seq[String] => Seq[String => Boolean]]("test-filter", "Filter controlling whether the test is executed", DTask)
val testResultLogger = SettingKey[TestResultLogger]("test-result-logger", "Logs results after a test task completes.", DTask)
val testGrouping = TaskKey[Seq[Tests.Group]]("test-grouping", "Collects discovered tests into groups. Whether to fork and the options for forking are configurable on a per-group basis.", BMinusTask)
val isModule = AttributeKey[Boolean]("is-module", "True if the target is a module.", DSetting)
@ -346,6 +347,7 @@ object Keys
// wrapper to work around SI-2915
private[sbt] final class TaskProgress(val progress: ExecuteProgress[Task])
private[sbt] val executeProgress = SettingKey[State => TaskProgress]("executeProgress", "Experimental task execution listener.", DTask)
private[sbt] val taskCancelStrategy = SettingKey[State => TaskCancellationStrategy]("taskCancelStrategy", "Experimental task cancellation handler.", DTask)
// Experimental in sbt 0.13.2 to enable grabbing semantic compile failures.
private[sbt] val compilerReporter = TaskKey[Option[xsbti.Reporter]]("compilerReporter", "Experimental hook to listen (or send) compilation failure messages.", DTask)
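
The taskCancelStrategy key above expects a `State => TaskCancellationStrategy` function. As a hypothetical sketch only (the object name SignalCancelStrategy is made up, and the trait's member signatures are inferred from the `strat.onTaskEngineStart(...)` / `strat.onTaskEngineFinish(...)` calls in EvaluateTask earlier in this diff), a strategy that cancels on SIGINT via the new Signals.register helper could look like this:

// Illustrative sketch, not part of this commit.
object SignalCancelStrategy extends TaskCancellationStrategy {
  // Keep the Registration so the previously installed INT handler can be restored.
  type State = Signals.Registration
  def onTaskEngineStart(engine: RunningTaskEngine): Signals.Registration =
    Signals.register(() => engine.cancelAndShutdown())
  def onTaskEngineFinish(registration: Signals.Registration): Unit =
    registration.remove()
}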

View File

@ -469,7 +469,7 @@ object Load
val autoConfigs = autoPlugins.flatMap(_.projectConfigurations)
val loadedSbtFiles = loadSbtFiles(project.auto, project.base, autoPlugins, project.settings)
// add the automatically selected settings, record the selected AutoPlugins, and register the automatically selected configurations
val transformed = project.copy(settings = loadedSbtFiles.settings).setAutoPlugins(autoPlugins).overrideConfigs(autoConfigs : _*)
val transformed = project.copy(settings = loadedSbtFiles.settings).setAutoPlugins(autoPlugins).prefixConfigs(autoConfigs : _*)
(transformed, loadedSbtFiles.projects)
}
def defaultLoad = loadSbtFiles(AddSettings.defaultSbtFiles, buildBase, Nil, Nil).projects
@ -612,7 +612,7 @@ object Load
{
val (eval,pluginDef) = apply(dir, s, config)
val pluginState = Project.setProject(Load.initialSession(pluginDef, eval), pluginDef, s)
config.evalPluginDef(pluginDef, pluginState)
config.evalPluginDef(Project.structure(pluginState), pluginState)
}
@deprecated("Use ModuleUtilities.getCheckedObjects[Build].", "0.13.2")

View File

@ -52,8 +52,8 @@ object DefaultOptions {
def pluginResolvers(plugin: Boolean, snapshot: Boolean): Seq[Resolver] = {
if (plugin && snapshot) Seq(Classpaths.typesafeSnapshots, Classpaths.sbtPluginSnapshots) else Nil
}
def addResolvers: Setting[_] = Keys.resolvers <<= Keys.isSnapshot apply resolvers
def addPluginResolvers: Setting[_] = Keys.resolvers <<= (Keys.sbtPlugin, Keys.isSnapshot) apply pluginResolvers
def addResolvers: Setting[_] = Keys.resolvers <++= Keys.isSnapshot apply resolvers
def addPluginResolvers: Setting[_] = Keys.resolvers <++= (Keys.sbtPlugin, Keys.isSnapshot) apply pluginResolvers
@deprecated("Use `credentials(State)` instead to make use of configuration path dynamically configured via `Keys.globalSettingsDirectory`; relying on ~/.ivy2 is not recommended anymore.", "0.12.0")
def credentials: Credentials = Credentials(userHome / ".ivy2" / ".credentials")

View File

@ -41,11 +41,11 @@ Steps for users:
For example, given plugins Web and Javascript (perhaps provided by plugins added with addSbtPlugin),
<Project>.addPlugins( Web && Javascript )
<Project>.enablePlugins( Web && Javascript )
will activate `MyPlugin` defined above and have its settings automatically added. If the user instead defines
<Project>.addPlugins( Web && Javascript ).disablePlugins(MyPlugin)
<Project>.enablePlugins( Web && Javascript ).disablePlugins(MyPlugin)
then the `MyPlugin` settings (and anything that activates only when `MyPlugin` is activated) will not be added.
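
A build definition sketch using the names from the example above (Web, Javascript and MyPlugin are assumed to be AutoPlugin instances provided by the installed plugins):

lazy val site = (project in file("site"))
  .enablePlugins(Web && Javascript)   // MyPlugin is selected automatically because both of its requirements are present
  .disablePlugins(MyPlugin)           // opt out of MyPlugin and of anything that only activates with it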

View File

@ -109,6 +109,11 @@ sealed trait Project extends ProjectDefinition[ProjectReference]
/** Adds configurations to this project. Added configurations replace existing configurations with the same name.*/
def overrideConfigs(cs: Configuration*): Project = copy(configurations = Defaults.overrideConfigs(cs : _*)(configurations))
/** Adds configurations at the *start* of the configuration list for this project. Previously added
 * configurations with the same name take precedence over entries in this prefix list.
 */
private[sbt] def prefixConfigs(cs: Configuration*): Project = copy(configurations = Defaults.overrideConfigs(configurations : _*)(cs))
/** Adds new configurations directly to this project. To override an existing configuration, use `overrideConfigs`. */
def configs(cs: Configuration*): Project = copy(configurations = configurations ++ cs)
@ -140,8 +145,8 @@ sealed trait Project extends ProjectDefinition[ProjectReference]
def setSbtFiles(files: File*): Project = copy(auto = AddSettings.append( AddSettings.clearSbtFiles(auto), AddSettings.sbtFiles(files: _*)) )
/** Sets the [[AutoPlugin]]s of this project.
A [[AutoPlugin]] is a common label that is used by plugins to determine what settings, if any, to add to a project. */
def addPlugins(ns: Plugins*): Project = setPlugins(ns.foldLeft(plugins)(Plugins.and))
An [[AutoPlugin]] is a common label that is used by plugins to determine what settings, if any, to enable on a project. */
def enablePlugins(ns: Plugins*): Project = setPlugins(ns.foldLeft(plugins)(Plugins.and))
/** Disable the given plugins on this project. */
def disablePlugins(ps: AutoPlugin*): Project =
@ -498,13 +503,25 @@ object Project extends ProjectExtra
@deprecated("This method does not apply state changes requested during task execution. Use 'runTask' instead, which does.", "0.11.1")
def evaluateTask[T](taskKey: ScopedKey[Task[T]], state: State, checkCycles: Boolean = false, maxWorkers: Int = EvaluateTask.SystemProcessors): Option[Result[T]] =
runTask(taskKey, state, EvaluateConfig(true, EvaluateTask.defaultRestrictions(maxWorkers), checkCycles)).map(_._2)
def runTask[T](taskKey: ScopedKey[Task[T]], state: State, checkCycles: Boolean = false): Option[(State, Result[T])] =
runTask(taskKey, state, EvaluateConfig(true, EvaluateTask.restrictions(state), checkCycles))
{
val extracted = Project.extract(state)
val ch = EvaluateTask.cancelStrategy(extracted, extracted.structure, state)
val p = EvaluateTask.executeProgress(extracted, extracted.structure, state)
val r = EvaluateTask.restrictions(state)
runTask(taskKey, state, EvaluateTaskConfig(r, checkCycles, p, ch))
}
@deprecated("Use EvalauteTaskConfig option instead.", "0.13.5")
def runTask[T](taskKey: ScopedKey[Task[T]], state: State, config: EvaluateConfig): Option[(State, Result[T])] =
{
val extracted = Project.extract(state)
EvaluateTask(extracted.structure, taskKey, state, extracted.currentRef, config)
}
def runTask[T](taskKey: ScopedKey[Task[T]], state: State, config: EvaluateTaskConfig): Option[(State, Result[T])] = {
val extracted = Project.extract(state)
EvaluateTask(extracted.structure, taskKey, state, extracted.currentRef, config)
}
implicit def projectToRef(p: Project): ProjectReference = LocalProject(p.id)

View File

@ -168,7 +168,7 @@ object Common
def lib(m: ModuleID) = libraryDependencies += m
lazy val jlineDep = "jline" % "jline" % "2.11"
lazy val jline = lib(jlineDep)
lazy val ivy = lib("org.apache.ivy" % "ivy" % "2.3.0")
lazy val ivy = lib("org.scala-sbt.ivy" % "ivy" % "2.4.0-sbt-d6fca11d63402c92e4167cdf2da91a660d043392")
lazy val httpclient = lib("commons-httpclient" % "commons-httpclient" % "3.1")
lazy val jsch = lib("com.jcraft" % "jsch" % "0.1.46" intransitive() )
lazy val sbinary = libraryDependencies <+= Util.nightly211(n => "org.scala-tools.sbinary" % "sbinary" % "0.4.2" cross(if(n) CrossVersion.full else CrossVersion.binary))

View File

@ -0,0 +1,17 @@
lazy val foo = taskKey[Unit]("Runs the foo task")
lazy val bar = taskKey[Unit]("Runs the bar task")
def makeFoo(config: Configuration): Setting[_] =
foo in config := IO.write(file(s"${config.name}-foo"), "foo")
lazy val PerformanceTest = (config("pt") extend Test)
lazy val root = (
(project in file("."))
.configs(PerformanceTest)
.settings(Seq(Compile, Test, Runtime, PerformanceTest).map(makeFoo) :_*)
.settings(
bar in PerformanceTest := IO.write(file("pt-bar"), "bar")
)
)

View File

@ -0,0 +1,8 @@
# First make sure a task just on the new config delegates.
> bar
$ exists pt-bar
# Now make sure compile is the default configuration
> foo
$ exists compile-foo

View File

@ -0,0 +1,9 @@
import sbt.ExposeYourself._
taskCancelStrategy := { (state: State) =>
  new TaskCancellationStrategy {
    type State = Unit
    override def onTaskEngineStart(canceller: RunningTaskEngine): Unit = canceller.cancelAndShutdown()
    override def onTaskEngineFinish(result: Unit): Unit = ()
  }
}

View File

@ -0,0 +1,5 @@
package sbt // this API is private[sbt], so it is only exposed for trusted clients and folks who like to break things.
object ExposeYourself {
  val taskCancelStrategy = sbt.Keys.taskCancelStrategy
}

View File

@ -0,0 +1,5 @@
import scala
object Foo {
val x = "this should be long to compile or the test may fail."
}

View File

@ -0,0 +1,3 @@
# All tasks should fail.
-> compile
-> test

View File

@ -1,17 +1,17 @@
// disablePlugins(Q) will prevent R from being auto-added
lazy val projA = project.addPlugins(A, B).disablePlugins(Q)
lazy val projA = project.enablePlugins(A, B).disablePlugins(Q)
// without B, Q is not added
lazy val projB = project.addPlugins(A)
lazy val projB = project.enablePlugins(A)
// with both A and B, Q is selected, which in turn selects R, but not S
lazy val projC = project.addPlugins(A, B)
lazy val projC = project.enablePlugins(A, B)
// with no natures defined, nothing is auto-added
lazy val projD = project
// with S selected, Q is loaded automatically, which in turn selects R
lazy val projE = project.addPlugins(S)
lazy val projE = project.enablePlugins(S)
check := {
val adel = (del in projA).?.value // should be None

View File

@ -23,5 +23,5 @@ object A extends AutoPlugin {
}
object B extends Build {
lazy val extra = project.addPlugins(bN)
lazy val extra = project.enablePlugins(bN)
}

View File

@ -0,0 +1,5 @@
package macros
object Client {
Provider.printTree(Foo.str)
}

View File

@ -0,0 +1,5 @@
package macros
object Foo {
def str: String = "abc"
}

View File

@ -0,0 +1,3 @@
package macros
object Foo {
}

View File

@ -0,0 +1,12 @@
package macros
import scala.language.experimental.macros
import scala.reflect.macros._
object Provider {
def printTree(arg: Any) = macro printTreeImpl
def printTreeImpl(c: Context)(arg: c.Expr[Any]): c.Expr[String] = {
val argStr = arg.tree.toString
val literalStr = c.universe.Literal(c.universe.Constant(argStr))
c.Expr[String](literalStr)
}
}

View File

@ -0,0 +1,30 @@
import sbt._
import Keys._
object build extends Build {
val defaultSettings = Seq(
libraryDependencies <+= scalaVersion("org.scala-lang" % "scala-reflect" % _ ),
incOptions := incOptions.value.withNameHashing(true),
scalaVersion := "2.11.0-RC3"
)
lazy val root = Project(
base = file("."),
id = "macro",
aggregate = Seq(macroProvider, macroClient),
settings = Defaults.defaultSettings ++ defaultSettings
)
lazy val macroProvider = Project(
base = file("macro-provider"),
id = "macro-provider",
settings = Defaults.defaultSettings ++ defaultSettings
)
lazy val macroClient = Project(
base = file("macro-client"),
id = "macro-client",
dependencies = Seq(macroProvider),
settings = Defaults.defaultSettings ++ defaultSettings
)
}

View File

@ -0,0 +1,12 @@
> compile
# remove `Foo.str` which is an argument to a macro
$ copy-file macro-client/changes/Foo.scala macro-client/Foo.scala
# we should recompile Foo.scala first and then fail to compile Client.scala due to missing
# `Foo.str`
-> macro-client/compile
> clean
-> compile

View File

@ -0,0 +1,5 @@
package macros
object Client {
Provider.printTree(Provider.printTree(Foo.str))
}

View File

@ -0,0 +1,5 @@
package macros
object Foo {
def str: String = "abc"
}

View File

@ -0,0 +1,3 @@
package macros
object Foo {
}

View File

@ -0,0 +1,12 @@
package macros
import scala.language.experimental.macros
import scala.reflect.macros._
object Provider {
def printTree(arg: Any) = macro printTreeImpl
def printTreeImpl(c: Context)(arg: c.Expr[Any]): c.Expr[String] = {
val argStr = arg.tree.toString
val literalStr = c.universe.Literal(c.universe.Constant(argStr))
c.Expr[String](literalStr)
}
}

View File

@ -0,0 +1,30 @@
import sbt._
import Keys._
object build extends Build {
val defaultSettings = Seq(
libraryDependencies <+= scalaVersion("org.scala-lang" % "scala-reflect" % _ ),
incOptions := incOptions.value.withNameHashing(true),
scalaVersion := "2.11.0-RC3"
)
lazy val root = Project(
base = file("."),
id = "macro",
aggregate = Seq(macroProvider, macroClient),
settings = Defaults.defaultSettings ++ defaultSettings
)
lazy val macroProvider = Project(
base = file("macro-provider"),
id = "macro-provider",
settings = Defaults.defaultSettings ++ defaultSettings
)
lazy val macroClient = Project(
base = file("macro-client"),
id = "macro-client",
dependencies = Seq(macroProvider),
settings = Defaults.defaultSettings ++ defaultSettings
)
}

View File

@ -0,0 +1,13 @@
> compile
# remove `Foo.str` which is an argument to a macro
# (and that macro call is itself an argument to another macro)
$ copy-file macro-client/changes/Foo.scala macro-client/Foo.scala
# we should recompile Foo.scala first and then fail to compile Client.scala due to missing
# `Foo.str`
-> macro-client/compile
> clean
-> compile

View File

@ -5,6 +5,15 @@ Changes
0.13.2 to 0.13.5
~~~~~~~~~~~~~~~~
- The Scala version for sbt and sbt plugins is now 2.10.4. This is a compatible version bump.
- Added a new setting ``testResultLogger`` to allow customisation of how test results are logged (see the sketch at the end of this list). (gh-1225)
- When ``test`` is run and no tests are available, logging output is omitted.
  This is especially useful for aggregate modules; ``test-only`` and friends are unaffected. (gh-1185)
- sbt now uses a patched fork of Ivy 2.4 (org.scala-sbt.ivy:ivy:2.4.0-sbt-<git sha>).
- ``sbt.Plugin`` is deprecated in favor of ``sbt.AutoPlugin``.
- The name-hashing incremental compiler now supports Scala macros.
- ``testResultLogger`` is now configurable.
- Added sbt-server hooks for task cancellation.
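
A rough sketch of customising ``testResultLogger`` (illustrative only; the exact shape of ``TestResultLogger`` is an assumption here, inferred from the ``SettingKey[TestResultLogger]`` added in this change set):

testResultLogger := new TestResultLogger {
  // Hypothetical member signature, not confirmed by this diff.
  def run(log: Logger, results: Tests.Output, taskName: String): Unit =
    log.info("[" + taskName + "] overall result: " + results.overall)
}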
0.13.1 to 0.13.2
~~~~~~~~~~~~~~~~

View File

@ -19,6 +19,38 @@ object Signals
case Right(v) => v
}
}
/** Helper interface for exposing signal-handling internals to other components. */
sealed trait Registration {
def remove(): Unit
}
/** Register a signal handler that can be removed later.
* NOTE: does not stack with other signal handlers; registering replaces any existing handler for this signal.
*/
def register(handler: () => Unit, signal: String = INT): Registration =
// TODO - Maybe we can just ignore the registration if the signal is not supported.
if(supported(signal)) {
import sun.misc.{Signal,SignalHandler}
val intSignal = new Signal(signal)
val newHandler = new SignalHandler {
def handle(sig: Signal) { handler() }
}
val oldHandler = Signal.handle(intSignal, newHandler)
object unregisterNewHandler extends Registration {
override def remove(): Unit = {
Signal.handle(intSignal, oldHandler)
}
}
unregisterNewHandler
} else {
// TODO - Maybe we should just throw an exception if we don't support signals...
object NullUnregisterNewHandler extends Registration {
override def remove(): Unit = ()
}
NullUnregisterNewHandler
}
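// Usage sketch (illustrative, not part of this commit): callers are expected to keep the
// returned Registration and restore the previous handler themselves, e.g.
//   val registration = Signals.register(() => println("interrupted"))
//   try doLongRunningWork()          // doLongRunningWork is a placeholder
//   finally registration.remove()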
def supported(signal: String): Boolean =
try
{

View File

@ -118,25 +118,21 @@ sealed abstract class PathFinder
final def \ (literal: String): PathFinder = this / literal
@deprecated("Use pair.", "0.13.1")
def x_![T](mapper: File => Option[T]): Traversable[(File,T)] = x(mapper, false)
def x_![T](mapper: File => Option[T]): Traversable[(File,T)] = pair(mapper, false)
/** Applies `mapper` to each path selected by this PathFinder and returns the path paired with the non-empty result.
* If the result is empty (None) and `errorIfNone` is true, an exception is thrown.
* If `errorIfNone` is false, the path is dropped from the returned Traversable.*/
def pair[T](mapper: File => Option[T], errorIfNone: Boolean = true): Seq[(File,T)] =
x(mapper, errorIfNone)
{
val apply = if(errorIfNone) mapper | fail else mapper
for(file <- get; mapped <- apply(file)) yield (file, mapped)
}
/** Applies `mapper` to each path selected by this PathFinder and returns the path paired with the non-empty result.
* If the result is empty (None) and `errorIfNone` is true, an exception is thrown.
* If `errorIfNone` is false, the path is dropped from the returned Traversable.*/
@deprecated("Use pair.", "0.13.1")
def x[T](mapper: File => Option[T], errorIfNone: Boolean = true): Seq[(File,T)] =
{
val apply = if(errorIfNone) mapper | fail else mapper
for(file <- get; mapped <- apply(file)) yield (file, mapped)
}
def x[T](mapper: File => Option[T], errorIfNone: Boolean = true): Seq[(File,T)] = pair(mapper, errorIfNone)
/** Selects all descendant paths with a name that matches <code>include</code> and do not have an intermediate
/** Selects all descendant paths with a name that matches <code>include</code> and do not have an intermediate
* path with a name that matches <code>intermediateExclude</code>. Typical usage is:
*
* <code>descendantsExcept("*.jar", ".svn")</code>*/